hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
f40fc01f7476d0d3f8da4cf9a47130f26de9736a | 235 | py | Python | pysit/core/__init__.py | zfang-slim/pysit | 8fca42b9749841abc302d1f8195a1437fad7ae4d | [
"BSD-3-Clause"
] | 64 | 2015-09-08T06:23:27.000Z | 2022-03-09T23:35:24.000Z | pysit/core/__init__.py | simonlegrand/pysit | 1fb1a80839ceebef12a8d71aa9c295b65b08bac4 | [
"BSD-3-Clause"
] | 23 | 2015-10-08T01:14:24.000Z | 2021-07-15T11:37:05.000Z | pysit/core/__init__.py | simonlegrand/pysit | 1fb1a80839ceebef12a8d71aa9c295b65b08bac4 | [
"BSD-3-Clause"
] | 48 | 2015-06-25T14:48:22.000Z | 2021-12-06T19:50:25.000Z |
from pysit.core.domain import *
from pysit.core.mesh import *
from pysit.core.wave_source import *
from pysit.core.shot import *
from pysit.core.sources import *
from pysit.core.receivers import *
from pysit.core.acquisition import *
| 26.111111 | 36 | 0.787234 | 36 | 235 | 5.111111 | 0.333333 | 0.342391 | 0.494565 | 0.619565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123404 | 235 | 9 | 37 | 26.111111 | 0.893204 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
be82c37b01570b822a13a721ad0e58b3ec7073ed | 3,406 | py | Python | tests/test_timeout_iterator.py | leangaurav/pypi_iterator | 4201446b0764687247bfb4483d84b8237f72f4e4 | [
"MIT"
] | 1 | 2021-03-01T20:30:38.000Z | 2021-03-01T20:30:38.000Z | tests/test_timeout_iterator.py | leangaurav/pypi_iterator | 4201446b0764687247bfb4483d84b8237f72f4e4 | [
"MIT"
] | 1 | 2021-09-07T13:37:38.000Z | 2021-09-10T17:12:57.000Z | tests/test_timeout_iterator.py | leangaurav/pypi_iterator | 4201446b0764687247bfb4483d84b8237f72f4e4 | [
"MIT"
] | 1 | 2021-02-04T12:58:10.000Z | 2021-02-04T12:58:10.000Z |
import unittest
import time
from iterators import TimeoutIterator
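# Generator fixtures: a plain two-item iterator, a slow iterator, and one that raises mid-stream.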
def iter_simple():
    yield 1
    yield 2
def iter_with_sleep():
    yield 1
    time.sleep(0.6)
    yield 2
    time.sleep(0.4)
    yield 3
def iter_with_exception():
    yield 1
    yield 2
    raise Exception
    yield 3
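# Exercises TimeoutIterator: plain iteration, blocking, fixed and updated timeouts, custom sentinels, per-next resets, exceptions, and interrupts.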
class TestTimeoutIterator(unittest.TestCase):
    def test_normal_iteration(self):
        i = iter_simple()
        it = TimeoutIterator(i)
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), 2)
        self.assertRaises(StopIteration, next, it)
        self.assertRaises(StopIteration, next, it)
    def test_timeout_block(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i)
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
        self.assertRaises(StopIteration, next, it)
    def test_fixed_timeout(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.5)
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), it.get_sentinel())
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
    def test_timeout_update(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.5)
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), it.get_sentinel())
        it.set_timeout(0.3)
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), it.get_sentinel())
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
    def test_custom_sentinel(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.5, sentinel="END")
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), "END")
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
    def test_feature_timeout_reset(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.5, reset_on_next=True)
        self.assertEqual(next(it), 1)  # timeout gets reset after first iteration
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
    def test_function_set_reset_on_next(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.35, reset_on_next=False)
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), it.get_sentinel())
        it.set_reset_on_next(True)
        self.assertEqual(next(it), 2)
        self.assertEqual(next(it), 3)
        self.assertRaises(StopIteration, next, it)
    def test_iterator_raises_exception(self):
        i = iter_with_exception()
        it = TimeoutIterator(i, timeout=0.5, sentinel="END")
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), 2)
        self.assertRaises(Exception, next, it)
        self.assertRaises(StopIteration, next, it)
    def test_interrupt_thread(self):
        i = iter_with_sleep()
        it = TimeoutIterator(i, timeout=0.5, sentinel="END")
        self.assertEqual(next(it), 1)
        self.assertEqual(next(it), it.get_sentinel())
        it.interrupt()
        self.assertEqual(next(it), 2)
        self.assertRaises(StopIteration, next, it)
| 30.410714 | 80 | 0.636524 | 435 | 3,406 | 4.850575 | 0.131034 | 0.119431 | 0.270142 | 0.298578 | 0.769668 | 0.767773 | 0.767773 | 0.767773 | 0.728436 | 0.695261 | 0 | 0.020583 | 0.243981 | 3,406 | 111 | 81 | 30.684685 | 0.798835 | 0.011744 | 0 | 0.688889 | 0 | 0 | 0.003567 | 0 | 0 | 0 | 0 | 0 | 0.466667 | 1 | 0.133333 | false | 0 | 0.033333 | 0 | 0.177778 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
be9d10bdbb61206f7638e707150d2083321bce24 | 13,226 | py | Python | tools/pythonpkg/tests/fast/arrow/test_filter_pushdown.py | K377U/duckdb | 6322c0deb9bc1aee3d49f08452d5e03a20395e6b | [
"MIT"
] | 2 | 2020-12-11T15:22:01.000Z | 2021-04-19T17:33:15.000Z | tools/pythonpkg/tests/fast/arrow/test_filter_pushdown.py | lnkuiper/duckdb | dd2f405ae3a74f317e10f0a32254ba2d5e2d8c41 | [
"MIT"
] | 1 | 2021-09-06T23:09:17.000Z | 2021-09-06T23:09:17.000Z | tools/pythonpkg/tests/fast/arrow/test_filter_pushdown.py | lnkuiper/duckdb | dd2f405ae3a74f317e10f0a32254ba2d5e2d8c41 | [
"MIT"
] | null | null | null |
import duckdb
import os
import pytest
import tempfile
try:
    import pyarrow as pa
    import pyarrow.parquet as pq
    import pyarrow.dataset as ds
    import numpy as np
    import pandas as pd
    can_run = True
except:
    can_run = False
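# Builds a three-column table of the given type, exposes it to DuckDB as an Arrow table,
# and asserts that each comparison operator returns the expected row count.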
def numeric_operators(data_type):
    duckdb_conn = duckdb.connect()
    duckdb_conn.execute("CREATE TABLE test (a "+data_type+", b "+data_type+", c "+data_type+")")
    duckdb_conn.execute("INSERT INTO test VALUES (1,1,1),(10,10,10),(100,10,100),(NULL,NULL,NULL)")
    duck_tbl = duckdb_conn.table("test")
    arrow_table = duck_tbl.arrow()
    print(arrow_table)
    duckdb_conn.register("testarrow",arrow_table)
    # Try ==
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a =1").fetchone()[0] == 1
    # Try >
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a >1").fetchone()[0] == 2
    # Try >=
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a >=10").fetchone()[0] == 2
    # Try <
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a <10").fetchone()[0] == 1
    # Try <=
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a <=10").fetchone()[0] == 2
    # Try Is Null
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
    # Try Is Not Null
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
    # Try And
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a=10 and b =1").fetchone()[0] == 0
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a =100 and b = 10 and c = 100").fetchone()[0] == 1
    # Try Or
    assert duckdb_conn.execute("SELECT count(*) from testarrow where a = 100 or b =1").fetchone()[0] == 2
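# Repeats the same operator checks per type family (numeric, decimal, varchar, bool, time, timestamp, date).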
class TestArrowFilterPushdown(object):
    def test_filter_pushdown_numeric(self,duckdb_cursor):
        if not can_run:
            return
        numeric_types = ['TINYINT', 'SMALLINT', 'INTEGER', 'BIGINT', 'UTINYINT', 'USMALLINT', 'UINTEGER', 'UBIGINT',
                         'FLOAT', 'DOUBLE', 'HUGEINT']
        for data_type in numeric_types:
            numeric_operators(data_type)
    def test_filter_pushdown_decimal(self,duckdb_cursor):
        if not can_run:
            return
        numeric_types = ['DECIMAL(4,1)','DECIMAL(9,1)','DECIMAL(18,4)','DECIMAL(30,12)']
        for data_type in numeric_types:
            numeric_operators(data_type)
    def test_filter_pushdown_varchar(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a VARCHAR, b VARCHAR, c VARCHAR)")
        duckdb_conn.execute("INSERT INTO test VALUES ('1','1','1'),('10','10','10'),('100','10','100'),(NULL,NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        duckdb_conn.register("testarrow",arrow_table)
        # Try ==
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='1'").fetchone()[0] == 1
        # Try >
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >'1'").fetchone()[0] == 2
        # Try >=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >='10'").fetchone()[0] == 2
        # Try <
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <'10'").fetchone()[0] == 1
        # Try <=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <='10'").fetchone()[0] == 2
        # Try Is Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
        # Try Is Not Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
        # Try And
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a='10' and b ='1'").fetchone()[0] == 0
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='100' and b = '10' and c = '100'").fetchone()[0] == 1
        # Try Or
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a = '100' or b ='1'").fetchone()[0] == 2
    def test_filter_pushdown_bool(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a BOOL, b BOOL)")
        duckdb_conn.execute("INSERT INTO test VALUES (TRUE,TRUE),(TRUE,FALSE),(FALSE,TRUE),(NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        duckdb_conn.register("testarrow",arrow_table)
        # Try ==
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a =True").fetchone()[0] == 2
        # Try Is Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
        # Try Is Not Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
        # Try And
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a=True and b =True").fetchone()[0] == 1
        # Try Or
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a = True or b =True").fetchone()[0] == 3
    def test_filter_pushdown_time(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a TIME, b TIME, c TIME)")
        duckdb_conn.execute("INSERT INTO test VALUES ('00:01:00','00:01:00','00:01:00'),('00:10:00','00:10:00','00:10:00'),('01:00:00','00:10:00','01:00:00'),(NULL,NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        duckdb_conn.register("testarrow",arrow_table)
        # Try ==
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='00:01:00'").fetchone()[0] == 1
        # Try >
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >'00:01:00'").fetchone()[0] == 2
        # Try >=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >='00:10:00'").fetchone()[0] == 2
        # Try <
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <'00:10:00'").fetchone()[0] == 1
        # Try <=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <='00:10:00'").fetchone()[0] == 2
        # Try Is Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
        # Try Is Not Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
        # Try And
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a='00:10:00' and b ='00:01:00'").fetchone()[0] == 0
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='01:00:00' and b = '00:10:00' and c = '01:00:00'").fetchone()[0] == 1
        # Try Or
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a = '01:00:00' or b ='00:01:00'").fetchone()[0] == 2
    def test_filter_pushdown_timestamp(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a TIMESTAMP, b TIMESTAMP, c TIMESTAMP)")
        duckdb_conn.execute("INSERT INTO test VALUES ('2008-01-01 00:00:01','2008-01-01 00:00:01','2008-01-01 00:00:01'),('2010-01-01 10:00:01','2010-01-01 10:00:01','2010-01-01 10:00:01'),('2020-03-01 10:00:01','2010-01-01 10:00:01','2020-03-01 10:00:01'),(NULL,NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        print(arrow_table)
        duckdb_conn.register("testarrow",arrow_table)
        # Try ==
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='2008-01-01 00:00:01'").fetchone()[0] == 1
        # Try >
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >'2008-01-01 00:00:01'").fetchone()[0] == 2
        # Try >=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >='2010-01-01 10:00:01'").fetchone()[0] == 2
        # Try <
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <'2010-01-01 10:00:01'").fetchone()[0] == 1
        # Try <=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <='2010-01-01 10:00:01'").fetchone()[0] == 2
        # Try Is Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
        # Try Is Not Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
        # Try And
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a='2010-01-01 10:00:01' and b ='2008-01-01 00:00:01'").fetchone()[0] == 0
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='2020-03-01 10:00:01' and b = '2010-01-01 10:00:01' and c = '2020-03-01 10:00:01'").fetchone()[0] == 1
        # Try Or
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a = '2020-03-01 10:00:01' or b ='2008-01-01 00:00:01'").fetchone()[0] == 2
    def test_filter_pushdown_date(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a DATE, b DATE, c DATE)")
        duckdb_conn.execute("INSERT INTO test VALUES ('2000-01-01','2000-01-01','2000-01-01'),('2000-10-01','2000-10-01','2000-10-01'),('2010-01-01','2000-10-01','2010-01-01'),(NULL,NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        duckdb_conn.register("testarrow",arrow_table)
        # Try ==
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='2000-01-01'").fetchone()[0] == 1
        # Try >
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >'2000-01-01'").fetchone()[0] == 2
        # Try >=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a >='2000-10-01'").fetchone()[0] == 2
        # Try <
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <'2000-10-01'").fetchone()[0] == 1
        # Try <=
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a <='2000-10-01'").fetchone()[0] == 2
        # Try Is Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NULL").fetchone()[0] == 1
        # Try Is Not Null
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a IS NOT NULL").fetchone()[0] == 3
        # Try And
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a='2000-10-01' and b ='2000-01-01'").fetchone()[0] == 0
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a ='2010-01-01' and b = '2000-10-01' and c = '2010-01-01'").fetchone()[0] == 1
        # Try Or
        assert duckdb_conn.execute("SELECT count(*) from testarrow where a = '2010-01-01' or b ='2000-01-01'").fetchone()[0] == 2
    def test_filter_pushdown_no_projection(self,duckdb_cursor):
        if not can_run:
            return
        duckdb_conn = duckdb.connect()
        duckdb_conn.execute("CREATE TABLE test (a INTEGER, b INTEGER, c INTEGER)")
        duckdb_conn.execute("INSERT INTO test VALUES (1,1,1),(10,10,10),(100,10,100),(NULL,NULL,NULL)")
        duck_tbl = duckdb_conn.table("test")
        arrow_table = duck_tbl.arrow()
        duckdb_conn.register("testarrowtable",arrow_table)
        assert duckdb_conn.execute("SELECT * FROM testarrowtable where a =1").fetchall() == [(1, 1, 1)]
        arrow_dataset = ds.dataset(arrow_table)
        duckdb_conn.register("testarrowdataset",arrow_dataset)
        assert duckdb_conn.execute("SELECT * FROM testarrowdataset where a =1").fetchall() == [(1, 1, 1)]
    def test_filter_pushdown_2145(self,duckdb_cursor):
        if not can_run:
            return
        date1 = pd.date_range("2018-01-01", "2018-12-31", freq="B")
        df1 = pd.DataFrame(np.random.randn(date1.shape[0], 5), columns=list("ABCDE"))
        df1["date"] = date1
        date2 = pd.date_range("2019-01-01", "2019-12-31", freq="B")
        df2 = pd.DataFrame(np.random.randn(date2.shape[0], 5), columns=list("ABCDE"))
        df2["date"] = date2
        pq.write_table(pa.table(df1), "data1.parquet")
        pq.write_table(pa.table(df2), "data2.parquet")
        table = pq.ParquetDataset(["data1.parquet", "data2.parquet"]).read()
        con = duckdb.connect()
        con.register("testarrow",table)
        output_df = duckdb.arrow(table).filter("date > '2019-01-01'").df()
        expected_df = duckdb.from_parquet("data*.parquet").filter("date > '2019-01-01'").df()
        pd.testing.assert_frame_equal(expected_df, output_df)
        os.remove("data1.parquet")
        os.remove("data2.parquet")
| 51.866667 | 276 | 0.61432 | 1,883 | 13,226 | 4.20871 | 0.073818 | 0.11735 | 0.152303 | 0.165426 | 0.849464 | 0.834827 | 0.810599 | 0.767192 | 0.749401 | 0.744606 | 0 | 0.087178 | 0.22985 | 13,226 | 254 | 277 | 52.070866 | 0.69085 | 0.033343 | 0 | 0.386905 | 0 | 0.10119 | 0.377415 | 0.052615 | 0 | 0 | 0 | 0 | 0.345238 | 1 | 0.059524 | false | 0 | 0.053571 | 0 | 0.172619 | 0.011905 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bea52b92c66f4c9a3c1605806962ba8defe77909 | 79,987 | py | Python | encoder/audio_encoders/config/audio_encoders.py | Microchip-MPLAB-Harmony/audio | 0aef4f742c3a0e6a79d179019e257712b84df467 | [
"0BSD"
] | 10 | 2019-03-19T23:00:12.000Z | 2021-03-18T07:43:33.000Z | encoder/audio_encoders/config/audio_encoders.py | Microchip-MPLAB-Harmony/audio | 0aef4f742c3a0e6a79d179019e257712b84df467 | [
"0BSD"
] | 6 | 2019-11-06T19:22:17.000Z | 2021-11-24T12:35:40.000Z | encoder/audio_encoders/config/audio_encoders.py | Microchip-MPLAB-Harmony/audio | 0aef4f742c3a0e6a79d179019e257712b84df467 | [
"0BSD"
] | 4 | 2019-06-12T05:57:31.000Z | 2021-05-23T08:38:32.000Z |
# coding: utf-8
##############################################################################
# Copyright (C) 2018 Microchip Technology Inc. and its subsidiaries.
#
# Subject to your compliance with these terms, you may use Microchip software
# and any derivatives exclusively with Microchip products. It is your
# responsibility to comply with third party license terms applicable to your
# use of third party software (including open source software) that may
# accompany Microchip software.
#
# THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER
# EXPRESS, IMPLIED OR STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED
# WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A
# PARTICULAR PURPOSE.
#
# IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE,
# INCIDENTAL OR CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND
# WHATSOEVER RELATED TO THE SOFTWARE, HOWEVER CAUSED, EVEN IF MICROCHIP HAS
# BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE FORESEEABLE. TO THE
# FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS IN
# ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY,
# THAT YOU HAVE PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE.
##############################################################################
import os
import sys
src_ext = ('.c')
hdr_ext = ('.h')
lib_ext = ('.a')
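# Each table entry is a 4-tuple: (symbol prefix, source path relative to this module, file name, destination path in the generated project).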
pcmTable = [("LIB_", "pcm/", "pcm_enc.h", "audio/encoder/audio_encoders/pcm"),
("LIB_", "pcm/", "pcm_enc.c", "audio/encoder/audio_encoders/pcm")]
adpcmTable = [("LIB_", "adpcm/", "adpcm_enc.h", "audio/encoder/audio_encoders/adpcm"),
("LIB_", "adpcm/", "adpcm_enc.c", "audio/encoder/audio_encoders/adpcm")]
opusTable = [("LIB_", "opus/", "opus_enc.c", "audio/encoder/audio_encoders/opus"),
("LIB_", "opus/", "opus_enc.h", "audio/encoder/audio_encoders/opus"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "analysis.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "mlp.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "mlp_data.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_compare.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_decoder.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_demo.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_encoder.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_multistream.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_multistream_decoder.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_multistream_encoder.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "repacketizer.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "repacketizer_demo.c", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "analysis.h", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "mlp.h", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "opus_private.h", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/src/", "tansig_table.h", "audio/decoder/audio_decoders/opus/src/src"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "A2NLSF.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "ana_filt_bank_1.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "API.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "biquad_alt.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "bwexpander.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "bwexpander_32.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "check_control_input.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "CNG.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "code_signs.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "control.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "control_audio_bandwidth.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "control_codec.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "control_SNR.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "debug.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "debug.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decoder_set_fs.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_core.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_frame.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_indices.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_parameters.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_pitch.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "decode_pulses.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "dec_API.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "define.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "encode_indices.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "encode_pulses.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "enc_API.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "errors.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "gain_quant.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "HP_variable_cutoff.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "init_decoder.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "init_encoder.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "Inlines.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "inner_prod_aligned.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "interpolate.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "lin2log.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "log2lin.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "LPC_analysis_filter.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "LPC_inv_pred_gain.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "LP_variable_cutoff.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "MacroCount.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "MacroDebug.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "macros.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "main.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF2A.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_decode.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_del_dec_quant.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_encode.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_stabilize.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_unpack.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_VQ.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NLSF_VQ_weights_laroia.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NSQ.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "NSQ_del_dec.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "pitch_est_defines.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "pitch_est_tables.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "PLC.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "PLC.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "process_NLSFs.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "quant_LTP_gains.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_down2.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_down2_3.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_private.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_private_AR2.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_private_down_FIR.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_private_IIR_FIR.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_private_up2_HQ.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_rom.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_rom.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "resampler_structs.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "shell_coder.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "sigm_Q15.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "SigProc_FIX.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "sort.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_decode_pred.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_encode_pred.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_find_predictor.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_LR_to_MS.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_MS_to_LR.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "stereo_quant_pred.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "structs.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "sum_sqr_shift.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_gain.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_LTP.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_NLSF_CB_NB_MB.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_NLSF_CB_WB.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_other.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_pitch_lag.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tables_pulses_per_block.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "table_LSF_cos.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "tuning_parameters.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "typedef.h", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "VAD.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/", "VQ_WMat_EC.c", "audio/decoder/audio_decoders/opus/src/silk"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "main_sse.h", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "NSQ_del_dec_sse.c", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "NSQ_sse.c", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "SigProc_FIX_sse.h", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "VAD_sse.c", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "VQ_WMat_EC_sse.c", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/x86/", "x86_silk_map.c", "audio/decoder/audio_decoders/opus/src/silk/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/mips/", "macros_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/mips/", "NSQ_del_dec_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/mips/", "sigproc_fix_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "apply_sine_window_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "autocorrelation_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "burg_modified_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "bwexpander_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "corrMatrix_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "encode_frame_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "energy_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "find_LPC_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "find_LTP_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "find_pitch_lags_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "find_pred_coefs_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "inner_product_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "k2a_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "levinsondurbin_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "LPC_analysis_filter_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "LPC_inv_pred_gain_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "LTP_analysis_filter_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "LTP_scale_ctrl_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "main_FLP.h", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "noise_shape_analysis_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "pitch_analysis_core_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "prefilter_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "process_gains_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "regularize_correlations_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "residual_energy_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "scale_copy_vector_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "scale_vector_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "schur_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "SigProc_FLP.h", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "solve_LS_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "sort_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "structs_FLP.h", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "warped_autocorrelation_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/float/", "wrappers_FLP.c", "audio/decoder/audio_decoders/opus/src/silk/float"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "apply_sine_window_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "autocorr_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "burg_modified_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "corrMatrix_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "encode_frame_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "find_LPC_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "find_LTP_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "find_pitch_lags_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "find_pred_coefs_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "k2a_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "k2a_Q16_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "LTP_analysis_filter_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "LTP_scale_ctrl_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "main_FIX.h", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "noise_shape_analysis_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "pitch_analysis_core_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "prefilter_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "process_gains_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "regularize_correlations_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "residual_energy16_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "residual_energy_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "schur64_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "schur_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "solve_LS_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "structs_FIX.h", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "vector_ops_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/", "warped_autocorrelation_FIX.c", "audio/decoder/audio_decoders/opus/src/silk/fixed"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/x86/", "burg_modified_FIX_sse.c", "audio/decoder/audio_decoders/opus/src/silk/fixed/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/x86/", "prefilter_FIX_sse.c", "audio/decoder/audio_decoders/opus/src/silk/fixed/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/x86/", "vector_ops_FIX_sse.c", "audio/decoder/audio_decoders/opus/src/silk/fixed/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/mips/", "noise_shape_analysis_FIX_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/fixed/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/mips/", "prefilter_FIX_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/fixed/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/fixed/mips/", "warped_autocorrelation_FIX_mipsr1.h", "audio/decoder/audio_decoders/opus/src/silk/fixed/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/arm/", "macros_armv4.h", "audio/decoder/audio_decoders/opus/src/silk/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/arm/", "macros_armv5e.h", "audio/decoder/audio_decoders/opus/src/silk/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/arm/", "SigProc_FIX_armv4.h", "audio/decoder/audio_decoders/opus/src/silk/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/silk/arm/", "SigProc_FIX_armv5e.h", "audio/decoder/audio_decoders/opus/src/silk/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "arch.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "bands.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "bands.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt_decoder.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt_encoder.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt_lpc.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "celt_lpc.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "cpu_support.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "cwrs.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "cwrs.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "ecintrin.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entcode.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entcode.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entdec.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entdec.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entenc.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "entenc.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "fixed_debug.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "fixed_generic.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "float_cast.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "kiss_fft.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "kiss_fft.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "laplace.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "laplace.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "mathops.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "mathops.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "mdct.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "mdct.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "mfrngcod.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "modes.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "modes.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "opus_custom_demo.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "os_support.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "pitch.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "pitch.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "quant_bands.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "quant_bands.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "rate.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "rate.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "stack_alloc.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "static_modes_fixed.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "static_modes_fixed_arm_ne10.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "static_modes_float.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "static_modes_float_arm_ne10.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "vq.c", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "vq.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/", "_kiss_fft_guts.h", "audio/decoder/audio_decoders/opus/src/celt"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "celt_lpc_sse.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "celt_lpc_sse.h", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "pitch_sse.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "pitch_sse.h", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "pitch_sse2.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "pitch_sse4_1.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "x86cpu.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "x86cpu.h", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/x86/", "x86_celt_map.c", "audio/decoder/audio_decoders/opus/src/celt/x86"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_cwrs32.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_dft.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_entropy.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_laplace.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_mathops.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_mdct.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_rotation.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/tests/", "test_unit_types.c", "audio/decoder/audio_decoders/opus/src/celt/tests"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "celt_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "fixed_generic_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "kiss_fft_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "mdct_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "pitch_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/mips/", "vq_mipsr1.h", "audio/decoder/audio_decoders/opus/src/celt/mips"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "armcpu.c", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "armcpu.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "arm_celt_map.c", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "celt_ne10_fft.c", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "celt_ne10_mdct.c", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "celt_neon_intr.c", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "fft_arm.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "fixed_armv4.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "fixed_armv5e.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "kiss_fft_armv4.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "kiss_fft_armv5e.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "mdct_arm.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/src/celt/arm/", "pitch_arm.h", "audio/decoder/audio_decoders/opus/src/celt/arm"),
# ("LIB_", "../../decoder/audio_decoders/opus/include/", "opus.h", "audio/decoder/audio_decoders/opus/include"),
# ("LIB_", "../../decoder/audio_decoders/opus/include/", "opus_custom.h", "audio/decoder/audio_decoders/opus/include"),
# ("LIB_", "../../decoder/audio_decoders/opus/include/", "opus_defines.h", "audio/decoder/audio_decoders/opus/include"),
# ("LIB_", "../../decoder/audio_decoders/opus/include/", "opus_multistream.h", "audio/decoder/audio_decoders/opus/include"),
# ("LIB_", "../../decoder/audio_decoders/opus/include/", "opus_types.h", "audio/decoder/audio_decoders/opus/include"),
]
speexTable = [("LIB_", "speex/", "speex_enc.c", "audio/encoder/audio_encoders/speex"),
("LIB_", "speex/", "speex_enc.h", "audio/encoder/audio_encoders/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "arch.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "bits.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "cb_search.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "cb_search.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "cb_search_arm4.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "cb_search_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "cb_search_sse.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_10_16_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_10_32_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_20_32_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_5_256_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_5_64_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "exc_8_128_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fftwrap.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "filters.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "filters.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "filters_arm4.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "filters_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "filters_sse.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fixed_arm4.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fixed_arm5e.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fixed_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fixed_debug.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "fixed_generic.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "gain_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "gain_table_lbr.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "hexc_10_32_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "hexc_table.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "high_lsp_tables.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "kiss_fft.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "kiss_fft.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "kiss_fftr.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "kiss_fftr.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lpc.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lpc.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lpc_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lsp.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lsp.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lsp_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "lsp_tables_nb.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "ltp.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "ltp.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "ltp_arm4.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "ltp_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "ltp_sse.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "math_approx.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "misc_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "modes.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "modes.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "modes_wb.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "nb_celp.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "nb_celp.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "os_support.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "quant_lsp.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "quant_lsp.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "quant_lsp_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "sb_celp.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "sb_celp.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "smallft.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "smallft.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "speex.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "speex_callbacks.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "speex_header.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "stack_alloc.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "stereo.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "testenc.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "testenc_uwb.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "testenc_wb.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vbr.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vbr.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vorbis_psy.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vorbis_psy.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vq.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vq.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vq_arm4.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vq_bfin.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "vq_sse.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "window.c", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/libspeex/", "_kiss_fft_guts.h", "audio/decoder/audio_decoders/speex/libspeex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_bits.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_callbacks.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_config.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_config_types.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_header.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_stereo.h", "audio/decoder/audio_decoders/speex/include/speex"),
# ("LIB_", "../../decoder/audio_decoders/speex/include/speex/", "speex_types.h", "audio/decoder/audio_decoders/speex/include/speex"),
]
encoderTable = [("LIB_", "./", "encoder.h", "audio/encoder/audio_encoders")]
oggCntnrTable= [("LIB_", "../audio_containers/lib_ogg_1_3_2/include/ogg/", "ogg.h", "audio/encoder/audio_containers/lib_ogg_1_3_2/include/ogg"),
("LIB_", "../audio_containers/lib_ogg_1_3_2/include/ogg/", "os_types.h", "audio/encoder/audio_containers/lib_ogg_1_3_2/include/ogg"),
("LIB_", "../audio_containers/include/", "ogg_format_container.h", "audio/encoder/audio_containers/include")]
wavCntnrTable= [("LIB_", "../audio_containers/include/", "wav_format_container.h", "audio/encoder/audio_containers/include")]
ftlTable = [("LIB_", "../templates/", "audio_encoder_config.h.ftl", "audio/encoder/audio_containers/include"),
# ("LIB_", "../audio_encoders/templates/", "encoder.c.ftl", "audio/encoder/audio_containers/include"),
]
oggFtlTable = [("LIB_", "../audio_containers/templates/", "ogg_format_container.c.ftl", "audio/encoder/audio_containers")]
wavFtlTable = [("LIB_", "../audio_containers/templates/", "wav_format_container.c.ftl", "audio/encoder/audio_containers")]
# PCM
def enablePcmEncoderFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in pcmTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enablePcmEncoder(symbol, event):
    enablePcmEncoderFiles(symbol.getComponent(), event["value"]==True)
    setEncoderType(symbol.getComponent())
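
# Editor's note -- a worked example of the symbol-ID construction used above
# (and by the other enable*Files functions below); this is an illustration,
# not generated output: for the speexTable entry
# ("LIB_", "speex/", "speex_enc.c", ...), srcPath "speex/" becomes "SPEEX_",
# the base file name becomes "SPEEX_ENC", and the ".c" extension selects the
# type "SOURCE", so the file symbol is "LIB_SPEEX_SPEEX_ENC_SOURCE".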
# ADPCM
def enableAdpcmEncoderFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in adpcmTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enableAdpcmEncoder(symbol, event):
    enableAdpcmEncoderFiles(symbol.getComponent(), event["value"]==True)
    setEncoderType(symbol.getComponent())
# SPEEX
def enableSpeexEncoderFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in speexTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enableSpeexEncoder(symbol, event):
    enableSpeexEncoderFiles(symbol.getComponent(), event["value"]==True)
    setEncoderType(symbol.getComponent())
# OPUS
def enableOpusEncoderFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in opusTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enableOpusEncoder(symbol, event):
    enableOpusEncoderFiles(symbol.getComponent(), event["value"]==True)
    setEncoderType(symbol.getComponent())
# OGG
def enableOggContainerFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in oggCntnrTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")
    for fileSymbol, srcPath, file, destPath in oggFtlTable:
        # Set type
        baseFileName1 = os.path.splitext(file)[0] # Strip the .ftl extension
        baseFileName = os.path.splitext(baseFileName1)[0]
        ext = os.path.splitext(baseFileName1)[-1].lower()
        print("baseFileName1: " + baseFileName1 + ", baseFileName: " + baseFileName + ", ext: " + ext)
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enableOggContainer(symbol, event):
    enableOggContainerFiles(symbol.getComponent(), event["value"]==True)
# WAV
def enableWavContainerFiles(component, enable):
    for fileSymbol, srcPath, file, destPath in wavCntnrTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Generate file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")
    for fileSymbol, srcPath, file, destPath in wavFtlTable:
        # Set type
        baseFileName1 = os.path.splitext(file)[0] # Strip the .ftl extension
        baseFileName = os.path.splitext(baseFileName1)[0]
        ext = os.path.splitext(baseFileName1)[-1].lower()
        print("baseFileName1: " + baseFileName1 + ", baseFileName: " + baseFileName + ", ext: " + ext)
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec("component.getSymbolByID(\"" + symbol + "\").setEnabled(enable)")

def enableWavContainer(symbol, event):
    enableWavContainerFiles(symbol.getComponent(), event["value"]==True)
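
# Editor's sketch (not part of the original script): the enable*Files
# functions above all repeat the same table walk. Assuming the MCC/Harmony
# component API already used in this file, they could share one hypothetical
# helper; note it also drops the exec() in favor of a direct call:
#
# def enableTableFiles(component, table, enable):
#     for fileSymbol, srcPath, file, destPath in table:
#         baseFileName = os.path.splitext(file)[0]
#         ext = os.path.splitext(file)[-1].lower()
#         if ext in src_ext:
#             type = "SOURCE"
#         elif ext in hdr_ext:
#             type = "HEADER"
#         else:
#             type = "IMPORTANT"
#         symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
#         component.getSymbolByID(symbol).setEnabled(enable)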
def setEncoderType(component):
    # PCM/WAV
    if component.getSymbolByID("CONFIG_USE_PCM_ENCODER").getValue():
        component.getSymbolByID("CONFIG_AUDIO_ENCODER_TYPE").setValue("PCM", True)
        # component.getSymbolByID("CONFIG_USE_ADPCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_SPEEX_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_OPUS_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setEnabled(True)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setReadOnly(False)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setReadOnly(True)
    # ADPCM/WAV
    elif component.getSymbolByID("CONFIG_USE_ADPCM_ENCODER").getValue():
        component.getSymbolByID("CONFIG_AUDIO_ENCODER_TYPE").setValue("ADPCM", True)
        # component.getSymbolByID("CONFIG_USE_PCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_SPEEX_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_OPUS_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setEnabled(True)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setReadOnly(False)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setReadOnly(True)
    # SPEEX/OGG
    elif component.getSymbolByID("CONFIG_USE_SPEEX_ENCODER").getValue():
        component.getSymbolByID("CONFIG_AUDIO_ENCODER_TYPE").setValue("SPEEX", True)
        # component.getSymbolByID("CONFIG_USE_ADPCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_PCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_OPUS_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setReadOnly(True)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setEnabled(True)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setReadOnly(False)
    # OPUS/OGG
    elif component.getSymbolByID("CONFIG_USE_OPUS_ENCODER").getValue():
        component.getSymbolByID("CONFIG_AUDIO_ENCODER_TYPE").setValue("OPUS", True)
        # component.getSymbolByID("CONFIG_USE_ADPCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_SPEEX_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_PCM_ENCODER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setEnabled(False)
        # component.getSymbolByID("CONFIG_USE_WAV_CONTAINER").setReadOnly(True)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setEnabled(True)
        # component.getSymbolByID("CONFIG_USE_OGG_CONTAINER").setReadOnly(False)
    # str = component.getSymbolByID("CONFIG_AUDIO_ENCODER_TYPE").getValue()
    # print(str)
def instantiateComponent(audioEncoderComponent):
    CONFIG_USE_ENCODER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_ENCODER", None)
    CONFIG_USE_ENCODER.setVisible(False)
    CONFIG_USE_ENCODER.setDefaultValue(True)

    CONFIG_AUDIO_ENCODER_TYPE = audioEncoderComponent.createStringSymbol("CONFIG_AUDIO_ENCODER_TYPE", None)
    CONFIG_AUDIO_ENCODER_TYPE.setVisible(False)
    CONFIG_AUDIO_ENCODER_TYPE.setDefaultValue("PCM")

    CONFIG_USE_PCM_ENCODER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_PCM_ENCODER", None)
    CONFIG_USE_PCM_ENCODER.setVisible(True)
    CONFIG_USE_PCM_ENCODER.setLabel("Enable PCM Encoder")
    CONFIG_USE_PCM_ENCODER.setDefaultValue(True)
    CONFIG_USE_PCM_ENCODER.setDependencies(enablePcmEncoder, ["CONFIG_USE_PCM_ENCODER"])

    CONFIG_USE_ADPCM_ENCODER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_ADPCM_ENCODER", None)
    CONFIG_USE_ADPCM_ENCODER.setVisible(True)
    CONFIG_USE_ADPCM_ENCODER.setLabel("Enable ADPCM Encoder")
    CONFIG_USE_ADPCM_ENCODER.setDefaultValue(False)
    CONFIG_USE_ADPCM_ENCODER.setDependencies(enableAdpcmEncoder, ["CONFIG_USE_ADPCM_ENCODER"])

    CONFIG_USE_SPEEX_ENCODER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_SPEEX_ENCODER", None)
    CONFIG_USE_SPEEX_ENCODER.setVisible(False)
    CONFIG_USE_SPEEX_ENCODER.setLabel("Enable SPEEX Encoder")
    CONFIG_USE_SPEEX_ENCODER.setDefaultValue(False)
    CONFIG_USE_SPEEX_ENCODER.setDependencies(enableSpeexEncoder, ["CONFIG_USE_SPEEX_ENCODER"])

    CONFIG_USE_OPUS_ENCODER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_OPUS_ENCODER", None)
    CONFIG_USE_OPUS_ENCODER.setVisible(False)
    CONFIG_USE_OPUS_ENCODER.setLabel("Enable OPUS Encoder")
    CONFIG_USE_OPUS_ENCODER.setDefaultValue(False)
    CONFIG_USE_OPUS_ENCODER.setDependencies(enableOpusEncoder, ["CONFIG_USE_OPUS_ENCODER"])

    AUDIO_FILE_CONTAINERS = audioEncoderComponent.createMenuSymbol("AUDIO_FILE_CONTAINERS", None)
    AUDIO_FILE_CONTAINERS.setVisible(True)
    AUDIO_FILE_CONTAINERS.setLabel("Audio File Containers")

    CONFIG_USE_WAV_CONTAINER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_WAV_CONTAINER", AUDIO_FILE_CONTAINERS)
    CONFIG_USE_WAV_CONTAINER.setVisible(True)
    CONFIG_USE_WAV_CONTAINER.setLabel("Enable WAV Container")
    CONFIG_USE_WAV_CONTAINER.setDefaultValue(True)
    CONFIG_USE_WAV_CONTAINER.setDependencies(enableWavContainer, ["CONFIG_USE_WAV_CONTAINER"])

    CONFIG_USE_OGG_CONTAINER = audioEncoderComponent.createBooleanSymbol("CONFIG_USE_OGG_CONTAINER", AUDIO_FILE_CONTAINERS)
    CONFIG_USE_OGG_CONTAINER.setVisible(False)
    CONFIG_USE_OGG_CONTAINER.setLabel("Enable OGG Container")
    CONFIG_USE_OGG_CONTAINER.setDefaultValue(False)
    CONFIG_USE_OGG_CONTAINER.setDependencies(enableOggContainer, ["CONFIG_USE_OGG_CONTAINER"])

    ############################################################################
    #### Code Generation ####
    ############################################################################
    configName = Variables.get("__CONFIGURATION_NAME") # e.g. "default"
    Log.writeInfoMessage("Audio Encoders instantiated")

    for fileSymbol, srcPath, file, destPath in pcmTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + file + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_encoders\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(CONFIG_USE_PCM_ENCODER.getValue() == True)")

    for fileSymbol, srcPath, file, destPath in adpcmTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + file + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_encoders\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(CONFIG_USE_ADPCM_ENCODER.getValue() == True)")

    # for fileSymbol, srcPath, file, destPath in speexTable:
    #     # Set type
    #     baseFileName = os.path.splitext(file)[0]
    #     ext = os.path.splitext(file)[-1].lower()
    #     if ext in src_ext:
    #         type = "SOURCE"
    #     elif ext in hdr_ext:
    #         type = "HEADER"
    #     else:
    #         type = "IMPORTANT"
    #     # Create unique file symbol
    #     symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
    #     exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
    #     exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
    #     exec(symbol + ".setOutputName(\"" + file + "\")")
    #     exec(symbol + ".setDestPath(\"" + destPath + "\")")
    #     exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_encoders\")")
    #     exec(symbol + ".setType(\"" + type + "\")")
    #     exec(symbol + ".setEnabled(CONFIG_USE_SPEEX_ENCODER.getValue() == True)")

    # for fileSymbol, srcPath, file, destPath in opusTable:
    #     # Set type
    #     baseFileName = os.path.splitext(file)[0]
    #     ext = os.path.splitext(file)[-1].lower()
    #     if ext in src_ext:
    #         type = "SOURCE"
    #     elif ext in hdr_ext:
    #         type = "HEADER"
    #     else:
    #         type = "IMPORTANT"
    #     # Create unique file symbol
    #     symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
    #     exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
    #     exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
    #     exec(symbol + ".setOutputName(\"" + file + "\")")
    #     exec(symbol + ".setDestPath(\"" + destPath + "\")")
    #     exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_encoders\")")
    #     exec(symbol + ".setType(\"" + type + "\")")
    #     exec(symbol + ".setEnabled(CONFIG_USE_OPUS_ENCODER.getValue() == True)")

    # for fileSymbol, srcPath, file, destPath in oggCntnrTable:
    #     # Set type
    #     baseFileName = os.path.splitext(file)[0]
    #     ext = os.path.splitext(file)[-1].lower()
    #     if ext in src_ext:
    #         type = "SOURCE"
    #     elif ext in hdr_ext:
    #         type = "HEADER"
    #     else:
    #         type = "IMPORTANT"
    #     # Create unique file symbol
    #     symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
    #     exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
    #     exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
    #     exec(symbol + ".setOutputName(\"" + file + "\")")
    #     exec(symbol + ".setDestPath(\"" + destPath + "\")")
    #     exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_containers\")")
    #     exec(symbol + ".setType(\"" + type + "\")")
    #     exec(symbol + ".setEnabled(CONFIG_USE_OGG_CONTAINER.getValue() == True)")

    for fileSymbol, srcPath, file, destPath in wavCntnrTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + file + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_containers\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(CONFIG_USE_WAV_CONTAINER.getValue() == True)")

    for fileSymbol, srcPath, file, destPath in encoderTable:
        # Set type
        baseFileName = os.path.splitext(file)[0]
        ext = os.path.splitext(file)[-1].lower()
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + file + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder/audio_encoders\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(True)")

    for fileSymbol, srcPath, file, destPath in ftlTable:
        # Set type
        baseFileName1 = os.path.splitext(file)[0] # Strip the .ftl extension
        baseFileName = os.path.splitext(baseFileName1)[0]
        ext = os.path.splitext(baseFileName1)[-1].lower()
        #print("baseFileName1: " + baseFileName1 + ", baseFileName: " + baseFileName + ", ext: " + ext)
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + baseFileName1 + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(True)")
        exec(symbol + ".setMarkup(True)")

    # for fileSymbol, srcPath, file, destPath in oggFtlTable:
    #     # Set type
    #     baseFileName1 = os.path.splitext(file)[0] # Strip the .ftl extension
    #     baseFileName = os.path.splitext(baseFileName1)[0]
    #     ext = os.path.splitext(baseFileName1)[-1].lower()
    #     #print("baseFileName1: " + baseFileName1 + ", baseFileName: " + baseFileName + ", ext: " + ext)
    #     if ext in src_ext:
    #         type = "SOURCE"
    #     elif ext in hdr_ext:
    #         type = "HEADER"
    #     else:
    #         type = "IMPORTANT"
    #     # Create unique file symbol
    #     symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
    #     exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
    #     exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
    #     exec(symbol + ".setOutputName(\"" + baseFileName1 + "\")")
    #     exec(symbol + ".setDestPath(\"" + destPath + "\")")
    #     exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder\")")
    #     exec(symbol + ".setType(\"" + type + "\")")
    #     exec(symbol + ".setEnabled(CONFIG_USE_OGG_CONTAINER.getValue() == True)")
    #     exec(symbol + ".setMarkup(True)")

    for fileSymbol, srcPath, file, destPath in wavFtlTable:
        # Set type
        baseFileName1 = os.path.splitext(file)[0] # Strip the .ftl extension
        baseFileName = os.path.splitext(baseFileName1)[0]
        ext = os.path.splitext(baseFileName1)[-1].lower()
        #print("baseFileName1: " + baseFileName1 + ", baseFileName: " + baseFileName + ", ext: " + ext)
        if ext in src_ext:
            type = "SOURCE"
        elif ext in hdr_ext:
            type = "HEADER"
        else:
            type = "IMPORTANT"
        # Create unique file symbol
        symbol = fileSymbol + srcPath.replace("/", "_").replace(".", "").upper() + baseFileName.upper() + "_" + type.upper()
        exec(symbol + " = audioEncoderComponent.createFileSymbol(\"" + symbol + "\", None)")
        exec(symbol + ".setSourcePath(\"" + srcPath + file + "\")")
        exec(symbol + ".setOutputName(\"" + baseFileName1 + "\")")
        exec(symbol + ".setDestPath(\"" + destPath + "\")")
        exec(symbol + ".setProjectPath(\"config/" + configName + "/audio/encoder\")")
        exec(symbol + ".setType(\"" + type + "\")")
        exec(symbol + ".setEnabled(CONFIG_USE_WAV_CONTAINER.getValue() == True)")
        exec(symbol + ".setMarkup(True)")
    # if("PIC32" in Variables.get("__PROCESSOR")):
    #     CONFIG_USE_SPEEX_ENCODER.setVisible(True)
    #     CONFIG_USE_OPUS_ENCODER.setVisible(True)
    #     CONFIG_USE_OGG_CONTAINER.setVisible(True)
# --- fury/ui/tests/test_helpers.py (repo: SunTzunami/fury, license: BSD-3-Clause) ---
"""Test helper functions."""

import numpy.testing as npt
from fury import window, ui
from fury.ui.helpers import clip_overflow, wrap_overflow, check_overflow

def test_clip_overflow():
    text = ui.TextBlock2D(text="", position=(50, 50), color=(1, 0, 0))
    rectangle = ui.Rectangle2D(position=(50, 50), size=(100, 50))
    sm = window.ShowManager()
    sm.scene.add(rectangle, text)

    text.message = "Hello"
    clip_overflow(text, rectangle.size[0])
    npt.assert_equal("Hello", text.message)

    text.message = "Hello wassup"
    clip_overflow(text, rectangle.size[0])
    npt.assert_equal("Hello was...", text.message)

    text.message = "A very very long message to clip text overflow"
    clip_overflow(text, rectangle.size[0])
    npt.assert_equal("A very ve...", text.message)

    text.message = "Hello"
    clip_overflow(text, rectangle.size[0], 'left')
    npt.assert_equal("Hello", text.message)

    text.message = "Hello wassup"
    clip_overflow(text, rectangle.size[0], 'left')
    npt.assert_equal("...lo wassup", text.message)

    text.message = "A very very long message to clip text overflow"
    clip_overflow(text, rectangle.size[0], 'left')
    npt.assert_equal("... overflow", text.message)

    text.message = "A very very long message to clip text overflow"
    clip_overflow(text, rectangle.size[0], 'LeFT')
    npt.assert_equal("... overflow", text.message)

    text.message = "A very very long message to clip text overflow"
    clip_overflow(text, rectangle.size[0], 'RigHT')
    npt.assert_equal("A very ve...", text.message)

    npt.assert_raises(ValueError, clip_overflow,
                      text, rectangle.size[0], 'middle')


def test_wrap_overflow():
    text = ui.TextBlock2D(text="", position=(50, 50), color=(1, 0, 0))
    rectangle = ui.Rectangle2D(position=(50, 50), size=(100, 50))
    sm = window.ShowManager()
    sm.scene.add(rectangle, text)

    text.message = "Hello"
    wrap_overflow(text, rectangle.size[0])
    npt.assert_equal("Hello", text.message)

    text.message = "Hello wassup"
    wrap_overflow(text, rectangle.size[0])
    npt.assert_equal("Hello wassu\np", text.message)

    text.message = "A very very long message to clip text overflow"
    wrap_overflow(text, rectangle.size[0])
    npt.assert_equal("A very very\n long mess\nage to clip \ntext overflo\nw",
                     text.message)

    text.message = "A very very long message to clip text overflow"
    wrap_overflow(text, 0)
    npt.assert_equal(text.message, "")

    wrap_overflow(text, -2*text.size[0])
    npt.assert_equal(text.message, "")


def test_check_overflow():
    text = ui.TextBlock2D(text="", position=(50, 50), color=(1, 0, 0))
    rectangle = ui.Rectangle2D(position=(50, 50), size=(100, 50))
    sm = window.ShowManager()
    sm.scene.add(rectangle, text)

    text.message = "A very very long message to clip text overflow"

    overflow_idx = check_overflow(text, rectangle.size[0], '~')
    npt.assert_equal(10, overflow_idx)
    npt.assert_equal('A very ver~', text.message)
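
# Editor's note (assumption, not from the source): these look like pytest-style
# tests, so from a checkout with fury installed they would typically run as:
#
#   pytest fury/ui/tests/test_helpers.py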
# --- Mag/MagModule.py (repo: Praashoo7/MiniProjects, license: MIT) ---
# There is a simpler way to do this.
def MagModule():
    while True:
        n=int(input("Enter the Number of Card : "))
        if n!=0:
            n1=int(input("Again? : "))
        else:
            print("INVALID")
            break
        if n1!=0:
            n2=int(input("Enter Again : "))
        else:
            if n==1:
                E=8
                print("Your Number is : ",E)
            elif n==2:
                E=4
                print("Your Number is : ",E)
            elif n==3:
                E=2
                print("Your Number is : ",E)
            else:
                E=1
                print("Your Number is : ",E)
            break
        if n2!=0:
            n3=int(input("Enter Again : "))
        else:
            if n==1 and n1==2:
                E=12
                print("Your Number is : ",E)
            elif n==1 and n1==3:
                E=10
                print("Your Number is : ",E)
            elif n==1 and n1==4:
                E=9
                print("Your Number is : ",E)
            elif n==2 and n1==1:
                E=12
                print("Your Number is : ",E)
            elif n==2 and n1==3:
                E=6
                print("Your Number is : ",E)
            elif n==2 and n1==4:
                E=5
                print("Your Number is : ",E)
            elif n==3 and n1==1:
                E=10
                print("Your Number is : ",E)
            elif n==3 and n1==2:
                E=6
                print("Your Number is : ",E)
            elif n==3 and n1==4:
                E=3
                print("Your Number is : ",E)
            elif n==4 and n1==1:
                E=9
                print("Your Number is : ",E)
            elif n==4 and n1==2:
                E=5
                print("Your Number is : ",E)
            else:
                E=3
                print("Your Number is : ",E)
            break
        if n3!=0:
            if n==1 and n1==2 and n2==3 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==1 and n1==2 and n2==4 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==1 and n1==3 and n2==2 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==1 and n1==3 and n2==4 and n3==2:
                E=15
                print("Your Number is : ",E)
            elif n==1 and n1==4 and n2==2 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==1 and n1==4 and n2==3 and n3==2:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==1 and n2==3 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==1 and n2==4 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==3 and n2==1 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==3 and n2==4 and n3==1:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==4 and n2==1 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==2 and n1==4 and n2==3 and n3==1:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==1 and n2==2 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==1 and n2==4 and n3==2:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==2 and n2==1 and n3==4:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==2 and n2==4 and n3==1:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==4 and n2==1 and n3==2:
                E=15
                print("Your Number is : ",E)
            elif n==3 and n1==4 and n2==2 and n3==1:
                E=15
                print("Your Number is : ",E)
            elif n==4 and n1==1 and n2==2 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==4 and n1==1 and n2==3 and n3==2:
                E=15
                print("Your Number is : ",E)
            elif n==4 and n1==2 and n2==1 and n3==3:
                E=15
                print("Your Number is : ",E)
            elif n==4 and n1==2 and n2==3 and n3==1:
                E=15
                print("Your Number is : ",E)
            elif n==4 and n1==3 and n2==1 and n3==2:
                E=15
                print("Your Number is : ",E)
            else:
                E=15
                print("Your Number is : ",E)
            break
        else:
            if n==1 and n1==2 and n2==3:
                E=14
                print("Your Number is : ",E)
            elif n==1 and n1==3 and n2==2:
                E=14
                print("Your Number is : ",E)
            elif n==1 and n1==4 and n2==2:
                E=13
                print("Your Number is : ",E)
            elif n==1 and n1==2 and n2==4:
                E=13
                print("Your Number is : ",E)
            elif n==1 and n1==3 and n2==4:
                E=11
                print("Your Number is : ",E)
            elif n==1 and n1==4 and n2==3:
                E=11
                print("Your Number is : ",E)
            elif n==2 and n1==3 and n2==1:
                E=14
                print("Your Number is : ",E)
            elif n==3 and n1==2 and n2==1:
                E=14
                print("Your Number is : ",E)
            elif n==4 and n1==2 and n2==1:
                E=13
                print("Your Number is : ",E)
            elif n==2 and n1==4 and n2==1:
                E=13
                print("Your Number is : ",E)
            elif n==3 and n1==4 and n2==1:
                E=11
                print("Your Number is : ",E)
            elif n==4 and n1==3 and n2==1:
                E=11
                print("Your Number is : ",E)
            elif n==2 and n1==1 and n2==3:
                E=14
                print("Your Number is : ",E)
            elif n==3 and n1==1 and n2==2:
                E=14
                print("Your Number is : ",E)
            elif n==4 and n1==1 and n2==2:
                E=13
                print("Your Number is : ",E)
            elif n==2 and n1==1 and n2==4:
                E=13
                print("Your Number is : ",E)
            elif n==3 and n1==1 and n2==4:
                E=11
                print("Your Number is : ",E)
            elif n==4 and n1==1 and n2==3:
                E=11
                print("Your Number is : ",E)
            elif n==2 and n1==4 and n2==3:
                E=7
                print("Your Number is : ",E)
            elif n==2 and n1==3 and n2==4:
                E=7
                print("Your Number is : ",E)
            elif n==3 and n1==2 and n2==4:
                E=7
                print("Your Number is : ",E)
            elif n==4 and n1==2 and n2==3:
                E=7
                print("Your Number is : ",E)
            elif n==4 and n1==3 and n2==2:
                E=7
                print("Your Number is : ",E)
            else:
                E=7
                print("Your Number is : ",E)
            break
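
# Editor's sketch, prompted by the comment at the top of this file: every
# branch above reduces to summing one power of two per distinct card chosen
# (card 1 -> 8, card 2 -> 4, card 3 -> 2, card 4 -> 1). A simpler hypothetical
# version for distinct cards 1-4 (the function name is an assumption):
#
# def MagModuleSimple():
#     cards = set()
#     while True:
#         n = int(input("Enter the Number of Card (0 to stop): "))
#         if n == 0:
#             break
#         cards.add(n)
#     if not cards:
#         print("INVALID")
#     else:
#         print("Your Number is : ", sum(2 ** (4 - c) for c in cards))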
# --- pysnark/nobackend.py (repo: gxavier38/pysnark, license: "Cube") ---
class NoneObject:
    def __add__(self, other): return NoneObject()
    def __sub__(self, other): return NoneObject()
    def __mul__(self, other): return NoneObject()
    def __neg__(self): return NoneObject()


def privval(val):
    return NoneObject()


def pubval(val):
    return NoneObject()


def zero():
    return NoneObject()


def one():
    return NoneObject()


def fieldinverse(val):
    return 0


def get_modulus():
    return 10000


def add_constraint(v, w, y):
    pass


def prove():
    pass
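
# Editor's note (assumption about intent, not stated in the source): this
# module mirrors a PySNARK backend interface with no-op stand-ins, so code
# written against a real backend still imports and runs, e.g.:
#
#   x = privval(3)
#   y = x + pubval(4)        # arithmetic on NoneObject values stays a NoneObject
#   add_constraint(x, y, y)  # recorded nowhere; prove() is likewise a no-op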
# --- darling_ansible/python_venv/lib/python3.7/site-packages/oci/core/compute_management_client.py (repo: revnav/sandbox, license: Apache-2.0) ---
# coding: utf-8
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from __future__ import absolute_import
from oci._vendor import requests # noqa: F401
from oci._vendor import six
from oci import retry # noqa: F401
from oci.base_client import BaseClient
from oci.config import get_config_value_or_default, validate_config
from oci.signer import Signer
from oci.util import Sentinel
from .models import core_type_mapping
missing = Sentinel("Missing")
class ComputeManagementClient(object):
"""
API covering the [Networking](/iaas/Content/Network/Concepts/overview.htm),
[Compute](/iaas/Content/Compute/Concepts/computeoverview.htm), and
[Block Volume](/iaas/Content/Block/Concepts/overview.htm) services. Use this API
to manage resources such as virtual cloud networks (VCNs), compute instances, and
block storage volumes.
"""
    def __init__(self, config, **kwargs):
        """
        Creates a new service client

        :param dict config:
            Configuration keys and values as per `SDK and Tool Configuration <https://docs.cloud.oracle.com/Content/API/Concepts/sdkconfig.htm>`__.
            The :py:meth:`~oci.config.from_file` method can be used to load configuration from a file. Alternatively, a ``dict`` can be passed. You can validate_config
            the dict using :py:meth:`~oci.config.validate_config`

        :param str service_endpoint: (optional)
            The endpoint of the service to call using this client. For example ``https://iaas.us-ashburn-1.oraclecloud.com``. If this keyword argument is
            not provided then it will be derived using the region in the config parameter. You should only provide this keyword argument if you have an explicit
            need to specify a service endpoint.

        :param timeout: (optional)
            The connection and read timeouts for the client. The default values are connection timeout 10 seconds and read timeout 60 seconds. This keyword argument can be provided
            as a single float, in which case the value provided is used for both the read and connection timeouts, or as a tuple of two floats. If
            a tuple is provided then the first value is used as the connection timeout and the second value as the read timeout.
        :type timeout: float or tuple(float, float)

        :param signer: (optional)
            The signer to use when signing requests made by the service client. The default is to use a :py:class:`~oci.signer.Signer` based on the values
            provided in the config parameter.

            One use case for this parameter is for `Instance Principals authentication <https://docs.cloud.oracle.com/Content/Identity/Tasks/callingservicesfrominstances.htm>`__
            by passing an instance of :py:class:`~oci.auth.signers.InstancePrincipalsSecurityTokenSigner` as the value for this keyword argument
        :type signer: :py:class:`~oci.signer.AbstractBaseSigner`

        :param obj retry_strategy: (optional)
            A retry strategy to apply to all calls made by this service client (i.e. at the client level). There is no retry strategy applied by default.
            Retry strategies can also be applied at the operation level by passing a ``retry_strategy`` keyword argument as part of calling the operation.
            Any value provided at the operation level will override whatever is specified at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
            is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
        """
        validate_config(config, signer=kwargs.get('signer'))
        if 'signer' in kwargs:
            signer = kwargs['signer']
        else:
            signer = Signer(
                tenancy=config["tenancy"],
                user=config["user"],
                fingerprint=config["fingerprint"],
                private_key_file_location=config.get("key_file"),
                pass_phrase=get_config_value_or_default(config, "pass_phrase"),
                private_key_content=config.get("key_content")
            )

        base_client_init_kwargs = {
            'regional_client': True,
            'service_endpoint': kwargs.get('service_endpoint'),
            'timeout': kwargs.get('timeout'),
            'base_path': '/20160918',
            'service_endpoint_template': 'https://iaas.{region}.{secondLevelDomain}',
            'skip_deserialization': kwargs.get('skip_deserialization', False)
        }
        self.base_client = BaseClient("compute_management", config, signer, core_type_mapping, **base_client_init_kwargs)
        self.retry_strategy = kwargs.get('retry_strategy')
        self._config = config
        self._kwargs = kwargs
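
    # Editor's usage sketch (not from the source; the config-file path and
    # profile name are assumptions): the client is typically constructed from
    # an SDK config file, e.g.:
    #
    #   import oci
    #   config = oci.config.from_file("~/.oci/config", "DEFAULT")
    #   client = oci.core.ComputeManagementClient(config)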
    def attach_load_balancer(self, instance_pool_id, attach_load_balancer_details, **kwargs):
        """
        Attach a load balancer to the instance pool.

        :param str instance_pool_id: (required)
            The `OCID`__ of the instance pool.

            __ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm

        :param AttachLoadBalancerDetails attach_load_balancer_details: (required)
            Load balancer being attached

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or
            server error without risk of executing that same action again. Retry tokens expire after 24
            hours, but can be invalidated before then due to conflicting operations (for example, if a resource
            has been deleted and purged from the system, then a retry of the original creation request
            may be rejected).

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
            parameter to the value of the etag from a previous GET or POST response for that resource. The resource
            will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
            is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
        :rtype: :class:`~oci.response.Response`
        """
        resource_path = "/instancePools/{instancePoolId}/actions/attachLoadBalancer"
        method = "POST"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "retry_strategy",
            "opc_retry_token",
            "if_match"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "attach_load_balancer got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "instancePoolId": instance_pool_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "if-match": kwargs.get("if_match", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.retry_strategy
        if kwargs.get('retry_strategy'):
            retry_strategy = kwargs.get('retry_strategy')

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=attach_load_balancer_details,
                response_type="InstancePool")
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=attach_load_balancer_details,
                response_type="InstancePool")
def change_cluster_network_compartment(self, cluster_network_id, change_cluster_network_compartment_details, **kwargs):
"""
Moves a cluster network into a different compartment within the same tenancy. For
information about moving resources between compartments, see
`Moving Resources to a Different Compartment`__.
When you move a cluster network to a different compartment, associated resources such as the instances
in the cluster network, boot volumes, and VNICs are not moved.
__ https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingcompartments.htm#moveRes
:param str cluster_network_id: (required)
The `OCID`__ of the cluster network.
__ https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm
:param ChangeClusterNetworkCompartmentDetails change_cluster_network_compartment_details: (required)
Request to change the compartment of the given cluster network.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique identifier for the request.
If you need to contact Oracle about a particular request, please provide the request ID.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
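
Example (an illustrative sketch; both OCIDs are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    details = oci.core.models.ChangeClusterNetworkCompartmentDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID')  # target compartment
    compute_mgmt.change_cluster_network_compartment(
        'ocid1.clusternetwork.oc1..exampleuniqueID', details)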
"""
resource_path = "/clusterNetworks/{clusterNetworkId}/actions/changeCompartment"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match",
"opc_request_id",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"change_cluster_network_compartment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"clusterNetworkId": cluster_network_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing),
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_cluster_network_compartment_details)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_cluster_network_compartment_details)
def change_instance_configuration_compartment(self, instance_configuration_id, change_instance_configuration_compartment_details, **kwargs):
"""
Moves an instance configuration into a different compartment within the same tenancy.
For information about moving resources between compartments, see
`Moving Resources to a Different Compartment`__.
When you move an instance configuration to a different compartment, associated resources such as
instance pools are not moved.
**Important:** Most of the properties for an existing instance configuration, including the compartment,
cannot be modified after you create the instance configuration. Although you can move an instance configuration
to a different compartment, you will not be able to use the instance configuration to manage instance pools
in the new compartment. If you want to update an instance configuration to point to a different compartment,
you should instead create a new instance configuration in the target compartment using
`CreateInstanceConfiguration`__.
__ https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingcompartments.htm#moveRes
__ https://docs.cloud.oracle.com/iaas/api/#/en/iaas/20160918/InstanceConfiguration/CreateInstanceConfiguration
:param str instance_configuration_id: (required)
The OCID of the instance configuration.
:param ChangeInstanceConfigurationCompartmentDetails change_instance_configuration_compartment_details: (required)
Request to change the compartment of the given instance configuration.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique identifier for the request.
If you need to contact Oracle about a particular request, please provide the request ID.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instanceConfigurations/{instanceConfigurationId}/actions/changeCompartment"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match",
"opc_request_id",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"change_instance_configuration_compartment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instanceConfigurationId": instance_configuration_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing),
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_instance_configuration_compartment_details)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_instance_configuration_compartment_details)
def change_instance_pool_compartment(self, instance_pool_id, change_instance_pool_compartment_details, **kwargs):
"""
Moves an instance pool into a different compartment within the same tenancy. For
information about moving resources between compartments, see
`Moving Resources to a Different Compartment`__.
When you move an instance pool to a different compartment, associated resources such as the instances in
the pool, boot volumes, VNICs, and autoscaling configurations are not moved.
__ https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingcompartments.htm#moveRes
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param ChangeInstancePoolCompartmentDetails change_instance_pool_compartment_details: (required)
Request to change the compartment of the given instance pool.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique identifier for the request.
If you need to contact Oracle about a particular request, please provide the request ID.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
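
Example (an illustrative sketch that passes ``if_match`` so the move only
proceeds against the version previously read; the OCIDs are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    pool_id = 'ocid1.instancepool.oc1..exampleuniqueID'
    # Read the current etag, then make the update conditional on it.
    etag = compute_mgmt.get_instance_pool(pool_id).headers['etag']
    details = oci.core.models.ChangeInstancePoolCompartmentDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID')
    compute_mgmt.change_instance_pool_compartment(pool_id, details, if_match=etag)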
"""
resource_path = "/instancePools/{instancePoolId}/actions/changeCompartment"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match",
"opc_request_id",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"change_instance_pool_compartment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing),
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_instance_pool_compartment_details)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_instance_pool_compartment_details)
def create_cluster_network(self, create_cluster_network_details, **kwargs):
"""
Creates a cluster network. For more information about cluster networks, see
`Managing Cluster Networks`__.
__ https://docs.cloud.oracle.com/iaas/Content/Compute/Tasks/managingclusternetworks.htm
:param CreateClusterNetworkDetails create_cluster_network_details: (required)
Cluster network creation details.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.ClusterNetwork`
:rtype: :class:`~oci.response.Response`
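
Example (an illustrative sketch; the nested model values, availability
domain, and OCIDs are assumed placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    details = oci.core.models.CreateClusterNetworkDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID',
        instance_pools=[
            oci.core.models.CreateClusterNetworkInstancePoolDetails(
                instance_configuration_id='ocid1.instanceconfiguration.oc1..exampleuniqueID',
                size=2)],
        placement_configuration=oci.core.models.ClusterNetworkPlacementConfigurationDetails(
            availability_domain='Uocm:PHX-AD-1',
            primary_subnet_id='ocid1.subnet.oc1..exampleuniqueID'))
    cluster_network = compute_mgmt.create_cluster_network(details).data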
"""
resource_path = "/clusterNetworks"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"create_cluster_network got unknown kwargs: {!r}".format(extra_kwargs))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_cluster_network_details,
response_type="ClusterNetwork")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_cluster_network_details,
response_type="ClusterNetwork")
def create_instance_configuration(self, create_instance_configuration, **kwargs):
"""
Creates an instance configuration. An instance configuration is a template that defines the
settings to use when creating Compute instances.
:param CreateInstanceConfigurationBase create_instance_configuration: (required)
Instance configuration creation details.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstanceConfiguration`
:rtype: :class:`~oci.response.Response`
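
Example (an illustrative sketch defining a configuration from scratch; the
shape, image, and OCIDs are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    launch = oci.core.models.InstanceConfigurationLaunchInstanceDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID',
        shape='VM.Standard2.1',
        create_vnic_details=oci.core.models.InstanceConfigurationCreateVnicDetails(
            subnet_id='ocid1.subnet.oc1..exampleuniqueID'),
        source_details=oci.core.models.InstanceConfigurationInstanceSourceViaImageDetails(
            image_id='ocid1.image.oc1..exampleuniqueID'))
    details = oci.core.models.CreateInstanceConfigurationDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID',
        instance_details=oci.core.models.ComputeInstanceDetails(launch_details=launch))
    instance_config = compute_mgmt.create_instance_configuration(details).data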
"""
resource_path = "/instanceConfigurations"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"create_instance_configuration got unknown kwargs: {!r}".format(extra_kwargs))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_instance_configuration,
response_type="InstanceConfiguration")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_instance_configuration,
response_type="InstanceConfiguration")
def create_instance_pool(self, create_instance_pool_details, **kwargs):
"""
Creates an instance pool.
:param CreateInstancePoolDetails create_instance_pool_details: (required)
Instance pool creation details.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
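
Example (an illustrative sketch; the availability domain, size, and OCIDs
are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    details = oci.core.models.CreateInstancePoolDetails(
        compartment_id='ocid1.compartment.oc1..exampleuniqueID',
        instance_configuration_id='ocid1.instanceconfiguration.oc1..exampleuniqueID',
        placement_configurations=[
            oci.core.models.CreateInstancePoolPlacementConfigurationDetails(
                availability_domain='Uocm:PHX-AD-1',
                primary_subnet_id='ocid1.subnet.oc1..exampleuniqueID')],
        size=3)
    pool = compute_mgmt.create_instance_pool(details).data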
"""
resource_path = "/instancePools"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"create_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_instance_pool_details,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_instance_pool_details,
response_type="InstancePool")
def delete_instance_configuration(self, instance_configuration_id, **kwargs):
"""
Deletes an instance configuration.
:param str instance_configuration_id: (required)
The OCID of the instance configuration.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
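
Example (an illustrative sketch using ``if_match`` so the delete fails if
the configuration changed since it was read; the OCID is a placeholder)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    config_id = 'ocid1.instanceconfiguration.oc1..exampleuniqueID'
    etag = compute_mgmt.get_instance_configuration(config_id).headers['etag']
    compute_mgmt.delete_instance_configuration(config_id, if_match=etag)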
"""
resource_path = "/instanceConfigurations/{instanceConfigurationId}"
method = "DELETE"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_instance_configuration got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instanceConfigurationId": instance_configuration_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
def detach_load_balancer(self, instance_pool_id, detach_load_balancer_details, **kwargs):
"""
Detaches a load balancer from the instance pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param DetachLoadBalancerDetails detach_load_balancer_details: (required)
Load balancer being detached.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
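
Example (an illustrative sketch that also disables retries for this one
call; the OCIDs and backend set name are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    details = oci.core.models.DetachLoadBalancerDetails(
        load_balancer_id='ocid1.loadbalancer.oc1..exampleuniqueID',
        backend_set_name='example-backend-set')
    pool = compute_mgmt.detach_load_balancer(
        'ocid1.instancepool.oc1..exampleuniqueID', details,
        retry_strategy=oci.retry.NoneRetryStrategy()).data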
"""
resource_path = "/instancePools/{instancePoolId}/actions/detachLoadBalancer"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"detach_load_balancer got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=detach_load_balancer_details,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=detach_load_balancer_details,
response_type="InstancePool")
def get_cluster_network(self, cluster_network_id, **kwargs):
"""
Gets information about the specified cluster network.
:param str cluster_network_id: (required)
The `OCID`__ of the cluster network.
__ https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.ClusterNetwork`
:rtype: :class:`~oci.response.Response`
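
Example (an illustrative sketch using the convenience default retry
strategy; the OCID is a placeholder)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    cluster_network = compute_mgmt.get_cluster_network(
        'ocid1.clusternetwork.oc1..exampleuniqueID',
        retry_strategy=oci.retry.DEFAULT_RETRY_STRATEGY).data
    print(cluster_network.lifecycle_state)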
"""
resource_path = "/clusterNetworks/{clusterNetworkId}"
method = "GET"
expected_kwargs = ["retry_strategy"]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_cluster_network got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"clusterNetworkId": cluster_network_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="ClusterNetwork")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="ClusterNetwork")
def get_instance_configuration(self, instance_configuration_id, **kwargs):
"""
Gets the specified instance configuration.
:param str instance_configuration_id: (required)
The OCID of the instance configuration.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstanceConfiguration`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instanceConfigurations/{instanceConfigurationId}"
method = "GET"
expected_kwargs = ["retry_strategy"]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_instance_configuration got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instanceConfigurationId": instance_configuration_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstanceConfiguration")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstanceConfiguration")
def get_instance_pool(self, instance_pool_id, **kwargs):
"""
Gets the specified instance pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}"
method = "GET"
expected_kwargs = ["retry_strategy"]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
def get_instance_pool_load_balancer_attachment(self, instance_pool_id, instance_pool_load_balancer_attachment_id, **kwargs):
"""
Gets information about a load balancer that is attached to the specified instance pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str instance_pool_load_balancer_attachment_id: (required)
The OCID of the load balancer attachment.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePoolLoadBalancerAttachment`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/loadBalancerAttachments/{instancePoolLoadBalancerAttachmentId}"
method = "GET"
expected_kwargs = ["retry_strategy"]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_instance_pool_load_balancer_attachment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id,
"instancePoolLoadBalancerAttachmentId": instance_pool_load_balancer_attachment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePoolLoadBalancerAttachment")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePoolLoadBalancerAttachment")
def launch_instance_configuration(self, instance_configuration_id, instance_configuration, **kwargs):
"""
Launches an instance from an instance configuration.
If the instance configuration does not include all of the parameters that are
required to launch an instance, such as the availability domain and subnet ID, you must
provide these parameters when you launch an instance from the instance configuration.
For more information, see the :class:`InstanceConfiguration`
resource.
:param str instance_configuration_id: (required)
The OCID of the instance configuration.
:param InstanceConfigurationInstanceDetails instance_configuration: (required)
Instance details to use when launching the instance from the instance configuration.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.Instance`
:rtype: :class:`~oci.response.Response`
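
Example (an illustrative sketch supplying launch parameters the stored
configuration may omit; the availability domain and OCIDs are placeholders)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    overrides = oci.core.models.ComputeInstanceDetails(
        launch_details=oci.core.models.InstanceConfigurationLaunchInstanceDetails(
            availability_domain='Uocm:PHX-AD-1',
            create_vnic_details=oci.core.models.InstanceConfigurationCreateVnicDetails(
                subnet_id='ocid1.subnet.oc1..exampleuniqueID')))
    instance = compute_mgmt.launch_instance_configuration(
        'ocid1.instanceconfiguration.oc1..exampleuniqueID', overrides).data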
"""
resource_path = "/instanceConfigurations/{instanceConfigurationId}/actions/launch"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"launch_instance_configuration got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instanceConfigurationId": instance_configuration_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=instance_configuration,
response_type="Instance")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=instance_configuration,
response_type="Instance")
def list_cluster_network_instances(self, compartment_id, cluster_network_id, **kwargs):
"""
Lists the instances in the specified cluster network.
:param str compartment_id: (required)
The `OCID`__ of the compartment.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str cluster_network_id: (required)
The `OCID`__ of the cluster network.
__ https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm
:param str display_name: (optional)
A filter to return only resources that match the given display name exactly.
:param int limit: (optional)
For list pagination. The maximum number of results per page, or items to return in a paginated
\"List\" call. For important details about how pagination works, see
`List Pagination`__.
Example: `50`
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str page: (optional)
For list pagination. The value of the `opc-next-page` response header from the previous \"List\"
call. For important details about how pagination works, see
`List Pagination`__.
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str sort_by: (optional)
The field to sort by. You can provide one sort order (`sortOrder`). Default order for
TIMECREATED is descending. Default order for DISPLAYNAME is ascending. The DISPLAYNAME
sort order is case sensitive.
**Note:** In general, some \"List\" operations (for example, `ListInstances`) let you
optionally filter by availability domain if the scope of the resource type is within a
single availability domain. If you call one of these \"List\" operations without specifying
an availability domain, the resources are grouped by availability domain, then sorted.
Allowed values are: "TIMECREATED", "DISPLAYNAME"
:param str sort_order: (optional)
The sort order to use, either ascending (`ASC`) or descending (`DESC`). The DISPLAYNAME sort order
is case sensitive.
Allowed values are: "ASC", "DESC"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type list of :class:`~oci.core.models.InstanceSummary`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/clusterNetworks/{clusterNetworkId}/instances"
method = "GET"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"display_name",
"limit",
"page",
"sort_by",
"sort_order"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_cluster_network_instances got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"clusterNetworkId": cluster_network_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_by' in kwargs:
sort_by_allowed_values = ["TIMECREATED", "DISPLAYNAME"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortBy": kwargs.get("sort_by", missing),
"sortOrder": kwargs.get("sort_order", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceSummary]")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceSummary]")
def list_cluster_networks(self, compartment_id, **kwargs):
"""
Lists the cluster networks in the specified compartment.
:param str compartment_id: (required)
The `OCID`__ of the compartment.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str display_name: (optional)
A filter to return only resources that match the given display name exactly.
:param int limit: (optional)
For list pagination. The maximum number of results per page, or items to return in a paginated
\"List\" call. For important details about how pagination works, see
`List Pagination`__.
Example: `50`
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str page: (optional)
For list pagination. The value of the `opc-next-page` response header from the previous \"List\"
call. For important details about how pagination works, see
`List Pagination`__.
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str sort_by: (optional)
The field to sort by. You can provide one sort order (`sortOrder`). Default order for
TIMECREATED is descending. Default order for DISPLAYNAME is ascending. The DISPLAYNAME
sort order is case sensitive.
**Note:** In general, some \"List\" operations (for example, `ListInstances`) let you
optionally filter by availability domain if the scope of the resource type is within a
single availability domain. If you call one of these \"List\" operations without specifying
an availability domain, the resources are grouped by availability domain, then sorted.
Allowed values are: "TIMECREATED", "DISPLAYNAME"
:param str sort_order: (optional)
The sort order to use, either ascending (`ASC`) or descending (`DESC`). The DISPLAYNAME sort order
is case sensitive.
Allowed values are: "ASC", "DESC"
:param str lifecycle_state: (optional)
A filter to only return resources that match the given lifecycle
state. The state value is case-insensitive.
Allowed values are: "PROVISIONING", "SCALING", "STARTING", "STOPPING", "TERMINATING", "STOPPED", "TERMINATED", "RUNNING"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type list of :class:`~oci.core.models.ClusterNetworkSummary`
:rtype: :class:`~oci.response.Response`
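
Example (an illustrative sketch; :py:func:`oci.pagination.list_call_get_all_results`
follows the ``opc-next-page`` tokens and aggregates every page; the
compartment OCID is a placeholder)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    networks = oci.pagination.list_call_get_all_results(
        compute_mgmt.list_cluster_networks,
        'ocid1.compartment.oc1..exampleuniqueID',
        lifecycle_state='RUNNING').data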
"""
resource_path = "/clusterNetworks"
method = "GET"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"display_name",
"limit",
"page",
"sort_by",
"sort_order",
"lifecycle_state"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_cluster_networks got unknown kwargs: {!r}".format(extra_kwargs))
if 'sort_by' in kwargs:
sort_by_allowed_values = ["TIMECREATED", "DISPLAYNAME"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["PROVISIONING", "SCALING", "STARTING", "STOPPING", "TERMINATING", "STOPPED", "TERMINATED", "RUNNING"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortBy": kwargs.get("sort_by", missing),
"sortOrder": kwargs.get("sort_order", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[ClusterNetworkSummary]")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[ClusterNetworkSummary]")
def list_instance_configurations(self, compartment_id, **kwargs):
"""
Lists the instance configurations in the specified compartment.
:param str compartment_id: (required)
The `OCID`__ of the compartment.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param int limit: (optional)
For list pagination. The maximum number of results per page, or items to return in a paginated
\"List\" call. For important details about how pagination works, see
`List Pagination`__.
Example: `50`
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str page: (optional)
For list pagination. The value of the `opc-next-page` response header from the previous \"List\"
call. For important details about how pagination works, see
`List Pagination`__.
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str sort_by: (optional)
The field to sort by. You can provide one sort order (`sortOrder`). Default order for
TIMECREATED is descending. Default order for DISPLAYNAME is ascending. The DISPLAYNAME
sort order is case sensitive.
**Note:** In general, some \"List\" operations (for example, `ListInstances`) let you
optionally filter by availability domain if the scope of the resource type is within a
single availability domain. If you call one of these \"List\" operations without specifying
an availability domain, the resources are grouped by availability domain, then sorted.
Allowed values are: "TIMECREATED", "DISPLAYNAME"
:param str sort_order: (optional)
The sort order to use, either ascending (`ASC`) or descending (`DESC`). The DISPLAYNAME sort order
is case sensitive.
Allowed values are: "ASC", "DESC"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type list of :class:`~oci.core.models.InstanceConfigurationSummary`
:rtype: :class:`~oci.response.Response`
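
Example (an illustrative sketch paginating manually via the
``opc-next-page`` token surfaced as ``next_page`` on the response; the
compartment OCID is a placeholder)::

    import oci

    compute_mgmt = oci.core.ComputeManagementClient(oci.config.from_file())
    compartment_id = 'ocid1.compartment.oc1..exampleuniqueID'
    page = None
    while True:
        response = compute_mgmt.list_instance_configurations(
            compartment_id, limit=50, page=page)
        for summary in response.data:
            print(summary.display_name)
        if not response.has_next_page:
            break
        page = response.next_page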
"""
resource_path = "/instanceConfigurations"
method = "GET"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"limit",
"page",
"sort_by",
"sort_order"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_instance_configurations got unknown kwargs: {!r}".format(extra_kwargs))
if 'sort_by' in kwargs:
sort_by_allowed_values = ["TIMECREATED", "DISPLAYNAME"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortBy": kwargs.get("sort_by", missing),
"sortOrder": kwargs.get("sort_order", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceConfigurationSummary]")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceConfigurationSummary]")
def list_instance_pool_instances(self, compartment_id, instance_pool_id, **kwargs):
"""
Lists the instances in the specified instance pool.
:param str compartment_id: (required)
The `OCID`__ of the compartment.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str display_name: (optional)
A filter to return only resources that match the given display name exactly.
:param int limit: (optional)
For list pagination. The maximum number of results per page, or items to return in a paginated
\"List\" call. For important details about how pagination works, see
`List Pagination`__.
Example: `50`
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str page: (optional)
For list pagination. The value of the `opc-next-page` response header from the previous \"List\"
call. For important details about how pagination works, see
`List Pagination`__.
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str sort_by: (optional)
The field to sort by. You can provide one sort order (`sortOrder`). Default order for
TIMECREATED is descending. Default order for DISPLAYNAME is ascending. The DISPLAYNAME
sort order is case sensitive.
**Note:** In general, some \"List\" operations (for example, `ListInstances`) let you
optionally filter by availability domain if the scope of the resource type is within a
single availability domain. If you call one of these \"List\" operations without specifying
an availability domain, the resources are grouped by availability domain, then sorted.
Allowed values are: "TIMECREATED", "DISPLAYNAME"
:param str sort_order: (optional)
The sort order to use, either ascending (`ASC`) or descending (`DESC`). The DISPLAYNAME sort order
is case sensitive.
Allowed values are: "ASC", "DESC"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type list of :class:`~oci.core.models.InstanceSummary`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/instances"
method = "GET"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"display_name",
"limit",
"page",
"sort_by",
"sort_order"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_instance_pool_instances got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_by' in kwargs:
sort_by_allowed_values = ["TIMECREATED", "DISPLAYNAME"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortBy": kwargs.get("sort_by", missing),
"sortOrder": kwargs.get("sort_order", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceSummary]")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="list[InstanceSummary]")
def list_instance_pools(self, compartment_id, **kwargs):
"""
Lists the instance pools in the specified compartment.
:param str compartment_id: (required)
The `OCID`__ of the compartment.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str display_name: (optional)
A filter to return only resources that match the given display name exactly.
:param int limit: (optional)
For list pagination. The maximum number of results per page, or items to return in a paginated
\"List\" call. For important details about how pagination works, see
`List Pagination`__.
Example: `50`
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str page: (optional)
For list pagination. The value of the `opc-next-page` response header from the previous \"List\"
call. For important details about how pagination works, see
`List Pagination`__.
__ https://docs.cloud.oracle.com/iaas/Content/API/Concepts/usingapi.htm#nine
:param str sort_by: (optional)
The field to sort by. You can provide one sort order (`sortOrder`). Default order for
TIMECREATED is descending. Default order for DISPLAYNAME is ascending. The DISPLAYNAME
sort order is case sensitive.
**Note:** In general, some \"List\" operations (for example, `ListInstances`) let you
optionally filter by availability domain if the scope of the resource type is within a
single availability domain. If you call one of these \"List\" operations without specifying
an availability domain, the resources are grouped by availability domain, then sorted.
Allowed values are: "TIMECREATED", "DISPLAYNAME"
:param str sort_order: (optional)
The sort order to use, either ascending (`ASC`) or descending (`DESC`). The DISPLAYNAME sort order
is case sensitive.
Allowed values are: "ASC", "DESC"
:param str lifecycle_state: (optional)
A filter to only return resources that match the given lifecycle state. The state value is case-insensitive.
Allowed values are: "PROVISIONING", "SCALING", "STARTING", "STOPPING", "TERMINATING", "STOPPED", "TERMINATED", "RUNNING"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type list of :class:`~oci.core.models.InstancePoolSummary`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools"
method = "GET"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"display_name",
"limit",
"page",
"sort_by",
"sort_order",
"lifecycle_state"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_instance_pools got unknown kwargs: {!r}".format(extra_kwargs))
if 'sort_by' in kwargs:
sort_by_allowed_values = ["TIMECREATED", "DISPLAYNAME"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["PROVISIONING", "SCALING", "STARTING", "STOPPING", "TERMINATING", "STOPPED", "TERMINATED", "RUNNING"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortBy": kwargs.get("sort_by", missing),
"sortOrder": kwargs.get("sort_order", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json"
}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[InstancePoolSummary]")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="list[InstancePoolSummary]")
def reset_instance_pool(self, instance_pool_id, **kwargs):
"""
Performs the reset (power off and power on) action on the specified instance pool,
which performs the action on all the instances in the pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/actions/reset"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"reset_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
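            # With retries enabled, inject an opc-retry-token (unless the
            # caller supplied one) so a retried POST cannot run the reset twice.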
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
def softreset_instance_pool(self, instance_pool_id, **kwargs):
"""
Performs the softreset (ACPI shutdown and power on) action on the specified instance pool,
which performs the action on all the instances in the pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/actions/softreset"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"softreset_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
def start_instance_pool(self, instance_pool_id, **kwargs):
"""
Performs the start (power on) action on the specified instance pool,
which performs the action on all the instances in the pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/actions/start"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"start_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
def stop_instance_pool(self, instance_pool_id, **kwargs):
"""
Performs the stop (power off) action on the specified instance pool,
which performs the action on all the instances in the pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}/actions/stop"
method = "POST"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"stop_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="InstancePool")
def terminate_cluster_network(self, cluster_network_id, **kwargs):
"""
Terminates the specified cluster network.
When you delete a cluster network, all of its resources are permanently deleted,
including associated instances and instance pools.
:param str cluster_network_id: (required)
The `OCID`__ of the cluster network.
__ https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/clusterNetworks/{clusterNetworkId}"
method = "DELETE"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"terminate_cluster_network got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"clusterNetworkId": cluster_network_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
def terminate_instance_pool(self, instance_pool_id, **kwargs):
"""
Terminate the specified instance pool.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}"
method = "DELETE"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"terminate_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params)
def update_cluster_network(self, cluster_network_id, update_cluster_network_details, **kwargs):
"""
Updates the specified cluster network. The OCID of the cluster network remains the same.
:param str cluster_network_id: (required)
The `OCID`__ of the cluster network.
__ https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm
:param UpdateClusterNetworkDetails update_cluster_network_details: (required)
Update cluster network
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.ClusterNetwork`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/clusterNetworks/{clusterNetworkId}"
method = "PUT"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_cluster_network got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"clusterNetworkId": cluster_network_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_cluster_network_details,
response_type="ClusterNetwork")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_cluster_network_details,
response_type="ClusterNetwork")
def update_instance_configuration(self, instance_configuration_id, update_instance_configuration_details, **kwargs):
"""
Updates the free-form tags, defined tags, and display name of an instance configuration.
:param str instance_configuration_id: (required)
The OCID of the instance configuration.
:param UpdateInstanceConfigurationDetails update_instance_configuration_details: (required)
Updates the freeFormTags, definedTags, and display name of an instance configuration.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstanceConfiguration`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instanceConfigurations/{instanceConfigurationId}"
method = "PUT"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_instance_configuration got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instanceConfigurationId": instance_configuration_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_instance_configuration_details,
response_type="InstanceConfiguration")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_instance_configuration_details,
response_type="InstanceConfiguration")
def update_instance_pool(self, instance_pool_id, update_instance_pool_details, **kwargs):
"""
Update the specified instance pool.
The OCID of the instance pool remains the same.
:param str instance_pool_id: (required)
The `OCID`__ of the instance pool.
__ https://docs.cloud.oracle.com/Content/General/Concepts/identifiers.htm
:param UpdateInstancePoolDetails update_instance_pool_details: (required)
Update instance pool configuration
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or
server error without risk of executing that same action again. Retry tokens expire after 24
hours, but can be invalidated before then due to conflicting operations (for example, if a resource
has been deleted and purged from the system, then a retry of the original creation request
may be rejected).
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match`
parameter to the value of the etag from a previous GET or POST response for that resource. The resource
will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
is also available. The specifics of the default retry strategy are described `here <https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.core.models.InstancePool`
:rtype: :class:`~oci.response.Response`
"""
resource_path = "/instancePools/{instancePoolId}"
method = "PUT"
# Don't accept unknown kwargs
expected_kwargs = [
"retry_strategy",
"opc_retry_token",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_instance_pool got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"instancePoolId": instance_pool_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.retry_strategy
if kwargs.get('retry_strategy'):
retry_strategy = kwargs.get('retry_strategy')
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_instance_pool_details,
response_type="InstancePool")
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_instance_pool_details,
response_type="InstancePool")
| 47.991031 | 245 | 0.645572 | 15,658 | 128,424 | 5.134564 | 0.033657 | 0.062739 | 0.023732 | 0.005921 | 0.917261 | 0.911427 | 0.901576 | 0.895705 | 0.894324 | 0.887359 | 0 | 0.001267 | 0.275011 | 128,424 | 2,675 | 246 | 48.008972 | 0.862232 | 0.445672 | 0 | 0.85429 | 0 | 0 | 0.167803 | 0.034786 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02145 | false | 0.00074 | 0.006657 | 0 | 0.070266 | 0.00074 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2285bfb1bc777f450a958e2c59c4244f9cd10196 | 194 | py | Python | tests/test_issue334_configure_cmakelists_non_cp1252_encoding.py | oiffrig/scikit-build | 4e2928d93ba275f5cfc3837c174c25e6c4a73ac0 | ["MIT"] | 1 | 2021-12-14T18:49:49.000Z | 2021-12-14T18:49:49.000Z | tests/test_issue334_configure_cmakelists_non_cp1252_encoding.py | oiffrig/scikit-build | 4e2928d93ba275f5cfc3837c174c25e6c4a73ac0 | ["MIT"] | null | null | null | tests/test_issue334_configure_cmakelists_non_cp1252_encoding.py | oiffrig/scikit-build | 4e2928d93ba275f5cfc3837c174c25e6c4a73ac0 | ["MIT"] | 1 | 2021-11-12T01:03:02.000Z | 2021-11-12T01:03:02.000Z |
from . import project_setup_py_test
@project_setup_py_test("issue-334-configure-cmakelist-non-cp1252-encoding", ["install"], disable_languages_test=True)
def test_install_command():
pass
| 24.25 | 117 | 0.798969 | 27 | 194 | 5.37037 | 0.740741 | 0.165517 | 0.193103 | 0.248276 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039548 | 0.087629 | 194 | 7 | 118 | 27.714286 | 0.779661 | 0 | 0 | 0 | 0 | 0 | 0.290155 | 0.253886 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0.25 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
229335f96962c533f4ed681771c2bc4e3a2add00 | 1,764 | py | Python | aviary/scripts/spades_assembly_short.py | julianzaugg/aviary | b99cab48e99cce4afea3c002f4922061afda2977 | ["BSD-3-Clause"] | null | null | null | aviary/scripts/spades_assembly_short.py | julianzaugg/aviary | b99cab48e99cce4afea3c002f4922061afda2977 | ["BSD-3-Clause"] | 3 | 2020-11-14T11:59:24.000Z | 2021-01-11T22:33:26.000Z | aviary/scripts/spades_assembly_short.py | rhysnewell/BinSnek | b99cab48e99cce4afea3c002f4922061afda2977 | ["BSD-3-Clause"] | null | null | null |
import subprocess
import os
if snakemake.config['short_reads_2'] != 'none':
if len(snakemake.config['short_reads_2']) > 1:
        subprocess.Popen(
            "spades.py --memory %s --meta -t %d -o data/spades_assembly -k 21,33,55,81,99,127 %s %s" %
            (snakemake.config["max_memory"], snakemake.threads,
             # join the per-library flag strings; interpolating the raw list
             # into %s would splice its Python repr into the shell command
             " ".join([" ".join(['-pe-1 ' + str(tup[0] + 1), tup[1]]) for tup in enumerate(snakemake.config['short_reads_1'])]),
             " ".join([" ".join(['-pe-2 ' + str(tup[0] + 1), tup[1]]) for tup in enumerate(snakemake.config['short_reads_2'])])),
            shell=True).wait()
else:
subprocess.Popen(
"spades.py --memory %s --meta -t %d -o data/spades_assembly -k 21,33,55,81,99,127 -1 %s -2 %s" %
(snakemake.config["max_memory"], snakemake.threads, " ".join(snakemake.config["short_reads_1"]),
" ".join(snakemake.config["short_reads_2"])), shell=True).wait()
elif snakemake.config['short_reads_1'] != 'none':
    if len(snakemake.config['short_reads_1']) > 1:  # was short_reads_2 (copy-paste slip): only short_reads_1 exists in this branch
        subprocess.Popen(
            "spades.py --memory %s --meta -t %d -o data/spades_assembly -k 21,33,55,81,99,127 %s" %
            (snakemake.config["max_memory"], snakemake.threads,
             # join interleaved-library flags, for the same repr reason as above
             " ".join([" ".join(['-pe-12 ' + str(tup[0] + 1), tup[1]]) for tup in enumerate(snakemake.config['short_reads_1'])])),
            shell=True).wait()
else:
subprocess.Popen(
"spades.py --memory %s --meta -t %d -o data/spades_assembly -k 21,33,55,81,99,127 -12 %s" %
(snakemake.config["max_memory"], snakemake.threads, " ".join(snakemake.config["short_reads_1"])),
shell=True).wait()
subprocess.Popen(
"ln -s data/spades_assembly/scaffolds.fasta data/final_contigs.fasta", shell=True
).wait()
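
# Hedged alternative for the single-library case above: an argv-list invocation
# avoids the shell-quoting pitfalls of %-formatted shell=True strings
# (illustrative, assuming one file per read direction):
#
#     subprocess.run(
#         ["spades.py", "--memory", str(snakemake.config["max_memory"]),
#          "--meta", "-t", str(snakemake.threads),
#          "-o", "data/spades_assembly", "-k", "21,33,55,81,99,127",
#          "-1", snakemake.config["short_reads_1"][0],
#          "-2", snakemake.config["short_reads_2"][0]],
#         check=True)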
| 50.4 | 120 | 0.595805 | 249 | 1,764 | 4.100402 | 0.204819 | 0.205681 | 0.195886 | 0.244858 | 0.877571 | 0.826641 | 0.818805 | 0.818805 | 0.766895 | 0.675808 | 0 | 0.058568 | 0.215986 | 1,764 | 34 | 121 | 51.882353 | 0.679682 | 0 | 0 | 0.466667 | 0 | 0.133333 | 0.35034 | 0.034014 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
22dceec312b756c5c7bdc3aba613d2a7fe3e9a3c | 45,451 | py | Python | tests/components/manual/test_alarm_control_panel.py | pdbogen/home-assistant | e602de55ac09be9ab8cbb354519a1b1b57fbe362 | ["Apache-2.0"] | 2 | 2022-01-24T18:59:56.000Z | 2022-02-04T22:12:48.000Z | tests/components/manual/test_alarm_control_panel.py | pdbogen/home-assistant | e602de55ac09be9ab8cbb354519a1b1b57fbe362 | ["Apache-2.0"] | 1 | 2020-08-27T01:16:43.000Z | 2020-08-27T01:16:43.000Z | tests/components/manual/test_alarm_control_panel.py | pdbogen/home-assistant | e602de55ac09be9ab8cbb354519a1b1b57fbe362 | ["Apache-2.0"] | 1 | 2020-05-24T07:37:49.000Z | 2020-05-24T07:37:49.000Z |
"""The tests for the manual Alarm Control Panel component."""
from datetime import timedelta
from unittest.mock import MagicMock, patch
from homeassistant.components import alarm_control_panel
from homeassistant.components.demo import alarm_control_panel as demo
from homeassistant.const import (
STATE_ALARM_ARMED_AWAY,
STATE_ALARM_ARMED_CUSTOM_BYPASS,
STATE_ALARM_ARMED_HOME,
STATE_ALARM_ARMED_NIGHT,
STATE_ALARM_ARMING,
STATE_ALARM_DISARMED,
STATE_ALARM_PENDING,
STATE_ALARM_TRIGGERED,
)
from homeassistant.core import CoreState, State
from homeassistant.setup import async_setup_component
import homeassistant.util.dt as dt_util
from tests.common import async_fire_time_changed, mock_component, mock_restore_cache
from tests.components.alarm_control_panel import common
CODE = "HELLO_CODE"
async def test_setup_demo_platform(hass):
"""Test setup."""
mock = MagicMock()
add_entities = mock.MagicMock()
await demo.async_setup_platform(hass, {}, add_entities)
assert add_entities.call_count == 1
async def test_arm_home_no_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, CODE)
assert STATE_ALARM_ARMED_HOME == hass.states.get(entity_id).state
async def test_arm_home_no_pending_when_code_not_req(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"code_arm_required": False,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, 0)
assert STATE_ALARM_ARMED_HOME == hass.states.get(entity_id).state
async def test_arm_home_with_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, CODE, entity_id)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
state = hass.states.get(entity_id)
assert state.attributes["next_state"] == STATE_ALARM_ARMED_HOME
future = dt_util.utcnow() + timedelta(seconds=1)
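    # Advance time past the arming window: patch the integration's clock and
    # fire a time-changed event so the ARMING -> ARMED_HOME transition
    # completes without sleeping.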
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_ARMED_HOME
async def test_arm_home_with_invalid_code(hass):
"""Attempt to arm home without a valid code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, CODE + "2")
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_arm_away_no_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE, entity_id)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
async def test_arm_away_no_pending_when_code_not_req(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"code_arm_required": False,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, 0, entity_id)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
async def test_arm_home_with_template_code(hass):
"""Attempt to arm with a template-based code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code_template": '{{ "abc" }}',
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, "abc")
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMED_HOME == state.state
async def test_arm_away_with_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
state = hass.states.get(entity_id)
assert state.attributes["next_state"] == STATE_ALARM_ARMED_AWAY
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_ARMED_AWAY
async def test_arm_away_with_invalid_code(hass):
"""Attempt to arm away without a valid code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE + "2")
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_arm_night_no_pending(hass):
"""Test arm night method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_night(hass, CODE)
assert STATE_ALARM_ARMED_NIGHT == hass.states.get(entity_id).state
async def test_arm_night_no_pending_when_code_not_req(hass):
"""Test arm night method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"code_arm_required": False,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_night(hass, 0)
assert STATE_ALARM_ARMED_NIGHT == hass.states.get(entity_id).state
async def test_arm_night_with_pending(hass):
"""Test arm night method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_night(hass, CODE, entity_id)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
state = hass.states.get(entity_id)
assert state.attributes["next_state"] == STATE_ALARM_ARMED_NIGHT
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_ARMED_NIGHT
# Do not go to the pending state when updating to the same state
await common.async_alarm_arm_night(hass, CODE, entity_id)
assert STATE_ALARM_ARMED_NIGHT == hass.states.get(entity_id).state
async def test_arm_night_with_invalid_code(hass):
"""Attempt to night home without a valid code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_night(hass, CODE + "2")
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_no_pending(hass):
"""Test triggering when no pending submitted method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=60)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
async def test_trigger_with_delay(hass):
"""Test trigger method and switch from pending to triggered."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"delay_time": 1,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert STATE_ALARM_PENDING == state.state
assert STATE_ALARM_TRIGGERED == state.attributes["next_state"]
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert STATE_ALARM_TRIGGERED == state.state
async def test_trigger_zero_trigger_time(hass):
"""Test disabled trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 0,
"trigger_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass)
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_zero_trigger_time_with_pending(hass):
"""Test disabled trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 2,
"trigger_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass)
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"delay_time": 2,
"trigger_time": 3,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
state = hass.states.get(entity_id)
assert state.attributes["next_state"] == STATE_ALARM_TRIGGERED
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_TRIGGERED
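# After trigger_time (3s) elapses, the alarm returns to its previous state (disarmed).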
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_DISARMED
async def test_trigger_with_unused_specific_delay(hass):
"""Test trigger method and switch from pending to triggered."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"delay_time": 5,
"arming_time": 0,
"armed_home": {"delay_time": 10},
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert STATE_ALARM_PENDING == state.state
assert STATE_ALARM_TRIGGERED == state.attributes["next_state"]
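# The armed_home-specific delay does not apply while armed away; the global 5s delay is used.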
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_TRIGGERED
async def test_trigger_with_specific_delay(hass):
"""Test trigger method and switch from pending to triggered."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"delay_time": 10,
"arming_time": 0,
"armed_away": {"delay_time": 1},
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert STATE_ALARM_PENDING == state.state
assert STATE_ALARM_TRIGGERED == state.attributes["next_state"]
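# The armed_away-specific delay_time (1s) overrides the global 10s value.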
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_TRIGGERED
async def test_trigger_with_pending_and_delay(hass):
"""Test trigger method and switch from pending to triggered."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"delay_time": 2,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_PENDING
assert state.attributes["next_state"] == STATE_ALARM_TRIGGERED
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_PENDING
assert state.attributes["next_state"] == STATE_ALARM_TRIGGERED
future += timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_TRIGGERED
async def test_trigger_with_pending_and_specific_delay(hass):
"""Test trigger method and switch from pending to triggered."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"delay_time": 10,
"arming_time": 0,
"armed_away": {"delay_time": 2},
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_PENDING
assert state.attributes["next_state"] == STATE_ALARM_TRIGGERED
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_PENDING
assert state.attributes["next_state"] == STATE_ALARM_TRIGGERED
future += timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_TRIGGERED
async def test_armed_home_with_specific_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 10,
"armed_home": {"arming_time": 2},
}
},
)
entity_id = "alarm_control_panel.test"
await common.async_alarm_arm_home(hass)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_HOME == hass.states.get(entity_id).state
async def test_armed_away_with_specific_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 10,
"armed_away": {"arming_time": 2},
}
},
)
entity_id = "alarm_control_panel.test"
await common.async_alarm_arm_away(hass)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
async def test_armed_night_with_specific_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 10,
"armed_night": {"arming_time": 2},
}
},
)
entity_id = "alarm_control_panel.test"
await common.async_alarm_arm_night(hass)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_NIGHT == hass.states.get(entity_id).state
async def test_trigger_with_specific_pending(hass):
"""Test arm home method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"delay_time": 10,
"disarmed": {"delay_time": 2},
"trigger_time": 3,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
await common.async_alarm_trigger(hass)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
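# The disarmed-specific delay_time (2s) applies instead of the global 10s.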
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_disarm_after_trigger(hass):
"""Test disarm after trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"delay_time": 0,
"disarm_after_trigger": True,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_zero_specific_trigger_time(hass):
"""Test trigger method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"disarmed": {"trigger_time": 0},
"arming_time": 0,
"disarm_after_trigger": True,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
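# Triggering is ignored because trigger_time for the disarmed state is 0.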
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_unused_zero_specific_trigger_time(hass):
"""Test disarm after trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"armed_home": {"trigger_time": 0},
"delay_time": 0,
"disarm_after_trigger": True,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_specific_trigger_time(hass):
"""Test disarm after trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"disarmed": {"trigger_time": 5},
"delay_time": 0,
"disarm_after_trigger": True,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_trigger_with_no_disarm_after_trigger(hass):
"""Test disarm after trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"arming_time": 0,
"delay_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE, entity_id)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
async def test_back_to_back_trigger_with_no_disarm_after_trigger(hass):
"""Test disarm after trigger."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"arming_time": 0,
"delay_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE, entity_id)
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass, entity_id=entity_id)
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_AWAY == hass.states.get(entity_id).state
async def test_disarm_while_pending_trigger(hass):
"""Test disarming while pending state."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"trigger_time": 5,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
await common.async_alarm_disarm(hass, entity_id=entity_id)
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_disarm_during_trigger_with_invalid_code(hass):
"""Test disarming while code is invalid."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"delay_time": 5,
"code": CODE + "2",
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_trigger(hass)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
await common.async_alarm_disarm(hass, entity_id=entity_id)
assert STATE_ALARM_PENDING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=5)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_TRIGGERED == hass.states.get(entity_id).state
async def test_disarm_with_template_code(hass):
"""Attempt to disarm with a valid or invalid template-based code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code_template": '{{ "" if from_state == "disarmed" else "abc" }}',
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
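# The code template requires no code when leaving 'disarmed' and "abc" otherwise.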
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_home(hass, "def")
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMED_HOME == state.state
await common.async_alarm_disarm(hass, "def")
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMED_HOME == state.state
await common.async_alarm_disarm(hass, "abc")
state = hass.states.get(entity_id)
assert STATE_ALARM_DISARMED == state.state
async def test_arm_custom_bypass_no_pending(hass):
"""Test arm custom bypass method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_custom_bypass(hass, CODE)
assert STATE_ALARM_ARMED_CUSTOM_BYPASS == hass.states.get(entity_id).state
async def test_arm_custom_bypass_no_pending_when_code_not_req(hass):
"""Test arm custom bypass method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"code_arm_required": False,
"arming_time": 0,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_custom_bypass(hass, 0)
assert STATE_ALARM_ARMED_CUSTOM_BYPASS == hass.states.get(entity_id).state
async def test_arm_custom_bypass_with_pending(hass):
"""Test arm custom bypass method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_custom_bypass(hass, CODE, entity_id)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
state = hass.states.get(entity_id)
assert state.attributes["next_state"] == STATE_ALARM_ARMED_CUSTOM_BYPASS
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == STATE_ALARM_ARMED_CUSTOM_BYPASS
async def test_arm_custom_bypass_with_invalid_code(hass):
"""Attempt to custom bypass without a valid code."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 1,
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_custom_bypass(hass, CODE + "2")
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
async def test_armed_custom_bypass_with_specific_pending(hass):
"""Test arm custom bypass method."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 10,
"armed_custom_bypass": {"arming_time": 2},
}
},
)
entity_id = "alarm_control_panel.test"
await common.async_alarm_arm_custom_bypass(hass)
assert STATE_ALARM_ARMING == hass.states.get(entity_id).state
future = dt_util.utcnow() + timedelta(seconds=2)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
assert STATE_ALARM_ARMED_CUSTOM_BYPASS == hass.states.get(entity_id).state
async def test_arm_away_after_disabled_disarmed(hass):
"""Test pending state with and without zero trigger time."""
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"code": CODE,
"arming_time": 0,
"delay_time": 1,
"armed_away": {"arming_time": 1},
"disarmed": {"trigger_time": 0},
"disarm_after_trigger": False,
}
},
)
entity_id = "alarm_control_panel.test"
assert STATE_ALARM_DISARMED == hass.states.get(entity_id).state
await common.async_alarm_arm_away(hass, CODE)
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMING == state.state
assert STATE_ALARM_DISARMED == state.attributes["previous_state"]
assert STATE_ALARM_ARMED_AWAY == state.attributes["next_state"]
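# Triggering while still arming from 'disarmed' is ignored (its trigger_time is 0).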
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMING == state.state
assert STATE_ALARM_DISARMED == state.attributes["previous_state"]
assert STATE_ALARM_ARMED_AWAY == state.attributes["next_state"]
future = dt_util.utcnow() + timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert STATE_ALARM_ARMED_AWAY == state.state
await common.async_alarm_trigger(hass, entity_id=entity_id)
state = hass.states.get(entity_id)
assert STATE_ALARM_PENDING == state.state
assert STATE_ALARM_ARMED_AWAY == state.attributes["previous_state"]
assert STATE_ALARM_TRIGGERED == state.attributes["next_state"]
future += timedelta(seconds=1)
with patch(
("homeassistant.components.manual.alarm_control_panel.dt_util.utcnow"),
return_value=future,
):
async_fire_time_changed(hass, future)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert STATE_ALARM_TRIGGERED == state.state
async def test_restore_armed_state(hass):
"""Ensure armed state is restored on startup."""
mock_restore_cache(
hass, (State("alarm_control_panel.test", STATE_ALARM_ARMED_AWAY),)
)
hass.state = CoreState.starting
mock_component(hass, "recorder")
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 0,
"trigger_time": 0,
"disarm_after_trigger": False,
}
},
)
state = hass.states.get("alarm_control_panel.test")
assert state
assert state.state == STATE_ALARM_ARMED_AWAY
async def test_restore_disarmed_state(hass):
"""Ensure disarmed state is restored on startup."""
mock_restore_cache(hass, (State("alarm_control_panel.test", STATE_ALARM_DISARMED),))
hass.state = CoreState.starting
mock_component(hass, "recorder")
assert await async_setup_component(
hass,
alarm_control_panel.DOMAIN,
{
"alarm_control_panel": {
"platform": "manual",
"name": "test",
"arming_time": 0,
"trigger_time": 0,
"disarm_after_trigger": False,
}
},
)
state = hass.states.get("alarm_control_panel.test")
assert state
assert state.state == STATE_ALARM_DISARMED
| 29.667755 | 88 | 0.627775 | 5,279 | 45,451 | 5.058344 | 0.024436 | 0.062914 | 0.105044 | 0.090364 | 0.948882 | 0.944201 | 0.933116 | 0.927836 | 0.924915 | 0.922256 | 0 | 0.003741 | 0.270775 | 45,451 | 1,531 | 89 | 29.687133 | 0.801925 | 0.002618 | 0 | 0.742504 | 0 | 0 | 0.161355 | 0.070155 | 0 | 0 | 0 | 0 | 0.165785 | 1 | 0 | false | 0.014991 | 0.008818 | 0 | 0.008818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a3b336de5d593c904e6bb2ceea9e6e24d23f7f34 | 326 | py | Python | beauty_and_pics/contest_app/models/__init__.py | entpy/beauty-and-pics | 50d9c79c05061dc8594a70871cb9df5920cc4b28 | [
"MIT"
] | null | null | null | beauty_and_pics/contest_app/models/__init__.py | entpy/beauty-and-pics | 50d9c79c05061dc8594a70871cb9df5920cc4b28 | [
"MIT"
] | null | null | null | beauty_and_pics/contest_app/models/__init__.py | entpy/beauty-and-pics | 50d9c79c05061dc8594a70871cb9df5920cc4b28 | [
"MIT"
] | null | null | null | from django.contrib import admin
from contest_app.models.contest_types import Contest_Type
from contest_app.models.contests import Contest
from contest_app.models.metrics import Metric
from contest_app.models.votes import Vote
from contest_app.models.points import Point
from contest_app.models.hall_of_fame import HallOfFame
| 40.75 | 57 | 0.874233 | 51 | 326 | 5.392157 | 0.411765 | 0.24 | 0.305455 | 0.436364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08589 | 326 | 7 | 58 | 46.571429 | 0.922819 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a3b3c1ccfeaf671b28eebb130d2a7e4c76650ff2 | 20,027 | py | Python | tests/components/mqtt_statestream/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 6 | 2016-11-25T06:36:27.000Z | 2021-11-16T11:20:23.000Z | tests/components/mqtt_statestream/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 52 | 2020-07-14T14:12:26.000Z | 2022-03-31T06:24:02.000Z | tests/components/mqtt_statestream/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 3 | 2021-05-18T16:42:18.000Z | 2021-07-19T22:04:21.000Z | """The tests for the MQTT statestream component."""
import pytest
import homeassistant.components.mqtt_statestream as statestream
from homeassistant.core import State
from homeassistant.setup import async_setup_component
from tests.async_mock import ANY, call
from tests.common import mock_state_change_event
@pytest.fixture(autouse=True)
def mock_storage(hass_storage):
"""Autouse hass_storage for the TestCase tests."""
async def add_statestream(
hass,
base_topic=None,
publish_attributes=None,
publish_timestamps=None,
publish_include=None,
publish_exclude=None,
):
"""Add a mqtt_statestream component."""
config = {}
if base_topic:
config["base_topic"] = base_topic
if publish_attributes:
config["publish_attributes"] = publish_attributes
if publish_timestamps:
config["publish_timestamps"] = publish_timestamps
if publish_include:
config["include"] = publish_include
if publish_exclude:
config["exclude"] = publish_exclude
return await async_setup_component(
hass, statestream.DOMAIN, {statestream.DOMAIN: config}
)
async def test_fails_with_no_base(hass, mqtt_mock):
"""Setup should fail if no base_topic is set."""
assert await add_statestream(hass) is False
async def test_setup_succeeds_without_attributes(hass, mqtt_mock):
"""Test the success of the setup with a valid base_topic."""
assert await add_statestream(hass, base_topic="pub")
async def test_setup_succeeds_with_attributes(hass, mqtt_mock):
"""Test setup with a valid base_topic and publish_attributes."""
assert await add_statestream(hass, base_topic="pub", publish_attributes=True)
async def test_state_changed_event_sends_message(hass, mqtt_mock):
"""Test the sending of a new message if event changed."""
e_id = "fake.entity"
base_topic = "pub"
# Add the statestream component for publishing state updates
assert await add_statestream(hass, base_topic=base_topic)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State(e_id, "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
async def test_state_changed_event_sends_message_and_timestamp(hass, mqtt_mock):
"""Test the sending of a message and timestamps if event changed."""
e_id = "another.entity"
base_topic = "pub"
# Add the statestream component for publishing state updates
assert await add_statestream(
hass, base_topic=base_topic, publish_attributes=None, publish_timestamps=True
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State(e_id, "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/another/entity/state
calls = [
call.async_publish("pub/another/entity/state", "on", 1, True),
call.async_publish("pub/another/entity/last_changed", ANY, 1, True),
call.async_publish("pub/another/entity/last_updated", ANY, 1, True),
]
mqtt_mock.async_publish.assert_has_calls(calls, any_order=True)
assert mqtt_mock.async_publish.called
async def test_state_changed_attr_sends_message(hass, mqtt_mock):
"""Test the sending of a new message if attribute changed."""
e_id = "fake.entity"
base_topic = "pub"
# Add the statestream component for publishing state updates
assert await add_statestream(hass, base_topic=base_topic, publish_attributes=True)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
test_attributes = {"testing": "YES", "list": ["a", "b", "c"], "bool": False}
# Set a state of an entity
mock_state_change_event(hass, State(e_id, "off", attributes=test_attributes))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'off' and the JSON-encoded attributes were published to pub/fake/entity/*
calls = [
call.async_publish("pub/fake/entity/state", "off", 1, True),
call.async_publish("pub/fake/entity/testing", '"YES"', 1, True),
call.async_publish("pub/fake/entity/list", '["a", "b", "c"]', 1, True),
call.async_publish("pub/fake/entity/bool", "false", 1, True),
]
mqtt_mock.async_publish.assert_has_calls(calls, any_order=True)
assert mqtt_mock.async_publish.called
async def test_state_changed_event_include_domain(hass, mqtt_mock):
"""Test that filtering on included domain works as expected."""
base_topic = "pub"
incl = {"domains": ["fake"]}
excl = {}
# Add the statestream component for publishing state updates
# Set the filter to allow fake.* items
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake2.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_include_entity(hass, mqtt_mock):
"""Test that filtering on included entity works as expected."""
base_topic = "pub"
incl = {"entities": ["fake.entity"]}
excl = {}
# Add the statestream component for publishing state updates
# Set the filter to include only fake.entity
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_exclude_domain(hass, mqtt_mock):
"""Test that filtering on excluded domain works as expected."""
base_topic = "pub"
incl = {}
excl = {"domains": ["fake2"]}
# Add the statestream component for publishing state updates
# Set the filter to exclude fake2.* items
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake2.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_exclude_entity(hass, mqtt_mock):
"""Test that filtering on excluded entity works as expected."""
base_topic = "pub"
incl = {}
excl = {"entities": ["fake.entity2"]}
# Add the statestream component for publishing state updates
# Set the filter to exclude fake.entity2
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_exclude_domain_include_entity(hass, mqtt_mock):
"""Test filtering with excluded domain and included entity."""
base_topic = "pub"
incl = {"entities": ["fake.entity"]}
excl = {"domains": ["fake"]}
# Add the statestream component for publishing state updates
# Set the filter to exclude the fake domain but include fake.entity
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_include_domain_exclude_entity(hass, mqtt_mock):
"""Test filtering with included domain and excluded entity."""
base_topic = "pub"
incl = {"domains": ["fake"]}
excl = {"entities": ["fake.entity2"]}
# Add the statestream component for publishing state updates
# Set the filter to include the fake domain but exclude fake.entity2
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_include_globs(hass, mqtt_mock):
"""Test that filtering on included glob works as expected."""
base_topic = "pub"
incl = {"entity_globs": ["*.included_*"]}
excl = {}
# Add the statestream component for publishing state updates
# Set the filter to allow *.included_* items
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity with included glob
mock_state_change_event(hass, State("fake2.included_entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake2/included_entity/state
mqtt_mock.async_publish.assert_called_with(
"pub/fake2/included_entity/state", "on", 1, True
)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake2.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_exclude_globs(hass, mqtt_mock):
"""Test that filtering on excluded globs works as expected."""
base_topic = "pub"
incl = {}
excl = {"entity_globs": ["*.excluded_*"]}
# Add the statestream component for publishing state updates
# Set the filter to exclude *.excluded_* items
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included by glob
mock_state_change_event(hass, State("fake.excluded_entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_exclude_domain_globs_include_entity(hass, mqtt_mock):
"""Test filtering with excluded domain and glob and included entity."""
base_topic = "pub"
incl = {"entities": ["fake.entity"]}
excl = {"domains": ["fake"], "entity_globs": ["*.excluded_*"]}
# Add the statestream component for publishing state updates
# Exclude the fake domain and *.excluded_* globs, but include fake.entity
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that doesn't match any filters
mock_state_change_event(hass, State("fake2.included_entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake2/included_entity/state
mqtt_mock.async_publish.assert_called_with(
"pub/fake2/included_entity/state", "on", 1, True
)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included by domain
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included by glob
mock_state_change_event(hass, State("fake.excluded_entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
async def test_state_changed_event_include_domain_globs_exclude_entity(hass, mqtt_mock):
"""Test filtering with included domain and glob and excluded entity."""
base_topic = "pub"
incl = {"domains": ["fake"], "entity_globs": ["*.included_*"]}
excl = {"entities": ["fake.entity2"]}
# Add the statestream component for publishing state updates
# Include the fake domain and *.included_* globs, but exclude fake.entity2
assert await add_statestream(
hass, base_topic=base_topic, publish_include=incl, publish_exclude=excl
)
await hass.async_block_till_done()
# Reset the mock because it will have already gotten calls for the
# mqtt_statestream state change on initialization, etc.
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity included by domain
mock_state_change_event(hass, State("fake.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/entity/state
mqtt_mock.async_publish.assert_called_with("pub/fake/entity/state", "on", 1, True)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity included by glob
mock_state_change_event(hass, State("fake.included_entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
# Make sure 'on' was published to pub/fake/included_entity/state
mqtt_mock.async_publish.assert_called_with(
"pub/fake/included_entity/state", "on", 1, True
)
assert mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that shouldn't be included
mock_state_change_event(hass, State("fake.entity2", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
mqtt_mock.async_publish.reset_mock()
# Set a state of an entity that doesn't match any filters
mock_state_change_event(hass, State("fake2.entity", "on"))
await hass.async_block_till_done()
await hass.async_block_till_done()
assert not mqtt_mock.async_publish.called
| 36.882136 | 88 | 0.730314 | 2,895 | 20,027 | 4.793092 | 0.046287 | 0.049005 | 0.064644 | 0.099452 | 0.91352 | 0.894638 | 0.889521 | 0.886206 | 0.834679 | 0.822643 | 0 | 0.00237 | 0.178159 | 20,027 | 542 | 89 | 36.950185 | 0.840695 | 0.23668 | 0 | 0.714286 | 0 | 0 | 0.095137 | 0.033503 | 0 | 0 | 0 | 0 | 0.192691 | 1 | 0.003322 | false | 0 | 0.019934 | 0 | 0.026578 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
432620bb5346316ae5d80f30ac467fe04490a65c | 42,165 | py | Python | sdk/python/pulumi_yandex/function.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | 9 | 2021-04-20T15:39:41.000Z | 2022-02-20T09:14:39.000Z | sdk/python/pulumi_yandex/function.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | 56 | 2021-04-20T11:31:03.000Z | 2022-03-31T15:53:06.000Z | sdk/python/pulumi_yandex/function.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['FunctionArgs', 'Function']
@pulumi.input_type
class FunctionArgs:
def __init__(__self__, *,
entrypoint: pulumi.Input[str],
memory: pulumi.Input[int],
runtime: pulumi.Input[str],
user_hash: pulumi.Input[str],
content: Optional[pulumi.Input['FunctionContentArgs']] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
execution_timeout: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
package: Optional[pulumi.Input['FunctionPackageArgs']] = None,
service_account_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a Function resource.
:param pulumi.Input[str] entrypoint: Entrypoint for Yandex Cloud Function
:param pulumi.Input[int] memory: Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
:param pulumi.Input[str] runtime: Runtime for Yandex Cloud Function
:param pulumi.Input[str] user_hash: User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
:param pulumi.Input['FunctionContentArgs'] content: Version deployment content for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `content.0.zip_filename` - Filename to zip archive for the version.
:param pulumi.Input[str] description: Description of the Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] environment: A set of key/value environment variables for Yandex Cloud Function
:param pulumi.Input[str] execution_timeout: Execution timeout in seconds for Yandex Cloud Function
:param pulumi.Input[str] folder_id: Folder ID for the Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Yandex Cloud Function
:param pulumi.Input[str] name: Yandex Cloud Function name used to define trigger
:param pulumi.Input['FunctionPackageArgs'] package: Version deployment package for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
:param pulumi.Input[str] service_account_id: Service account ID for Yandex Cloud Function
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
"""
pulumi.set(__self__, "entrypoint", entrypoint)
pulumi.set(__self__, "memory", memory)
pulumi.set(__self__, "runtime", runtime)
pulumi.set(__self__, "user_hash", user_hash)
if content is not None:
pulumi.set(__self__, "content", content)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if execution_timeout is not None:
pulumi.set(__self__, "execution_timeout", execution_timeout)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if name is not None:
pulumi.set(__self__, "name", name)
if package is not None:
pulumi.set(__self__, "package", package)
if service_account_id is not None:
pulumi.set(__self__, "service_account_id", service_account_id)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter
def entrypoint(self) -> pulumi.Input[str]:
"""
Entrypoint for Yandex Cloud Function
"""
return pulumi.get(self, "entrypoint")
@entrypoint.setter
def entrypoint(self, value: pulumi.Input[str]):
pulumi.set(self, "entrypoint", value)
@property
@pulumi.getter
def memory(self) -> pulumi.Input[int]:
"""
Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
"""
return pulumi.get(self, "memory")
@memory.setter
def memory(self, value: pulumi.Input[int]):
pulumi.set(self, "memory", value)
@property
@pulumi.getter
def runtime(self) -> pulumi.Input[str]:
"""
Runtime for Yandex Cloud Function
"""
return pulumi.get(self, "runtime")
@runtime.setter
def runtime(self, value: pulumi.Input[str]):
pulumi.set(self, "runtime", value)
@property
@pulumi.getter(name="userHash")
def user_hash(self) -> pulumi.Input[str]:
"""
User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
"""
return pulumi.get(self, "user_hash")
@user_hash.setter
def user_hash(self, value: pulumi.Input[str]):
pulumi.set(self, "user_hash", value)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input['FunctionContentArgs']]:
"""
Version deployment content for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `content.0.zip_filename` - Filename to zip archive for the version.
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input['FunctionContentArgs']]):
pulumi.set(self, "content", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the Yandex Cloud Function
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value environment variables for Yandex Cloud Function
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter(name="executionTimeout")
def execution_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Execution timeout in seconds for Yandex Cloud Function
"""
return pulumi.get(self, "execution_timeout")
@execution_timeout.setter
def execution_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "execution_timeout", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
Folder ID for the Yandex Cloud Function
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the Yandex Cloud Function
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Yandex Cloud Function name used to define trigger
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def package(self) -> Optional[pulumi.Input['FunctionPackageArgs']]:
"""
Version deployment package for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
"""
return pulumi.get(self, "package")
@package.setter
def package(self, value: Optional[pulumi.Input['FunctionPackageArgs']]):
pulumi.set(self, "package", value)
@property
@pulumi.getter(name="serviceAccountId")
def service_account_id(self) -> Optional[pulumi.Input[str]]:
"""
Service account ID for Yandex Cloud Function
"""
return pulumi.get(self, "service_account_id")
@service_account_id.setter
def service_account_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "service_account_id", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _FunctionState:
def __init__(__self__, *,
content: Optional[pulumi.Input['FunctionContentArgs']] = None,
created_at: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
entrypoint: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
execution_timeout: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
image_size: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
loggroup_id: Optional[pulumi.Input[str]] = None,
memory: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
package: Optional[pulumi.Input['FunctionPackageArgs']] = None,
runtime: Optional[pulumi.Input[str]] = None,
service_account_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
user_hash: Optional[pulumi.Input[str]] = None,
version: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Function resources.
:param pulumi.Input['FunctionContentArgs'] content: Version deployment content for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `content.0.zip_filename` - Filename to zip archive for the version.
:param pulumi.Input[str] created_at: Creation timestamp of the Yandex Cloud Function.
:param pulumi.Input[str] description: Description of the Yandex Cloud Function
:param pulumi.Input[str] entrypoint: Entrypoint for Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] environment: A set of key/value environment variables for Yandex Cloud Function
:param pulumi.Input[str] execution_timeout: Execution timeout in seconds for Yandex Cloud Function
:param pulumi.Input[str] folder_id: Folder ID for the Yandex Cloud Function
:param pulumi.Input[int] image_size: Image size for Yandex Cloud Function.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Yandex Cloud Function
:param pulumi.Input[str] loggroup_id: Log group ID for the Yandex Cloud Function.
:param pulumi.Input[int] memory: Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
:param pulumi.Input[str] name: Yandex Cloud Function name used to define trigger
:param pulumi.Input['FunctionPackageArgs'] package: Version deployment package for Yandex Cloud Function code. Can be only one `package` or `content` section.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
:param pulumi.Input[str] runtime: Runtime for Yandex Cloud Function
:param pulumi.Input[str] service_account_id: Service account ID for Yandex Cloud Function
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
:param pulumi.Input[str] user_hash: User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
:param pulumi.Input[str] version: Version for Yandex Cloud Function.
"""
if content is not None:
pulumi.set(__self__, "content", content)
if created_at is not None:
pulumi.set(__self__, "created_at", created_at)
if description is not None:
pulumi.set(__self__, "description", description)
if entrypoint is not None:
pulumi.set(__self__, "entrypoint", entrypoint)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if execution_timeout is not None:
pulumi.set(__self__, "execution_timeout", execution_timeout)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if image_size is not None:
pulumi.set(__self__, "image_size", image_size)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if loggroup_id is not None:
pulumi.set(__self__, "loggroup_id", loggroup_id)
if memory is not None:
pulumi.set(__self__, "memory", memory)
if name is not None:
pulumi.set(__self__, "name", name)
if package is not None:
pulumi.set(__self__, "package", package)
if runtime is not None:
pulumi.set(__self__, "runtime", runtime)
if service_account_id is not None:
pulumi.set(__self__, "service_account_id", service_account_id)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if user_hash is not None:
pulumi.set(__self__, "user_hash", user_hash)
if version is not None:
pulumi.set(__self__, "version", version)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input['FunctionContentArgs']]:
"""
Version deployment content for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `content.0.zip_filename` - Filename of the zip archive for the version.
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input['FunctionContentArgs']]):
pulumi.set(self, "content", value)
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> Optional[pulumi.Input[str]]:
"""
Creation timestamp of the Yandex Cloud Function.
"""
return pulumi.get(self, "created_at")
@created_at.setter
def created_at(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "created_at", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the Yandex Cloud Function
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def entrypoint(self) -> Optional[pulumi.Input[str]]:
"""
Entrypoint for Yandex Cloud Function
"""
return pulumi.get(self, "entrypoint")
@entrypoint.setter
def entrypoint(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "entrypoint", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value environment variables for Yandex Cloud Function
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter(name="executionTimeout")
def execution_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Execution timeout in seconds for Yandex Cloud Function
"""
return pulumi.get(self, "execution_timeout")
@execution_timeout.setter
def execution_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "execution_timeout", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
Folder ID for the Yandex Cloud Function
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter(name="imageSize")
def image_size(self) -> Optional[pulumi.Input[int]]:
"""
Image size for Yandex Cloud Function.
"""
return pulumi.get(self, "image_size")
@image_size.setter
def image_size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "image_size", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the Yandex Cloud Function
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="loggroupId")
def loggroup_id(self) -> Optional[pulumi.Input[str]]:
"""
Log group ID for the Yandex Cloud Function.
"""
return pulumi.get(self, "loggroup_id")
@loggroup_id.setter
def loggroup_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "loggroup_id", value)
@property
@pulumi.getter
def memory(self) -> Optional[pulumi.Input[int]]:
"""
Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
"""
return pulumi.get(self, "memory")
@memory.setter
def memory(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "memory", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Yandex Cloud Function name used to define trigger
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def package(self) -> Optional[pulumi.Input['FunctionPackageArgs']]:
"""
Version deployment package for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
"""
return pulumi.get(self, "package")
@package.setter
def package(self, value: Optional[pulumi.Input['FunctionPackageArgs']]):
pulumi.set(self, "package", value)
@property
@pulumi.getter
def runtime(self) -> Optional[pulumi.Input[str]]:
"""
Runtime for Yandex Cloud Function
"""
return pulumi.get(self, "runtime")
@runtime.setter
def runtime(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "runtime", value)
@property
@pulumi.getter(name="serviceAccountId")
def service_account_id(self) -> Optional[pulumi.Input[str]]:
"""
Service account ID for Yandex Cloud Function
"""
return pulumi.get(self, "service_account_id")
@service_account_id.setter
def service_account_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "service_account_id", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="userHash")
def user_hash(self) -> Optional[pulumi.Input[str]]:
"""
User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
"""
return pulumi.get(self, "user_hash")
@user_hash.setter
def user_hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user_hash", value)
@property
@pulumi.getter
def version(self) -> Optional[pulumi.Input[str]]:
"""
Version for Yandex Cloud Function.
"""
return pulumi.get(self, "version")
@version.setter
def version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version", value)
class Function(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[pulumi.InputType['FunctionContentArgs']]] = None,
description: Optional[pulumi.Input[str]] = None,
entrypoint: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
execution_timeout: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
memory: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
package: Optional[pulumi.Input[pulumi.InputType['FunctionPackageArgs']]] = None,
runtime: Optional[pulumi.Input[str]] = None,
service_account_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
user_hash: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Allows management of [Yandex Cloud Function](https://cloud.yandex.com/docs/functions/)
## Example Usage
```python
import pulumi
import pulumi_yandex as yandex
test_function = yandex.Function("test-function",
content=yandex.FunctionContentArgs(
zip_filename="function.zip",
),
description="any description",
entrypoint="main",
execution_timeout="10",
memory=128,
runtime="python37",
service_account_id="are1service2account3id",
tags=["my_tag"],
user_hash="any_user_defined_string")
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['FunctionContentArgs']] content: Version deployment content for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `content.0.zip_filename` - Filename of the zip archive for the version.
:param pulumi.Input[str] description: Description of the Yandex Cloud Function
:param pulumi.Input[str] entrypoint: Entrypoint for Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] environment: A set of key/value environment variables for Yandex Cloud Function
:param pulumi.Input[str] execution_timeout: Execution timeout in seconds for Yandex Cloud Function
:param pulumi.Input[str] folder_id: Folder ID for the Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Yandex Cloud Function
:param pulumi.Input[int] memory: Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
:param pulumi.Input[str] name: Yandex Cloud Function name used to define trigger
:param pulumi.Input[pulumi.InputType['FunctionPackageArgs']] package: Version deployment package for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
:param pulumi.Input[str] runtime: Runtime for Yandex Cloud Function
:param pulumi.Input[str] service_account_id: Service account ID for Yandex Cloud Function
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
:param pulumi.Input[str] user_hash: User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: FunctionArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Allows management of [Yandex Cloud Function](https://cloud.yandex.com/docs/functions/)
## Example Usage
```python
import pulumi
import pulumi_yandex as yandex
test_function = yandex.Function("test-function",
content=yandex.FunctionContentArgs(
zip_filename="function.zip",
),
description="any description",
entrypoint="main",
execution_timeout="10",
memory=128,
runtime="python37",
service_account_id="are1service2account3id",
tags=["my_tag"],
user_hash="any_user_defined_string")
```
:param str resource_name: The name of the resource.
:param FunctionArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(FunctionArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[pulumi.InputType['FunctionContentArgs']]] = None,
description: Optional[pulumi.Input[str]] = None,
entrypoint: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
execution_timeout: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
memory: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
package: Optional[pulumi.Input[pulumi.InputType['FunctionPackageArgs']]] = None,
runtime: Optional[pulumi.Input[str]] = None,
service_account_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
user_hash: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = FunctionArgs.__new__(FunctionArgs)
__props__.__dict__["content"] = content
__props__.__dict__["description"] = description
if entrypoint is None and not opts.urn:
raise TypeError("Missing required property 'entrypoint'")
__props__.__dict__["entrypoint"] = entrypoint
__props__.__dict__["environment"] = environment
__props__.__dict__["execution_timeout"] = execution_timeout
__props__.__dict__["folder_id"] = folder_id
__props__.__dict__["labels"] = labels
if memory is None and not opts.urn:
raise TypeError("Missing required property 'memory'")
__props__.__dict__["memory"] = memory
__props__.__dict__["name"] = name
__props__.__dict__["package"] = package
if runtime is None and not opts.urn:
raise TypeError("Missing required property 'runtime'")
__props__.__dict__["runtime"] = runtime
__props__.__dict__["service_account_id"] = service_account_id
__props__.__dict__["tags"] = tags
if user_hash is None and not opts.urn:
raise TypeError("Missing required property 'user_hash'")
__props__.__dict__["user_hash"] = user_hash
__props__.__dict__["created_at"] = None
__props__.__dict__["image_size"] = None
__props__.__dict__["loggroup_id"] = None
__props__.__dict__["version"] = None
super(Function, __self__).__init__(
'yandex:index/function:Function',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
content: Optional[pulumi.Input[pulumi.InputType['FunctionContentArgs']]] = None,
created_at: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
entrypoint: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
execution_timeout: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
image_size: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
loggroup_id: Optional[pulumi.Input[str]] = None,
memory: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
package: Optional[pulumi.Input[pulumi.InputType['FunctionPackageArgs']]] = None,
runtime: Optional[pulumi.Input[str]] = None,
service_account_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
user_hash: Optional[pulumi.Input[str]] = None,
version: Optional[pulumi.Input[str]] = None) -> 'Function':
"""
Get an existing Function resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['FunctionContentArgs']] content: Version deployment content for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `content.0.zip_filename` - Filename of the zip archive for the version.
:param pulumi.Input[str] created_at: Creation timestamp of the Yandex Cloud Function.
:param pulumi.Input[str] description: Description of the Yandex Cloud Function
:param pulumi.Input[str] entrypoint: Entrypoint for Yandex Cloud Function
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] environment: A set of key/value environment variables for Yandex Cloud Function
:param pulumi.Input[str] execution_timeout: Execution timeout in seconds for Yandex Cloud Function
:param pulumi.Input[str] folder_id: Folder ID for the Yandex Cloud Function
:param pulumi.Input[int] image_size: Image size for Yandex Cloud Function.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Yandex Cloud Function
:param pulumi.Input[str] loggroup_id: Log group ID for the Yandex Cloud Function.
:param pulumi.Input[int] memory: Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
:param pulumi.Input[str] name: Yandex Cloud Function name used to define trigger
:param pulumi.Input[pulumi.InputType['FunctionPackageArgs']] package: Version deployment package for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
:param pulumi.Input[str] runtime: Runtime for Yandex Cloud Function
:param pulumi.Input[str] service_account_id: Service account ID for Yandex Cloud Function
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
:param pulumi.Input[str] user_hash: User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
:param pulumi.Input[str] version: Version for Yandex Cloud Function.
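## Example Usage
A minimal lookup sketch (the resource name and function ID are illustrative):
```python
import pulumi_yandex as yandex
existing = yandex.Function.get("existing-function", id="<your-function-id>")
```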
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _FunctionState.__new__(_FunctionState)
__props__.__dict__["content"] = content
__props__.__dict__["created_at"] = created_at
__props__.__dict__["description"] = description
__props__.__dict__["entrypoint"] = entrypoint
__props__.__dict__["environment"] = environment
__props__.__dict__["execution_timeout"] = execution_timeout
__props__.__dict__["folder_id"] = folder_id
__props__.__dict__["image_size"] = image_size
__props__.__dict__["labels"] = labels
__props__.__dict__["loggroup_id"] = loggroup_id
__props__.__dict__["memory"] = memory
__props__.__dict__["name"] = name
__props__.__dict__["package"] = package
__props__.__dict__["runtime"] = runtime
__props__.__dict__["service_account_id"] = service_account_id
__props__.__dict__["tags"] = tags
__props__.__dict__["user_hash"] = user_hash
__props__.__dict__["version"] = version
return Function(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def content(self) -> pulumi.Output[Optional['outputs.FunctionContent']]:
"""
Version deployment content for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `content.0.zip_filename` - Filename of the zip archive for the version.
"""
return pulumi.get(self, "content")
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> pulumi.Output[str]:
"""
Creation timestamp of the Yandex Cloud Function.
"""
return pulumi.get(self, "created_at")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Description of the Yandex Cloud Function
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def entrypoint(self) -> pulumi.Output[str]:
"""
Entrypoint for Yandex Cloud Function
"""
return pulumi.get(self, "entrypoint")
@property
@pulumi.getter
def environment(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A set of key/value environment variables for Yandex Cloud Function
"""
return pulumi.get(self, "environment")
@property
@pulumi.getter(name="executionTimeout")
def execution_timeout(self) -> pulumi.Output[Optional[str]]:
"""
Execution timeout in seconds for Yandex Cloud Function
"""
return pulumi.get(self, "execution_timeout")
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> pulumi.Output[str]:
"""
Folder ID for the Yandex Cloud Function
"""
return pulumi.get(self, "folder_id")
@property
@pulumi.getter(name="imageSize")
def image_size(self) -> pulumi.Output[int]:
"""
Image size for Yandex Cloud Function.
"""
return pulumi.get(self, "image_size")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A set of key/value label pairs to assign to the Yandex Cloud Function
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="loggroupId")
def loggroup_id(self) -> pulumi.Output[str]:
"""
Log group ID for the Yandex Cloud Function.
"""
return pulumi.get(self, "loggroup_id")
@property
@pulumi.getter
def memory(self) -> pulumi.Output[int]:
"""
Memory in megabytes (**aligned to 128MB**) for Yandex Cloud Function
"""
return pulumi.get(self, "memory")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Yandex Cloud Function name used to define trigger
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def package(self) -> pulumi.Output[Optional['outputs.FunctionPackage']]:
"""
Version deployment package for Yandex Cloud Function code. Only one of the `package` or `content` sections can be specified.
* `package.0.sha_256` - SHA256 hash of the version deployment package.
* `package.0.bucket_name` - Name of the bucket that stores the code for the version.
* `package.0.object_name` - Name of the object in the bucket that stores the code for the version.
"""
return pulumi.get(self, "package")
@property
@pulumi.getter
def runtime(self) -> pulumi.Output[str]:
"""
Runtime for Yandex Cloud Function
"""
return pulumi.get(self, "runtime")
@property
@pulumi.getter(name="serviceAccountId")
def service_account_id(self) -> pulumi.Output[Optional[str]]:
"""
Service account ID for Yandex Cloud Function
"""
return pulumi.get(self, "service_account_id")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Sequence[str]]:
"""
Tags for Yandex Cloud Function. Tag "$latest" isn't returned.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="userHash")
def user_hash(self) -> pulumi.Output[str]:
"""
User-defined string for the current function version. The user must change this string whenever the function changes; the function will be updated when the hash changes.
"""
return pulumi.get(self, "user_hash")
@property
@pulumi.getter
def version(self) -> pulumi.Output[str]:
"""
Version for Yandex Cloud Function.
"""
return pulumi.get(self, "version")
| 44.619048 | 202 | 0.641717 | 4,945 | 42,165 | 5.306168 | 0.044692 | 0.104387 | 0.08697 | 0.062883 | 0.919052 | 0.892488 | 0.870079 | 0.857312 | 0.849194 | 0.829681 | 0 | 0.003547 | 0.251061 | 42,165 | 944 | 203 | 44.666314 | 0.827354 | 0.344575 | 0 | 0.765363 | 1 | 0 | 0.091312 | 0.003006 | 0 | 0 | 0 | 0 | 0 | 1 | 0.165736 | false | 0.001862 | 0.013035 | 0 | 0.27933 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4a33d06854deceab7496e260f24440c5324240d8 | 2,416 | py | Python | database.py | ourway/HeyBooster | ff46769600c0b43f69cc59fea160f8ae1d91e7f7 | ["MIT"] | null | null | null | database.py | ourway/HeyBooster | ff46769600c0b43f69cc59fea160f8ae1d91e7f7 | ["MIT"] | 3 | 2020-07-27T01:02:29.000Z | 2021-06-02T02:39:09.000Z | database.py | ourway/HeyBooster | ff46769600c0b43f69cc59fea160f8ae1d91e7f7 | ["MIT"] | null | null | null | import pymongo
import os
class db(object):
user = os.environ.get('DB_USER')
name = os.environ.get('DB_NAME')
pw = os.environ.get('DB_PASSWORD')
URI = "mongodb://%s:%s@heybooster-shard-00-00-yue91.mongodb.net:27017,heybooster-shard-00-01-yue91.mongodb.net:27017,heybooster-shard-00-02-yue91.mongodb.net:27017/test?ssl=true&replicaSet=heybooster-shard-0&authSource=admin&retryWrites=true&w=majority" % (
user, pw)
@staticmethod
def init():
client = pymongo.MongoClient(db.URI)
db.DATABASE = client[db.name]
@staticmethod
def insert(collection, data):
db.DATABASE[collection].insert(data)
@staticmethod
def insert_one(collection, data):
return db.DATABASE[collection].insert_one(data)
@staticmethod
def find_one(collection, query):
return db.DATABASE[collection].find_one(query)
@staticmethod
def find(collection, query):
return db.DATABASE[collection].find(query)
@staticmethod
def find_and_modify(collection, query, **kwargs):
print(kwargs)
db.DATABASE[collection].find_and_modify(query=query,
update={"$set": kwargs}, upsert=False,
full_response=True)
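# db2 mirrors db but connects to a second Atlas cluster (Cluster0), reusing the
# same DB_USER/DB_NAME/DB_PASSWORD environment variables.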
class db2(object):
user = os.environ.get('DB_USER')
name = os.environ.get('DB_NAME')
pw = os.environ.get('DB_PASSWORD')
URI = "mongodb://%s:%s@cluster0-shard-00-00-kk3ol.mongodb.net:27017,cluster0-shard-00-01-kk3ol.mongodb.net:27017,cluster0-shard-00-02-kk3ol.mongodb.net:27017/test?ssl=true&replicaSet=Cluster0-shard-0&authSource=admin&retryWrites=true&w=majority" % (
user, pw)
@staticmethod
def init():
client = pymongo.MongoClient(db2.URI)
db2.DATABASE = client[db2.name]
@staticmethod
def insert(collection, data):
db2.DATABASE[collection].insert(data)
@staticmethod
def insert_one(collection, data):
return db2.DATABASE[collection].insert_one(data)
@staticmethod
def find_one(collection, query):
return db2.DATABASE[collection].find_one(query)
@staticmethod
def find(collection, query):
return db2.DATABASE[collection].find(query)
@staticmethod
def find_and_modify(collection, query, **kwargs):
print(kwargs)
db2.DATABASE[collection].find_and_modify(query=query,
update={"$set": kwargs}, upsert=False,
full_response=True)
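# Minimal usage sketch (assumes DB_USER, DB_NAME and DB_PASSWORD are set in the
# environment; the 'users' collection and document below are illustrative):
#   db.init()
#   db.insert_one('users', {'email': 'user@example.com'})
#   doc = db.find_one('users', {'email': 'user@example.com'})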
| 35.529412 | 261 | 0.635762 | 292 | 2,416 | 5.181507 | 0.202055 | 0.118969 | 0.059484 | 0.046266 | 0.893589 | 0.893589 | 0.893589 | 0.707204 | 0.707204 | 0.634501 | 0 | 0.041734 | 0.236341 | 2,416 | 67 | 262 | 36.059701 | 0.77832 | 0 | 0 | 0.588235 | 0 | 0.039216 | 0.220613 | 0.199503 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0.039216 | 0.039216 | 0.117647 | 0.568627 | 0.039216 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 8 |
4a60ca76b219f76fb5037906a98960cc215d8392 | 2,278 | py | Python | examples/hello_world.py | FredericoNesti/gym-idsgame | 4170cb5cb3ec787adf5911364e0c6395412b9de9 | ["MIT"] | 15 | 2020-10-07T12:28:37.000Z | 2022-03-27T08:49:56.000Z | examples/hello_world.py | FredericoNesti/gym-idsgame | 4170cb5cb3ec787adf5911364e0c6395412b9de9 | ["MIT"] | 2 | 2020-10-07T01:44:05.000Z | 2022-03-10T12:07:43.000Z | examples/hello_world.py | FredericoNesti/gym-idsgame | 4170cb5cb3ec787adf5911364e0c6395412b9de9 | ["MIT"] | 5 | 2021-02-11T15:47:26.000Z | 2022-03-30T17:42:25.000Z | import gym
from gym_idsgame.envs import IdsGameEnv
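# Importing gym_idsgame (even just for IdsGameEnv) registers the "idsgame-*"
# environments with gym, so the gym.make() calls below can resolve them.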
def attack_against_baseline_defense_env():
versions = range(0,20)
version = versions[0]
env_name = "idsgame-minimal_defense-v" + str(version)
env = gym.make(env_name)
env.reset()  # standard gym API: reset the environment before the first step()
done = False
while not done:
attack_action = env.attacker_action_space.sample()
defense_action = None
a = (attack_action, defense_action)
obs, reward, done, info = env.step(a)
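# Note: actions are (attack_action, defense_action) tuples; passing None for one
# side appears to delegate that side to the environment's built-in agent.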
def attack_against_random_defense_env():
versions = range(0,20)
version = versions[0]
env_name = "idsgame-random_defense-v" + str(version)
env = gym.make(env_name)
env.reset()  # standard gym API: reset the environment before the first step()
done = False
while not done:
attack_action = env.attacker_action_space.sample()
defense_action = None
a = (attack_action, defense_action)
obs, reward, done, info = env.step(a)
def defense_against_baseline_attack_env():
versions = range(0,20)
version = versions[0]
env_name = "idsgame-maximal_attack-v" + str(version)
env = gym.make(env_name)
env.reset()  # standard gym API: reset the environment before the first step()
done = False
while not done:
attack_action = None
defense_action = env.defender_action_space.sample()
a = (attack_action, defense_action)
obs, reward, done, info = env.step(a)
def defense_against_random_attack_env():
versions = range(0,20)
version = versions[0]
env_name = "idsgame-random_attack-v" + str(version)
env = gym.make(env_name)
env.reset()  # standard gym API: reset the environment before the first step()
done = False
while not done:
attack_action = None
defense_action = env.defender_action_space.sample()
a = (attack_action, defense_action)
obs, reward, done, info = env.step(a)
def two_agents_env():
versions = range(0,20)
version = versions[0]
env_name = "idsgame-v" + str(version)
env = gym.make(env_name)
env.reset()  # standard gym API: reset the environment before the first step()
done = False
while not done:
attack_action = env.attacker_action_space.sample()
defense_action = env.defender_action_space.sample()
a = (attack_action, defense_action)
obs, reward, done, info = env.step(a)
def main():
#attack_against_baseline_defense_env()
attack_against_random_defense_env()
#defense_against_baseline_attack_env()
#defense_against_random_attack_env()
#two_agents_env()
if __name__ == '__main__':
main() | 30.783784 | 59 | 0.669447 | 302 | 2,278 | 4.748344 | 0.142384 | 0.048815 | 0.07113 | 0.059275 | 0.930962 | 0.809623 | 0.809623 | 0.809623 | 0.809623 | 0.809623 | 0 | 0.011377 | 0.22827 | 2,278 | 74 | 60 | 30.783784 | 0.804323 | 0.054873 | 0 | 0.737705 | 0 | 0 | 0.052558 | 0.044651 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098361 | false | 0 | 0.032787 | 0 | 0.131148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4a6326eb9e120fd3ba0737342c42574db0a42a0b | 335,109 | py | Python | py/ztools/squirrel.py | HerrTrigger/NSC_BUILDER | e9083e83383281bdd9e167d3141163dcc56b6710 | ["MIT"] | 828 | 2018-11-05T02:43:40.000Z | 2022-03-27T08:49:56.000Z | py/ztools/squirrel.py | HerrTrigger/NSC_BUILDER | e9083e83383281bdd9e167d3141163dcc56b6710 | ["MIT"] | 141 | 2018-11-05T19:59:23.000Z | 2022-01-10T01:17:32.000Z | py/ztools/squirrel.py | HerrTrigger/NSC_BUILDER | e9083e83383281bdd9e167d3141163dcc56b6710 | ["MIT"] | 119 | 2018-11-05T06:57:37.000Z | 2022-03-25T18:10:33.000Z | # -*- coding: utf-8 -*-
'''
_____ _ __
/ ___/____ ___ __(_)____________ / /
\__ \/ __ `/ / / / / ___/ ___/ _ \/ /
___/ / /_/ / /_/ / / / / / / __/ /
/____/\__, /\__,_/_/_/ /_/ \___/_/
/_/
By julesontheroad:
https://github.com/julesontheroad/
Squirrel is a fork of NUT made to support NSC Builder
https://github.com/julesontheroad/NSC_BUILDER
The original NUT is made and actively supported by blawar
https://github.com/blawar/nut
This fork doesn't follow NUT's main line and strips many features from nut
(like CDNSP support) while adds several functions based in new code.
This program specialices in content building and file management for several
Nintendo Switch formats.
Squirrel original's purpose is to support NSC_Builder though it serves as a
standalone program with many functions, some of them not being used currently in NSC_Builder.
'''
import argparse
import sys
import os
import re
import io
import pathlib
import urllib3
import json
from zipfile import ZipFile
os.chdir(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, 'lib')
try:
sys.path.insert(0, 'private')
except:pass
import sq_settings
sq_settings.set_prod_environment()
import Keys
import Config
import Status
# SET ENVIRONMENT
squirrel_dir=os.path.abspath(os.curdir)
NSCB_dir=os.path.abspath('../'+(os.curdir))
if os.path.exists(os.path.join(squirrel_dir,'ztools')):
NSCB_dir=squirrel_dir
zconfig_dir=os.path.join(NSCB_dir, 'zconfig')
ztools_dir=os.path.join(NSCB_dir,'ztools')
squirrel_dir=ztools_dir
elif os.path.exists(os.path.join(NSCB_dir,'ztools')):
squirrel_dir=squirrel_dir
ztools_dir=os.path.join(NSCB_dir, 'ztools')
zconfig_dir=os.path.join(NSCB_dir, 'zconfig')
else:
ztools_dir=os.path.join(NSCB_dir, 'ztools')
zconfig_dir=os.path.join(NSCB_dir, 'zconfig')
if os.path.exists(zconfig_dir):
DATABASE_folder=os.path.join(zconfig_dir, 'DB')
else:
DATABASE_folder=os.path.join(squirrel_dir, 'DB')
if not os.path.exists(DATABASE_folder):
os.makedirs(DATABASE_folder)
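# At this point squirrel_dir/ztools_dir/zconfig_dir point inside the NSC_BUILDER
# tree (or next to squirrel itself as a fallback) and DATABASE_folder is
# guaranteed to exist.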
if __name__ == '__main__':
try:
urllib3.disable_warnings()
parser = argparse.ArgumentParser()
parser.add_argument('file',nargs='*')
# INFORMATION
parser.add_argument('-i', '--info', help='show info about title or file')
parser.add_argument('--filelist', nargs='+', help='Prints file list from NSP/XCI secure partition')
parser.add_argument('--ADVfilelist', nargs='+', help='Prints ADVANCED file list from NSP/XCI secure partition')
parser.add_argument('--ADVcontentlist', nargs='+', help='Prints ADVANCED content list from NSP/XCI arranged by base titleid')
parser.add_argument('--Read_cnmt', nargs='+', help='Read cnmt file inside NSP/XCI')
parser.add_argument('--Read_nacp', nargs='+', help='Read nacp file inside NSP/XCI')
parser.add_argument('--Read_icon', nargs='+', help='Read icon files inside NSP/XCI')
parser.add_argument('--Read_npdm', nargs='+', help='Read npdm file inside NSP/XCI')
parser.add_argument('--Read_hfs0', nargs='+', help='Read hfs0')
parser.add_argument('--fw_req', nargs='+', help='Get information about fw requirements for NSP/XCI')
parser.add_argument('--Read_xci_head', nargs='+', help='Get information about xci header and cert')
parser.add_argument('-nscdb', '--addtodb', nargs='+', help='Adds content to database')
parser.add_argument('-nscdb_new', '--addtodb_new', nargs='+', help='Adds content to database')
parser.add_argument('-v', '--verify', nargs='+', help='Verify nsp or xci file')
parser.add_argument('-vk', '--verify_key', nargs='+', help='Verify a key against a preorder nsp/nsx')
# CNMT Flag funtions
parser.add_argument('--set_cnmt_titleid', nargs='+', help='Changes cnmt.nca titleid')
parser.add_argument('--set_cnmt_version', nargs='+', help='Changes cnmt.nca version number')
parser.add_argument('--set_cnmt_RSV', nargs='+', help='Changes cnmt.nca RSV')
parser.add_argument('--update_hash', nargs='+', help='Updates cnmt.nca hashes')
parser.add_argument('--xml_gen', nargs='+', help='Generates cnmt.xml')
# REPACK
parser.add_argument('-c', '--create', help='create / pack a NSP')
parser.add_argument('-cpr', '--compress', nargs='+', help='Compress a nsp or xci')
parser.add_argument('-dcpr', '--decompress', help='Decompress a nsz, xcz or ncz')
parser.add_argument('--create_hfs0', help='create / pack a hfs0')
parser.add_argument('--create_rhfs0', help='create / pack a root hfs0')
parser.add_argument('--create_xci', help='create / pack a xci')
parser.add_argument('-xci_st', '--xci_super_trim', nargs='+', help='Supertrim xci')
parser.add_argument('-xci_tr', '--xci_trim', nargs='+', help='Trims xci')
parser.add_argument('-xci_untr', '--xci_untrim', nargs='+', help='Untrims xci')
parser.add_argument('-dc', '--direct_creation', nargs='+', help='Create directly a nsp or xci')
parser.add_argument('-dmul', '--direct_multi', nargs='+', help='Create directly a multi nsp or xci')
parser.add_argument('-ed', '--erase_deltas', nargs='+', help='Take of deltas from updates')
parser.add_argument('-rbnsp', '--rebuild_nsp', nargs='+', help='Rebuild nsp by cnmt order')
parser.add_argument('-rst', '--restore', nargs='+', help='Restore a xci or nsp file')
# nca/nsp identification
parser.add_argument('--ncatitleid', nargs='+', help='Returns titleid from an nca input')
parser.add_argument('--ncatype', nargs='+', help='Returns type of an nca file')
parser.add_argument('--nsptitleid', nargs='+', help='Returns titleid for a nsp file')
parser.add_argument('--nsptype', nargs='+', help='Returns type for a nsp file')
parser.add_argument('--ReadversionID', nargs='+', help='Returns version number for nsp or xci')
parser.add_argument('--nsp_htrights', nargs='+', help='Returns true if nsp has titlerights')
parser.add_argument('--nsp_hticket', nargs='+', help='Returns true if nsp has ticket')
# Remove titlerights functions
parser.add_argument('--remove-title-rights', nargs='+', help='Removes title rights encryption from all NCAs in the NSP.')
parser.add_argument('--RTRNCA_h_nsp', nargs='+', help='Removes title rights encryption from a single nca reading from original nsp')
parser.add_argument('--RTRNCA_h_tick', nargs='+', help='Removes title rights encryption from a single nca reading from extracted ticket')
parser.add_argument('--set_masterkey', nargs='+', help='Changes the master key encryption for NSP.')
# Gamecard flag functions
parser.add_argument('--seteshop', nargs='+', help='Set all nca in an nsp as eshop')
parser.add_argument('--setcgame', nargs='+', help='Set all nca in an nsp card')
parser.add_argument('--seteshop_nca', nargs='+', help='Set a single nca as eshop')
parser.add_argument('--setcgame_nca', nargs='+', help='Set a single nca as card')
parser.add_argument('--cardstate', nargs='+', help='Returns value for isgamecard flag from an nca')
parser.add_argument('--remlinkacc', nargs='+', help='Remove linked account')
# NSP Copy functions
parser.add_argument('-x', '--extract', nargs='+', help='Extracts all files from nsp or xci')
parser.add_argument('-raw_x', '--raw_extraction', nargs='+', help='Extracts files without checking readability, useful when there are bad files')
parser.add_argument('-nfx', '--nca_file_extraction', nargs='+', help='Extracts files within nca files from an nsp/xci/nca file')
parser.add_argument('-plx', '--extract_plain_nca', nargs='+', help='Extracts nca files as plaintext or generate a plaintext file from an nca file')
parser.add_argument('--NSP_copy_ticket', nargs='+', help='Extracts ticket from target nsp')
parser.add_argument('--NSP_copy_nca', nargs='+', help='Extracts all nca files from target nsp')
parser.add_argument('--NSP_copy_other', nargs='+', help='Extracts all kinds of files different from nca or ticket from target nsp')
parser.add_argument('--NSP_copy_xml', nargs='+', help='Extracts xml files from target nsp')
parser.add_argument('--NSP_copy_cert', nargs='+', help='Extracts cert files from target nsp')
parser.add_argument('--NSP_copy_jpg', nargs='+', help='Extracts jpg files from target nsp')
parser.add_argument('--NSP_copy_cnmt', nargs='+', help='Extracts cnmt files from target nsp')
parser.add_argument('--copy_pfs0_meta', nargs='+', help='Extracts meta pfs0 from target nsp')
parser.add_argument('--copy_nacp', nargs='+', help='Extracts nacp files from target nsp')
# XCI Copy functions
parser.add_argument('--XCI_copy_hfs0', nargs='+', help='Extracts hfs0 partition files from target xci')
parser.add_argument('--XCI_c_hfs0_secure', nargs='+', help='Extracts secure hfs0 partition files from target xci')
parser.add_argument('--XCI_c_hfs0_normal', nargs='+', help='Extracts normal hfs0 partition files from target xci')
parser.add_argument('--XCI_c_hfs0_update', nargs='+', help='Extracts update hfs0 partition files from target xci')
parser.add_argument('--XCI_copy_nca_secure', nargs='+', help='Extracts nca from secure partition')
parser.add_argument('--XCI_copy_nca_normal', nargs='+', help='Extracts nca from normal partition')
parser.add_argument('--XCI_copy_nca_update', nargs='+', help='Extracts nca from update partition')
parser.add_argument('--XCI_copy_rhfs0', nargs='+', help='Extracts root.hfs0')
# Dedicated copy functions. NCA Types.
parser.add_argument('--NSP_copy_nca_meta', nargs='+', help='Extracts nca files with type meta from target nsp')
parser.add_argument('--NSP_copy_nca_control', nargs='+', help='Extracts nca files with type control from target nsp')
parser.add_argument('--NSP_copy_nca_manual', nargs='+', help='Extracts nca files with type manual from target nsp')
parser.add_argument('--NSP_copy_nca_program', nargs='+', help='Extracts nca files with type program from target nsp')
parser.add_argument('--NSP_copy_nca_data', nargs='+', help='Extracts nca files with type data from target nsp')
parser.add_argument('--NSP_copy_nca_pdata', nargs='+', help='Extracts nca files with type public data from target nsp')
# Dedicated copy functions. TITLERIGHTS.
parser.add_argument('--NSP_copy_tr_nca', nargs='+', help='Extracts nca files with titlerights from target nsp')
parser.add_argument('--NSP_copy_ntr_nca', nargs='+', help='Extracts nca files without titlerights from target nsp')
parser.add_argument('--NSP_c_KeyBlock', nargs='+', help='Extracts keyblock from nca files with titlerights from target nsp')
parser.add_argument('--C_clean', nargs='+', help='Extracts nca files and removes their titlerights from target NSP or XCI')
parser.add_argument('--C_clean_ND', nargs='+', help='Extracts nca files and removes their titlerights from target NSP or XCI, without deltas')
# Dedicated copy functions. SPLIT OR UPDATE.
parser.add_argument('--splitter', nargs='+', help='Split content by titleid according to cnmt files')
parser.add_argument('-dspl', '--direct_splitter', nargs='+', help='Split content by titleid according to cnmt files')
parser.add_argument('--updbase', nargs='+', help='Prepare base file to update it')
# Combinations
parser.add_argument('--gen_placeholder', help='Creates nsp or xci placeholder')
parser.add_argument('--placeholder_combo', nargs='+', help='Extracts nca files for placeholder nsp')
parser.add_argument('--license_combo', nargs='+', help='Extracts nca files for license nsp')
parser.add_argument('--mlicense_combo', nargs='+', help='Extracts nca files for tinfoil license nsp')
parser.add_argument('--zip_combo', nargs='+', help='Extracts and generate files to make a restore zip')
# Auxiliary
parser.add_argument('-o', '--ofolder', nargs='+', help='Set output folder for copy instructions')
parser.add_argument('-ifo', '--ifolder', help='Input folder')
parser.add_argument('-ifo_s', '--ifolder_secure', help='Input secure folder')
parser.add_argument('-ifo_n', '--ifolder_normal', help='Input normal folder')
parser.add_argument('-ifo_u', '--ifolder_update', help='Input update folder')
parser.add_argument('-tfile', '--text_file', help='Output text file')
parser.add_argument('-tfile_aux', '--text_file_aux', help='Auxiliary text file')
parser.add_argument('-dbfile', '--db_file', help='Output text file for database')
parser.add_argument('-b', '--buffer', nargs='+', help='Set buffer for copy instructions')
parser.add_argument('-ext', '--external', nargs='+', help='Set original nsp or ticket for remove nca titlerights functions')
parser.add_argument('-pv', '--patchversion', nargs='+', help='Number for patch: Required system version, or program/patch/addcontent version')
parser.add_argument('-kp', '--keypatch', nargs='+', help='patch masterkey to input number')
parser.add_argument('-rsvc', '--RSVcap', nargs='+', help='RSV cap when patching. Default is FW4.0')
parser.add_argument('-pe', '--pathend', nargs='+', help='Output to subfolder')
parser.add_argument('-cskip', '--cskip', nargs='+', help='Skip dlc or update')
parser.add_argument('-fat', '--fat', nargs='+', help='Split xci for fat32 or exfat')
parser.add_argument('-fx', '--fexport', nargs='+', help='Export split nsp to files or folder')
parser.add_argument('-t', '--type', nargs='+', help='Type of file')
parser.add_argument('-tid', '--titleid', nargs='+', help='Filter with titleid')
parser.add_argument('-bid', '--baseid', nargs='+', help='Filter with base titleid')
parser.add_argument('-ND', '--nodelta', nargs='+', help='Exclude deltas')
parser.add_argument('-dbformat', '--dbformat', nargs='+', help='Database format extended, nutdb or keyless-extended')
parser.add_argument('-rn', '--rename', nargs='+', help='Filter with base titleid')
parser.add_argument('-uin', '--userinput', help='Reads a user input')
parser.add_argument('-incxml', '--includexml', nargs='+', help='Include xml (default: true)')
parser.add_argument('-trans', '--translate', nargs='+', help='Google translation support for nutdb descriptions')
parser.add_argument('-nodcr', '--nodecompress', help="Don't decompress nsz/xcz in several modes")
# LISTMANAGER
parser.add_argument('-cl', '--change_line', help='Change line in text file')
parser.add_argument('-rl', '--read_line', help='Read line in text file')
parser.add_argument('-stripl', '--strip_lines', nargs='+', help='Strips lines from a text file')
parser.add_argument('-showcline', '--show_current_line', nargs='+', help='Shows current line')
parser.add_argument('-countlines', '--count_n_lines', nargs='+', help='Count the number of lines')
parser.add_argument('-dff', '--delete_item', nargs='+', help='Deletes an OS item listed in a text file, either a file or a folder')
parser.add_argument('-ln', '--line_number', help='line number')
parser.add_argument('-nl', '--new_line', help='new line')
parser.add_argument('-ff', '--findfile', help='find different types of files')
parser.add_argument('-fil', '--filter', nargs='+', help='filter using strings')
parser.add_argument('-splid', '--split_list_by_id', nargs='+', help='split a list by file id')
parser.add_argument('-mv_oupd', '--mv_old_updates', nargs='+', help='Moves old updates to another folder')
parser.add_argument('-mv_odlc', '--mv_old_dlcs', nargs='+', help='Moves old dlcs to another folder')
parser.add_argument('-cr_ilist', '--cr_incl_list', nargs='+', help='Creates an include list from a textfile and a folder, or 2 textfiles')
parser.add_argument('-cr_elist', '--cr_excl_list', nargs='+', help='Creates an exclude list from a textfile and a folder, or 2 textfiles')
parser.add_argument('-cr_xcioutlist', '--cr_outdated_xci_list', nargs='+', help='Creates an outdated xci list from a textfile and a folder')
parser.add_argument('-cr_xexplist', '--cr_expand_list', nargs='+', help='Expands the list with games by baseid')
parser.add_argument('-chdlcn', '--chck_dlc_numb', nargs='+', help='Checks if xci has the correct number of dlcs')
parser.add_argument('-blckl', '--black_list', nargs='+', help='Deletes blacklisted files from a list')
# Archive
if sys.platform == 'win32':
parser.add_argument('-archive','--archive', help='Archive to folder')
parser.add_argument('-zippy','--zippy', help='Zip a file')
parser.add_argument('-joinfile','--joinfile', nargs='+', help='Join split file')
# OTHER
parser.add_argument('-nint_keys','--nint_keys', help='Verify NS keys')
parser.add_argument('-renf','--renamef', help='Rename file with proper name')
parser.add_argument('-renftxt','--renameftxt', help='Rename file with proper name using a text list')
parser.add_argument('-snz','--sanitize', help='Remove unreadable characters from names')
parser.add_argument('-roma','--romanize', nargs='+', help='Translate kanji and extended kana to romaji and sanitize name')
parser.add_argument('-oaid','--onlyaddid', help='Only add title id when renaming')
parser.add_argument('-renm','--renmode', help='Rename mode (force,skip_corr_tid,skip_if_tid)')
parser.add_argument('-addl','--addlangue', help='Add language string')
parser.add_argument('-nover','--noversion', help="Don't add version (false,true,xci_no_v0)")
parser.add_argument('-dlcrn','--dlcrname', help="If false keeps base name in dlcs")
parser.add_argument('-cltg','--cleantags', help="Clean tags in filenames")
parser.add_argument('-tgtype','--tagtype', help="Type of tag to remove")
parser.add_argument('-vorg','--v_organize', help="Aux variable to organize files")
parser.add_argument('-vt','--vertype', help="Verification type for auto, needs --text_file. Opt: dec,sig,full [DECryption, decryption and SIGnature, previous and hash check]")
parser.add_argument('-threads','--threads', help="Number threads to use for certain functions")
parser.add_argument('-pararell','--pararell', help="Run several instances in parallel (use together with --threads)")
parser.add_argument('-lib_call','--library_call', nargs='+', help="Call a library function within squirrel")
parser.add_argument('-loop','--loop', nargs='+', help="Loop the text file using secondary module")
# Hidden
parser.add_argument('-dev_env','--dev_environment', help=argparse.SUPPRESS)#Changes key environment to dev if True
parser.add_argument('-pos','--position', help=argparse.SUPPRESS)#tqdm position, aux argument for pararell
parser.add_argument('-ninst','--n_instances', help=argparse.SUPPRESS)#number of instances, aux argument for pararell
parser.add_argument('-xarg','--explicit_argument', nargs='+', help=argparse.SUPPRESS)#Explicit arguments for lib_call for files with ","
parser.add_argument('-mtpeval','--mtp_eval_link', nargs='+', help=argparse.SUPPRESS)#Explicit arguments for lib_call for files with ","
# -> parser.add_argument('-act', '--action', nargs='+', help=argparse.SUPPRESS)
# -> parser.add_argument('-preverify', '--preverification', nargs='+', help=argparse.SUPPRESS)
# -> parser.add_argument('-verDB', '--verificationDB', nargs='+', help=argparse.SUPPRESS) #verificationDB
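# Example invocations (file and folder names are illustrative):
#   python squirrel.py -i "game.nsp"                       # show info about a title or file
#   python squirrel.py --filelist "game.nsp"               # list files in the secure partition
#   python squirrel.py -cpr "game.nsp" -o "output_folder"  # compress to nsz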
args = parser.parse_args()
Status.start()
indent = 1
tabs = '\t' * indent
trans=False
if args.file==list():
args.file=None
if args.dev_environment:
from importlib import reload
if str(args.dev_environment).upper()=="TRUE":
sq_settings.set_dev_environment()
reload(Keys)
import sq_tools
import listmanager
import Titles
import Fs
import Print
import Nsps
import DBmodule as dbmodule
from hashlib import sha256
from pathlib import Path
from binascii import hexlify as hx, unhexlify as uhx
if sys.platform == 'win32':
import win32con, win32api
import shutil
from tqdm import tqdm
from datetime import datetime
import math
import pykakasi
from Fs.pyNCA3 import NCA3
from shutil import disk_usage
if args.library_call:
if (args.library_call[0]).startswith('Drive.'):
sys.path.insert(0, 'Drive')
args.library_call[0]=str(args.library_call[0]).replace("Drive.", "")
if (args.library_call[0]).startswith('mtp.'):
sys.path.insert(0, 'mtp')
args.library_call[0]=str(args.library_call[0]).replace("mtp.", "")
if (args.library_call[0]).startswith('cmd.'):
sys.path.insert(0, 'cmd')
args.library_call[0]=str(args.library_call[0]).replace("cmd.", "")
import secondary
if args.explicit_argument:
vret=secondary.call_library(args.library_call,args.explicit_argument)
else:
vret=secondary.call_library(args.library_call)
Status.close()
if args.mtp_eval_link:
tfile=args.mtp_eval_link[0]
userfile=args.mtp_eval_link[1]
link=input("Enter your choice: ")
link=link.strip()
if '&' in link:
varout='999'
elif len(link)<2:
varout=link
else:
varout='999'
with open(userfile,"w", encoding='utf8') as userinput:
userinput.write(varout)
if link.startswith('https://1fichier.com'):
with open(tfile,"a", encoding='utf8') as textfile:
textfile.write(link+'\n')
elif link.startswith('https://drive.google.com'):
with open(tfile,"a", encoding='utf8') as textfile:
textfile.write(link+'\n')
if args.threads and not args.compress and not args.decompress:
import secondary
workers=1
try:
workers=int(args.threads)
except:pass
try:
if workers>1:
secondary.route(args,workers)
#secondary.printargs(args)
Status.close()
else:pass
except:pass
elif args.pararell and args.threads :
import secondary
instances=2
if args.pararell=='true':
args.pararell=None
try:
instances=int(args.threads)
if instances<= 0:
instances=1
except:
instances=2
args.threads=0
items=secondary.pararell(args,instances)
if items==0:
try:
os.remove(args.text_file)
except:
pass
for attr in vars(args):
setattr(args,attr,None)
if args.loop and args.ifolder:
if args.loop[0]!='true' and args.loop[0]!='false' and args.text_file!='false':
if os.path.exists(args.text_file):
try:
os.remove(args.text_file)
except:
pass
import secondary
args0=args
args0.type=args0.loop
args0.loop=None
args0.findfile=args0.ifolder
args0.ifolder=None
secondary.pass_command(args0)
args.ifolder=None
args.findfile=None
loop=list()
loop.append('true')
args.loop=loop
if args.loop and args.text_file:
if str(args.loop[0]).lower()=='true':
import secondary
args.loop=None
items=secondary.pass_command(args)
if items==0:
try:
os.remove(args.text_file)
except:
pass
for attr in vars(args):
setattr(args,attr,None)
else:
args.loop=None
# NCA/NSP IDENTIFICATION
# ..................................................
# Get titleid from nca file
# ..................................................
if args.ncatitleid:
for filename in args.ncatitleid:
try:
f = Fs.Nca(filename, 'rb')
f.printtitleId()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Get type from nca file
# ..................................................
if args.ncatype:
for filename in args.ncatype:
try:
f = Fs.Nca(filename, 'rb')
f.print_nca_type()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Get titleid from nsp file
# ..................................................
if args.nsptitleid:
for fileName in args.nsptitleid:
try:
f = Fs.Nsp(fileName, 'r+b')
titleid=f.getnspid()
Print.info(titleid)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Read version number from nsp or xci
# ..................................................
if args.ReadversionID:
for filename in args.ReadversionID:
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
f.get_cnmt_verID()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.get_cnmt_verID()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Identify type of nsp
# ..................................................
if args.nsptype:
for filename in args.nsptype:
try:
f = Fs.Nsp(filename, 'rb')
TYPE=f.nsptype()
print(TYPE)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Identify if nsp has titlerights
# ..................................................
if args.nsp_htrights:
for filename in args.nsp_htrights:
try:
f = Fs.Nsp(filename, 'rb')
if f.trights_set() == 'TRUE':
Print.info('TRUE')
if f.trights_set() == 'FALSE':
Print.info('FALSE')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Identify if nsp has ticket
# ..................................................
if args.nsp_hticket:
for filename in args.nsp_hticket:
try:
f = Fs.Nsp(filename, 'rb')
if f.exist_ticket() == 'TRUE':
Print.info('TRUE')
if f.exist_ticket() == 'FALSE':
Print.info('FALSE')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# REMOVE TITLERIGHTS FUNCTIONS
# ..................................................
# Remove titlerights from input NSP
# ..................................................
if args.remove_title_rights:
for filename in args.remove_title_rights:
try:
f = Fs.Nsp(filename, 'r+b')
f.removeTitleRights()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................................
# Change Master keys
# ..................................................
if args.set_masterkey:
file=args.set_masterkey[0]
if args.set_masterkey[1]:
try:
mkey=int(args.set_masterkey[1])
if mkey==1:
mkey=0
f = Fs.Nsp(file, 'r+b')
f.setMasterKeyRev(mkey)
f.flush()
f.close()
pass
Status.close()
except:
print("Invalid masterkey number")
else:
print("Missing masterkey number")
# ..................................................................
# Remove titlerights from an NSP using information from original NSP
# ..................................................................
if args.RTRNCA_h_nsp:
for filename in args.external:
try:
f = Fs.Nsp(filename, 'r+b')
masterKeyRev=f.nspmasterkey()
titleKeyDec=f.nsptitlekeydec()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
for filename in args.RTRNCA_h_nsp:
try:
f = Fs.Nca(filename, 'r+b')
f.removeTitleRightsnca(masterKeyRev,titleKeyDec)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# .........................................................................
# Remove titlerights from an NCA using information from an extracted TICKET
# .........................................................................
if args.RTRNCA_h_tick:
for filename in args.external:
try:
f = Fs.Ticket(filename, 'r+b')
f.open(filename, 'r+b')
masterKeyRev=f.getMasterKeyRevision()
titleKeyDec=f.get_titlekeydec()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
for filename in args.RTRNCA_h_tick:
try:
f = Fs.Nca(filename, 'r+b')
f.removeTitleRightsnca(masterKeyRev,titleKeyDec)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# GAMECARD FLAG FUNCTIONS
# ...................................................
# Set isgamecard flag for all nca in an NSP as ESHOP
# ...................................................
if args.seteshop:
for filename in args.seteshop:
try:
f = Fs.Nsp(filename, 'r+b')
f.seteshop()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Set isgamecard flag for all nca in an NSP as CARD
# ...................................................
if args.setcgame:
for filename in args.setcgame:
try:
f = Fs.Nsp(filename, 'r+b')
f.setcgame()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Set isgamecard flag for one nca as ESHOP
# ...................................................
if args.seteshop_nca:
for filename in args.seteshop_nca:
try:
f = Fs.Nca(filename, 'r+b')
f.header.setgamecard(0)
Print.info('IsGameCard flag is now set as: ' + str(f.header.getgamecard()))
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Set isgamecard flag for one nca as CARD
# ...................................................
if args.setcgame_nca:
for filename in args.setcgame_nca:
try:
f = Fs.Nca(filename, 'r+b')
f.header.setgamecard(1)
Print.info('IsGameCard flag is now set as: ' + str(f.header.getgamecard()))
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Get isgamecard flag from an NCA file
# ...................................................
if args.cardstate:
for filename in args.cardstate:
try:
f = Fs.Nca(filename, 'rb')
f.cardstate()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Set value for network account
# ...................................................
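# The account-link removal below works in stages: collect every control NACP
# in the container (gen_ctrl_list), patch the network-license field
# (patch_netlicense), then rebuild the protecting hashes bottom-up: each
# hash-tree level (set_lv_hash, from the deepest level up), the superhash
# over the levels (set_lvsuperhash) and finally the hash block of the
# hosting partition (ctrl_upd_hblock_hash). The file is reopened between
# steps, presumably so every write hits disk before the next hash is read.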
if args.remlinkacc:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for inpt in args.remlinkacc:
filename=inpt
try:
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
f = Fs.Nsp(filename,'r+b')
ctrl_list=f.gen_ctrl_list()
f.flush()
f.close()
for item in ctrl_list:
print('-------------------------------------------------')
print('Processing: '+str(item))
print('-------------------------------------------------')
f = Fs.Nsp(filename,'r+b')
check=f.patch_netlicense()
f.flush()
f.close()
if check == True:
f = Fs.Nsp(filename, 'r+b')
leveldata,superhashoffset=f.reb_lv_hashes(item)
f.flush()
f.close()
n=len(leveldata)-1
for i in range(len(leveldata)):
j=n-i
if j==0:
break
f = Fs.Nsp(filename, 'r+b')
superhash=f.set_lv_hash(j,leveldata,item)
f.flush()
f.close()
f = Fs.Nsp(filename, 'r+b')
f.set_lvsuperhash(leveldata,superhashoffset,item)
f.flush()
f.close()
f = Fs.Nsp(filename, 'r+b')
f.ctrl_upd_hblock_hash(item)
f.flush()
f.close()
elif filename.endswith('.xci') or filename.endswith('.xcz'):
f = Fs.factory(filename)
f.open(filename, 'r+b')
ctrl_list=f.gen_ctrl_list()
f.flush()
f.close()
for item in ctrl_list:
print('-------------------------------')
print('Processing: '+str(item))
print('-------------------------------')
f = Fs.factory(filename)
f.open(filename, 'r+b')
check=f.patch_netlicense(item)
f.flush()
f.close()
if check == True:
f = Fs.factory(filename)
f.open(filename, 'r+b')
leveldata,superhashoffset=f.reb_lv_hashes(item)
f.flush()
f.close()
n=len(leveldata)-1
for i in range(len(leveldata)):
j=n-i
if j==0:
break
f = Fs.factory(filename)
f.open(filename, 'r+b')
superhash=f.set_lv_hash(j,leveldata,item)
f.flush()
f.close()
f = Fs.factory(filename)
f.open(filename, 'r+b')
f.set_lvsuperhash(leveldata,superhashoffset,item)
f.flush()
f.close()
f = Fs.factory(filename)
f.open(filename, 'r+b')
f.ctrl_upd_hblock_hash(item)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# COPY FUNCTIONS
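# The copy helpers below share one pattern: --ofolder overrides the output
# folder (default: an 'output' folder beside the input file) and --buffer
# overrides the 64 KiB (65536-byte) read buffer.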
# ...................................................
# Copy TICKET from NSP file
# ...................................................
if args.NSP_copy_ticket:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_ticket:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
for filename in args.NSP_copy_ticket:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_ticket(ofolder)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all FILES from NSP/XCI file
# ...................................................
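# Extraction handles NSP/NSX/NSZ containers directly and routes XCI/XCZ
# through Fs.factory; the output folder defaults to one named after the
# input file.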
if args.extract:
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
ofolder=False
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
else:
for filename in args.extract:
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
test=filename.lower()
if test.endswith('.nsp') or test.endswith('.nsx') or test.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
f.open(filename, 'rb')
f.extract_all(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif test.endswith('.xci') or test.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.extract_all(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all NCA from NSP file
# ...................................................
if args.NSP_copy_nca:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
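		# 268435656 = (4 << 26) + 200, which appears to encode a 4.x-range
		# system version; used as the default RSV cap when --RSVcap is absent.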
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
for filename in args.NSP_copy_nca:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................................
# Copy all hfs0 partitions (update, normal, secure, logo) from XCI file
# ...................................................................
if args.XCI_copy_hfs0:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_copy_hfs0:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filePath in args.XCI_copy_hfs0:
f = Fs.factory(filePath)
f.open(filePath, 'rb')
f.copy_hfs0(ofolder,buffer,"all")
f.close()
Status.close()
# ...........................................
# Copy update partition from XCI file as hfs0
# ...........................................
if args.XCI_c_hfs0_update:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_c_hfs0_update:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filePath in args.XCI_c_hfs0_update:
f = Fs.factory(filePath)
f.open(filePath, 'rb')
f.copy_hfs0(ofolder,buffer,"update")
f.close()
Status.close()
# ...........................................
# Copy normal partition from XCI file as hfs0
# ...........................................
if args.XCI_c_hfs0_normal:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_c_hfs0_normal:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filePath in args.XCI_c_hfs0_normal:
f = Fs.factory(filePath)
f.open(filePath, 'rb')
f.copy_hfs0(ofolder,buffer,"normal")
f.close()
Status.close()
# ...........................................
# Copy secure partition from XCI file as hfs0
# ...........................................
if args.XCI_c_hfs0_secure:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_c_hfs0_secure:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filePath in args.XCI_c_hfs0_secure:
f = Fs.factory(filePath)
f.open(filePath, 'rb')
f.copy_hfs0(ofolder,buffer,'secure')
f.close()
Status.close()
# ...........................................
# Copy nca from secure partition from XCI
# ...........................................
if args.XCI_copy_nca_secure:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_copy_nca_secure:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
for filePath in args.XCI_copy_nca_secure:
f = Fs.Xci(filePath)
f.open(filePath, 'rb')
f.copy_nca(ofolder,buffer,'secure',metapatch,vkeypatch,int(RSV_cap))
f.close()
Status.close()
# ...........................................
# Copy nca from normal partition from XCI
# ...........................................
if args.XCI_copy_nca_normal:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
		for filename in args.XCI_copy_nca_normal:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
for filePath in args.XCI_copy_nca_normal:
f = Fs.nXci(filePath)
f.open(filePath, 'rb')
f.copy_nca(ofolder,buffer,'normal',metapatch,vkeypatch,int(RSV_cap))
f.close()
Status.close()
# ...........................................
# Copy nca from update partition from XCI
# ...........................................
if args.XCI_copy_nca_update:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
		for filename in args.XCI_copy_nca_update:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
for filePath in args.XCI_copy_nca_update:
f = Fs.uXci(filePath)
f.open(filePath, 'rb')
f.copy_nca(ofolder,buffer,'update',metapatch,vkeypatch,int(RSV_cap))
f.close()
Status.close()
# ...........................................
# Copy root.hfs0 from XCI
# ...........................................
if args.XCI_copy_rhfs0:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.XCI_copy_rhfs0:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filePath in args.XCI_copy_rhfs0:
f = Fs.factory(filePath)
f.open(filePath, 'rb')
f.copy_root_hfs0(ofolder,buffer)
f.close()
Status.close()
# ...................................................
# Copy OTHER KINDS OF FILES from NSP file
# ...................................................
if args.NSP_copy_other:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_other:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_other:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_other(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy XML from NSP file
# ...................................................
if args.NSP_copy_xml:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_xml:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_xml:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_xml(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy CERT from NSP file
# ...................................................
if args.NSP_copy_cert:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_cert:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_cert:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nsp_cert(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy JPG from NSP file
# ...................................................
if args.NSP_copy_jpg:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_jpg:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_jpg:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_jpg(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy meta cnmt files from NSP, XCI or cnmt.nca file
# ...................................................
if args.NSP_copy_cnmt:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_cnmt:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_cnmt:
if filename.endswith('.nsp') or filename.endswith('.nsz') or filename.endswith('.nsx'):
try:
f = Fs.Nsp(filename, 'rb')
f.copy_cnmt(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.copy_cnmt(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.cnmt.nca'):
try:
f = Fs.Nca(filename)
f.open(filename, 'rb')
data=f.return_cnmt()
f.flush()
f.close()
f = Fs.Nca(filename)
f.open(filename, 'rb')
filenames=f.ret_cnmt_name()
f.flush()
f.close()
try:
basename=str(filenames[0])
except:
basename=(str(os.path.basename(os.path.abspath(filename))))[:-4]
ofile =os.path.join(ofolder,basename)
with open (ofile,'wb') as o:
o.write(data)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy pfs0 from NSP file
# ...................................................
if args.copy_pfs0_meta:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.copy_pfs0_meta:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.copy_pfs0_meta:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_pfs0_meta(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy control nacp files from NSP file
# ...................................................
if args.copy_nacp:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.copy_nacp:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.copy_nacp:
if filename.endswith(".nsp"):
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nacp(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
'''
if filename.endswith(".nca"):
try:
f = Fs.Nca(filename, 'rb')
f.extract(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
'''
Status.close()
# DEDICATED COPY FUNCTIONS. NCA TYPES.
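# Each helper below filters the container's NCAs by content type: Meta
# (CNMT), Control (NACP plus icons), Manual (HtmlDocument/LegalInformation),
# Program, Data and PublicData.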
# ...................................................
# Copy all META NCA from NSP file
# ...................................................
if args.NSP_copy_nca_meta:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_meta:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_meta:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_meta(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all CONTROL NCA from NSP file
# ...................................................
if args.NSP_copy_nca_control:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_control:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_control:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_control(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all MANUAL NCA from NSP file
# ...................................................
if args.NSP_copy_nca_manual:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_manual:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_manual:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_manual(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all PROGRAM NCA from NSP file
# ...................................................
if args.NSP_copy_nca_program:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_program:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_program:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_program(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all DATA NCA from NSP file
# ...................................................
if args.NSP_copy_nca_data:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_data:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_data:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_data(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all PUBLIC DATA NCA from NSP file
# ...................................................
if args.NSP_copy_nca_pdata:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_nca_pdata:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_nca_pdata:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_pdata(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# DEDICATED COPY FUNCTIONS. TITLERIGHTS.
# ...................................................
# Copy all NCA WITH TITLERIGHTS from target NSP
# ...................................................
if args.NSP_copy_tr_nca:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_tr_nca:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_tr_nca:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_tr_nca(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy all NCA WITHOUT TITLERIGHTS from target NSP
# ...................................................
if args.NSP_copy_ntr_nca:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_copy_ntr_nca:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.NSP_copy_ntr_nca:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_ntr_nca(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................................
# Copy ALL NCA AND CLEAN TITLERIGHTS
# ..................................
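# Cleaning strips titlerights: content without titlerights is copied as-is
# (copy_nca), while content with titlerights also needs its ticket so the
# title key can be recovered and the key area rewritten to standard crypto
# (cr_tr_nca); without a ticket the conversion is impossible.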
if args.C_clean:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
else:
for filename in args.C_clean:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'true'
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
if args.C_clean:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.C_clean:
filename=filename
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
if f.trights_set() == 'FALSE':
Print.info("NSP DOESN'T HAVE TITLERIGHTS")
f.copy_nca(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.trights_set() == 'TRUE':
if f.exist_ticket() == 'TRUE':
Print.info("NSP HAS TITLERIGHTS AND TICKET EXISTS")
f.cr_tr_nca(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.exist_ticket() == 'FALSE':
Print.error('NSP FILE HAS TITLERIGHTS BUT NO TICKET')
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
if f.trights_set() == 'FALSE':
Print.info("XCI DOESN'T HAVE TITLERIGHTS")
f.copy_nca(ofolder,buffer,'secure',metapatch,vkeypatch,int(RSV_cap))
if f.trights_set() == 'TRUE':
if f.exist_ticket() == 'TRUE':
Print.info("XCI HAS TITLERIGHTS AND TICKET EXISTS")
f.cr_tr_nca(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.exist_ticket() == 'FALSE':
Print.error('XCI FILE HAS TITLERIGHTS BUT NO TICKET')
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Copy ALL NCA AND CLEAN TITLERIGHTS WITHOUT DELTAS
# ...................................................
if args.C_clean_ND:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
else:
for filename in args.C_clean_ND:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
if args.C_clean_ND:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.C_clean_ND:
filename=filename
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
if f.trights_set() == 'FALSE':
Print.info("NSP DOESN'T HAVE TITLERIGHTS")
f.copy_nca_nd(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.trights_set() == 'TRUE':
if f.exist_ticket() == 'TRUE':
Print.info("NSP HAS TITLERIGHTS AND TICKET EXISTS")
f.cr_tr_nca_nd(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.exist_ticket() == 'FALSE':
Print.error('NSP FILE HAS TITLERIGHTS BUT NO TICKET')
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
if f.trights_set() == 'FALSE':
Print.info("XCI DOESN'T HAVE TITLERIGHTS")
f.copy_nca_nd(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.trights_set() == 'TRUE':
if f.exist_ticket() == 'TRUE':
Print.info("XCI HAS TITLERIGHTS AND TICKET EXISTS")
f.cr_tr_nca_nd(ofolder,buffer,metapatch,vkeypatch,int(RSV_cap))
if f.exist_ticket() == 'FALSE':
Print.error('XCI FILE HAS TITLERIGHTS BUT NO TICKET')
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ........................................................
# Copy keyblock from nca files with titlerights from an NSP
# ........................................................
if args.NSP_c_KeyBlock:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.NSP_c_KeyBlock:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
for filename in args.NSP_c_KeyBlock:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_KeyBlock(ofolder)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# DEDICATED COPY FUNCTIONS. SPLIT OR UPDATE.
# ............................................................
# Split content by titleid according to cnmt files
# ............................................................
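# Splitting groups a multi-title container by title id, as described by its
# CNMT files, writing one output per title (optionally suffixed via
# --pathend).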
if args.splitter:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
else:
for filename in args.splitter:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.pathend:
for input in args.pathend:
try:
pathend = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
pathend = ''
if args.splitter:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.splitter:
filename=filename
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
f.splitter_read(ofolder,buffer,pathend)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.splitter_read(ofolder,buffer,pathend)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ............................................................
# Prepare base content to get it updated
# ............................................................
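# updbase prepares a base game for an update: content selection is driven
# by --cskip (the exact skip semantics live in updbase_read), with optional
# metadata version patching and an RSV cap.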
if args.updbase:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.updbase:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.cskip:
for input in args.cskip:
try:
cskip = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
		cskip = 'false'
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
for filename in args.updbase:
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
				f.updbase_read(ofolder,buffer,cskip,metapatch,vkeypatch,int(RSV_cap))
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
				f.updbase_read(ofolder,buffer,cskip,metapatch,vkeypatch,int(RSV_cap))
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# COMBINATIONS
# ............................................................
# Get nca files to make a placeholder in eshop format from NSP
# ............................................................
'''
parser.add_argument('--gen_placeholder', nargs='+', help='Creates nsp or xci placeholder')
'''
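# A placeholder rebuilds a container skeleton: the header is generated from
# the content list (sq_tools.gen_nsp_header / get_xciheader) and the content
# files are appended afterwards. When several inputs carry the same content
# id, only the highest-versioned one is kept.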
if args.gen_placeholder:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
		folder = args.gen_placeholder[0]
dir=os.path.abspath(folder)
ofolder = os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
ruta=os.path.abspath(filename.rstrip('\n'))
else:
		ruta=args.gen_placeholder[0]
indent = 1
tabs = '\t' * indent
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
if args.type:
for t in args.type:
x='.'+t
extlist.append(x)
if x[-1]=='*':
x=x[:-1]
extlist.append(x)
#print(extlist)
if args.filter:
for f in args.filter:
filter=f
filelist=list()
ruta=str(ruta)
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
					for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(filename)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
'''
for f in filelist:
print(f)
'''
print('Files to process: '+str(len(filelist)))
counter=len(filelist)
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsx'):
export='nsp'
try:
prlist=list()
f = Fs.Nsp(filepath)
contentlist=f.get_content_placeholder(ofolder)
#print(contentlist)
f.flush()
f.close()
				if len(prlist)==0:
					for i in contentlist:
						prlist.append(i)
				else:
					# Keep one entry per content id, preferring the highest
					# version (tuple fields [1]=id, [6]=version are assumed).
					for j in range(len(contentlist)):
						notinlist=True
						for i in range(len(prlist)):
							if contentlist[j][1] == prlist[i][1]:
								if contentlist[j][6] > prlist[i][6]:
									del prlist[i]
									prlist.append(contentlist[j])
								notinlist=False
								break
						if notinlist == True:
							prlist.append(contentlist[j])
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
continue
if filepath.endswith('.xci'):
export='xci'
try:
prlist=list()
f = Fs.Xci(filepath)
contentlist=f.get_content_placeholder(ofolder)
#print(contentlist)
f.flush()
f.close()
				if len(prlist)==0:
					for i in contentlist:
						prlist.append(i)
				else:
					# Same dedupe as the NSP branch: one entry per content id,
					# keeping the highest version.
					for j in range(len(contentlist)):
						notinlist=True
						for i in range(len(prlist)):
							if contentlist[j][1] == prlist[i][1]:
								if contentlist[j][6] > prlist[i][6]:
									del prlist[i]
									prlist.append(contentlist[j])
								notinlist=False
								break
						if notinlist == True:
							prlist.append(contentlist[j])
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
continue
if export=='nsp':
oflist=list()
osizelist=list()
totSize=0
#print(prlist)
for i in range(len(prlist)):
for j in prlist[i][4]:
oflist.append(j[0])
osizelist.append(j[1])
totSize = totSize+j[1]
basename=str(os.path.basename(os.path.abspath(filepath)))
endname=basename[:-4]+'[PLH].nsp'
endfile = os.path.join(ofolder, endname)
#print(str(filepath))
#print(str(endfile))
nspheader=sq_tools.gen_nsp_header(oflist,osizelist)
#print(endfile)
#print(hx(nspheader))
totSize = len(nspheader) + totSize
#print(str(totSize))
vskip=False
print('Processing: '+str(filepath))
if os.path.exists(endfile) and os.path.getsize(endfile) == totSize:
print('- Placeholder file already exists, skipping...')
vskip=True
else:
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(endfile)
else:
v_drive = os.path.dirname(os.path.abspath(endfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
if vskip==False:
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
outf = open(endfile, 'w+b')
t.write(tabs+'- Writing NSP header...')
outf.write(nspheader)
t.update(len(nspheader))
outf.close()
if filepath.endswith('.nsp') or filepath.endswith('.nsx'):
try:
f = Fs.Nsp(filepath)
for file in oflist:
if not file.endswith('xml'):
f.append_content(endfile,file,buffer,t)
f.flush()
f.close()
t.close()
counter=int(counter)
counter-=1
print(tabs+'> Placeholder was created')
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
if export=='xci':
oflist=list()
osizelist=list()
ototlist=list()
totSize=0
for i in range(len(prlist)):
for j in prlist[i][4]:
el=j[0]
if el.endswith('.nca'):
oflist.append(j[0])
#print(j[0])
totSize = totSize+j[1]
#print(j[1])
ototlist.append(j[0])
sec_hashlist=list()
GClist=list()
if filepath.endswith('.xci'):
try:
f = Fs.Xci(filepath)
for file in oflist:
sha,size,gamecard=f.file_hash(file)
if sha != False:
sec_hashlist.append(sha)
osizelist.append(size)
GClist.append([file,gamecard])
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
basename=str(os.path.basename(os.path.abspath(filepath)))
endname=basename[:-4]+'[PLH].xci'
endfile = os.path.join(ofolder, endname)
#print(str(filepath))
#print(str(endfile))
xci_header,game_info,sig_padding,xci_certificate,root_header,upd_header,norm_header,sec_header,rootSize,upd_multiplier,norm_multiplier,sec_multiplier=sq_tools.get_xciheader(oflist,osizelist,sec_hashlist)
totSize=len(xci_header)+len(game_info)+len(sig_padding)+len(xci_certificate)+rootSize
#print(str(totSize))
vskip=False
print('Processing: '+str(filepath))
if os.path.exists(endfile) and os.path.getsize(endfile) == totSize:
print('- Placeholder file already exists, skipping...')
vskip=True
else:
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(endfile)
else:
v_drive = os.path.dirname(os.path.abspath(endfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
if vskip==False:
c=0
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
t.write(tabs+'- Writing XCI header...')
outf = open(endfile, 'w+b')
outf.write(xci_header)
t.update(len(xci_header))
c=c+len(xci_header)
t.write(tabs+'- Writing XCI game info...')
outf.write(game_info)
t.update(len(game_info))
c=c+len(game_info)
t.write(tabs+'- Generating padding...')
outf.write(sig_padding)
t.update(len(sig_padding))
c=c+len(sig_padding)
t.write(tabs+'- Writing XCI certificate...')
outf.write(xci_certificate)
t.update(len(xci_certificate))
c=c+len(xci_certificate)
t.write(tabs+'- Writing ROOT HFS0 header...')
outf.write(root_header)
t.update(len(root_header))
c=c+len(root_header)
t.write(tabs+'- Writing UPDATE partition header...')
t.write(tabs+' Calculated multiplier: '+str(upd_multiplier))
outf.write(upd_header)
t.update(len(upd_header))
c=c+len(upd_header)
t.write(tabs+'- Writing NORMAL partition header...')
t.write(tabs+' Calculated multiplier: '+str(norm_multiplier))
outf.write(norm_header)
t.update(len(norm_header))
c=c+len(norm_header)
t.write(tabs+'- Writing SECURE partition header...')
t.write(tabs+' Calculated multiplier: '+str(sec_multiplier))
outf.write(sec_header)
t.update(len(sec_header))
c=c+len(sec_header)
outf.close()
if filepath.endswith('.xci'):
try:
GC=False
f = Fs.Xci(filepath)
for file in oflist:
if not file.endswith('xml'):
for i in range(len(GClist)):
if GClist[i][0] == file:
GC=GClist[i][1]
f.append_content(endfile,file,buffer,t,includexml=False)
f.flush()
f.close()
t.close()
counter=int(counter)
counter-=1
print(tabs+'> Placeholder was created')
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ............................................................
# Get files to make a [lc].nsp from NSP
# ............................................................
if args.license_combo:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.license_combo:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.license_combo:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_control(ofolder,buffer)
f.copy_ticket(ofolder)
f.copy_nsp_cert(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ............................................................
# Get files to make a placeholder+license nsp from a NSP
# ............................................................
if args.mlicense_combo:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.mlicense_combo:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.mlicense_combo:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_control(ofolder,buffer)
f.copy_nca_meta(ofolder,buffer)
f.copy_ticket(ofolder)
f.copy_nsp_cert(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ............................................................
# Get files to make a zip that restores an nsp to its original state
# ............................................................
if args.zip_combo:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.zip_combo:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
for filename in args.zip_combo:
try:
f = Fs.Nsp(filename, 'rb')
f.copy_nca_meta(ofolder,buffer)
f.copy_ticket(ofolder)
f.copy_other(ofolder,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# REPACK
# ...................................................
# Repack NCA files to NSP
# ...................................................
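# Repacking orders entries the way an eshop NSP lays them out: NCAs in CNMT
# order first, then loose .cnmt files, .jpg icons and finally .tik/.cert.
# --fat (fat32/exfat) and --fexport (files/folder) tune how output is written.
# Illustrative invocation (script name and long flags assumed from the
# argparse dests):
#   python script.py --create out.nsp --ifolder ./extracted_ncas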
if args.create:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if args.fexport:
for input in args.fexport:
try:
if input == "files":
fx="files"
else:
fx="folder"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fx="files"
if args.ifolder:
ruta = args.ifolder
f_list = list()
ncalist = list()
orderlist = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
if f.endswith('.cnmt.nca'):
try:
filepath = os.path.join(ruta, f)
nca = Fs.Nca(filepath, 'r+b')
ncalist=ncalist+nca.ncalist_bycnmt()
except BaseException as e:
Print.error('Exception: ' + str(e))
for f in fnames:
filepath = os.path.join(ruta, f)
f_list.append(filepath)
for f in ncalist:
fpath= os.path.join(ruta, f)
if fpath in f_list:
orderlist.append(fpath)
for f in fnames:
if f.endswith('.cnmt'):
fpath= os.path.join(ruta, f)
orderlist.append(fpath)
for f in fnames:
if f.endswith('.jpg'):
fpath= os.path.join(ruta, f)
orderlist.append(fpath)
for f in fnames:
if f.endswith('.tik') or f.endswith('.cert'):
fpath= os.path.join(ruta, f)
orderlist.append(fpath)
nsp = Fs.Nsp(None, None)
nsp.path = args.create
nsp.pack(orderlist,buffer,fat,fx)
#print (f_list)
#print (fnames)
#print (ncalist)
#print (orderlist)
else:
nsp = Fs.Nsp(None, None)
nsp.path = args.create
nsp.pack(args.file,buffer,fat,fx)
#for filePath in args.file:
# Print.info(filePath)
Status.close()
# parser.add_argument('-cpr', '--compress', help='Compress a nsp or xci')
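# Compression levels are zstd levels (clamped to 1-22, default 17); a
# trailing integer on --compress selects the level. NSP compresses to NSZ,
# while XCI exports to XCZ (supertrim) or, with --fexport nsz, to NSZ.
# Illustrative invocation (script name and flag spelling assumed):
#   python script.py --compress game.nsp 17 --ofolder ./out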
if args.compress:
if args.position:
try:
position=int(args.position)
except:
position=False
else:
position=False
if args.n_instances:
try:
n_instances=int(args.n_instances)
except:
n_instances=False
else:
n_instances=False
if args.nodelta:
for input in args.nodelta:
try:
if input == "true" or input == "True" or input == "TRUE":
delta=False
elif input == "false" or input == "False" or input == "FALSE":
delta=True
else:
delta=False
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
delta=False
if args.fexport:
for input in args.fexport:
try:
if input == "nsz":
xci_exp="nsz"
elif input == "xcz":
xci_exp="xcz"
else:
xci_exp="xcz"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
xci_exp="xcz"
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.compress:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder =os.path.join(dir, 'output')
workers=0
if args.threads:
try:
if workers=="-1":
workers=-1
else:
workers=int(args.threads)
if workers<0:
workers=0
elif workers>4:
workers=4
except:
workers=0
if args.compress:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
if isinstance(args.compress, list):
inputs=len(args.compress)
try:
if inputs==1:
level=int(args.compress[0])
elif inputs>1:
level=int(args.compress[(int(inputs)-1)])
else:
level=17
except:
level=17
else:
try:
level=int(args.compress)
except:
level=17
else:
if isinstance(args.compress, list):
filepath=args.compress[0]
inputs=len(args.compress)
if inputs>1:
level=int(args.compress[(int(inputs)-1)])
else:
level=17
else:
filepath=args.compress
level=17
if filepath.endswith(".nsp") or filepath.endswith(".xci"):
import compressor
try:
level=int(level)
if level>22:
level=22
if level<1:
level=1
except:
level=17
if filepath.endswith(".nsp"):
compressor.compress(filepath,ofolder,level,workers,delta,pos=position,nthreads=n_instances)
elif filepath.endswith(".xci"):
basename=os.path.basename(os.path.abspath(filepath))
if xci_exp=='nsz':
outfile=basename[:-3]+'nsz'
outfile =os.path.join(ofolder,outfile)
nszPath=compressor.xci_to_nsz(filepath,buffer=65536,outfile=outfile,keepupd=False,level = level, threads = workers,pos=position,nthreads=n_instances)
try:
f=Fs.Nsp(nszPath,'rb+')
f.seteshop()
f.flush()
f.close()
except:pass
else:
outfile=basename[:-3]+'xcz'
outfile =os.path.join(ofolder,outfile)
compressor.supertrim_xci(filepath,buffer=65536,outfile=outfile,keepupd=False,level = level, threads = workers,pos=position,nthreads=n_instances)
# parser.add_argument('-dcpr', '--decompress', help='deCompress a nsz, xcz or ncz')
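# Decompression dispatches on the extension: .nsz -> .nsp and .xcz -> .xci,
# delegating the actual work to the decompressor module. Illustrative
# invocation (script name and flag spelling assumed):
#   python script.py --decompress game.nsz --ofolder ./out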
if args.decompress:
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.decompress:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder =os.path.join(dir, 'output')
break
if args.decompress:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
for inpt in args.decompress:
filepath=inpt
break
if filepath.endswith(".nsz"):
import decompressor
basename=os.path.basename(os.path.abspath(filepath))
endname=basename[:-1]+'p'
endname =os.path.join(ofolder,endname)
decompressor.decompress_nsz(filepath,endname)
if filepath.endswith(".xcz"):
import decompressor
basename=os.path.basename(os.path.abspath(filepath))
endname=basename[:-3]+'xci'
endname =os.path.join(ofolder,endname)
decompressor.decompress_xcz(filepath,endname)
# ...................................................
# Repack NCA files to partition hfs0
# ...................................................
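# Note: the folder walks below join each file name against the root folder
# rather than os.walk's dirpath, so input folders are effectively assumed
# to be flat (no subdirectories).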
if args.create_hfs0:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
hfs0 = Fs.Hfs0(None, None)
hfs0.path = args.create_hfs0
if args.ifolder:
ruta = args.ifolder
f_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
f_list.append(filepath)
hfs0.pack(f_list,buffer)
else:
hfs0.pack(args.file,buffer)
Status.close()
# ...................................................
# Repack NCA files to root_hfs0
# ...................................................
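# A root HFS0 nests the update/normal/secure partitions; any partition whose
# folder is missing is packed empty.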
if args.create_rhfs0:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ifolder:
ruta = args.ifolder
ruta_update=os.path.join(ruta, "update")
ruta_normal=os.path.join(ruta, "normal")
ruta_secure=os.path.join(ruta, "secure")
if os.path.isdir(ruta_update):
upd_list = list()
for dirpath, dnames, fnames in os.walk(ruta_update):
for f in fnames:
filepath = os.path.join(ruta_update, f)
upd_list.append(filepath)
else:
upd_list = list()
if os.path.isdir(ruta_normal):
norm_list = list()
for dirpath, dnames, fnames in os.walk(ruta_normal):
for f in fnames:
filepath = os.path.join(ruta_normal, f)
norm_list.append(filepath)
else:
norm_list = list()
if os.path.isdir(ruta_secure):
sec_list = list()
for dirpath, dnames, fnames in os.walk(ruta_secure):
for f in fnames:
filepath = os.path.join(ruta_secure, f)
sec_list.append(filepath)
else:
sec_list = list()
else:
if args.ifolder_update:
ruta = args.ifolder_update
upd_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
upd_list.append(filepath)
else:
upd_list = list()
if args.ifolder_normal:
ruta = args.ifolder_normal
norm_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
norm_list.append(filepath)
else:
norm_list = list()
if args.ifolder_secure:
ruta = args.ifolder_secure
sec_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
sec_list.append(filepath)
else:
sec_list = list()
#print (upd_list)
#print (norm_list)
#print (sec_list)
hfs0 = Fs.Hfs0(None, None)
hfs0.path = args.create_rhfs0
hfs0.pack_root(upd_list,norm_list,sec_list,buffer)
Status.close()
# ...................................................
# Repack NCA files to xci
# ...................................................
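# Repacks NCAs into a full XCI. 'fat' picks the container mode: "fat32"
# splits the output at the FAT32 size limit, "exfat" writes a single file.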
if args.create_xci:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if args.ifolder:
ruta = args.ifolder
ruta_update=os.path.join(ruta, "update")
ruta_normal=os.path.join(ruta, "normal")
ruta_secure=os.path.join(ruta, "secure")
if os.path.isdir(ruta_update):
upd_list = list()
for dirpath, dnames, fnames in os.walk(ruta_update):
for f in fnames:
filepath = os.path.join(ruta_update, f)
upd_list.append(filepath)
else:
upd_list = list()
if os.path.isdir(ruta_normal):
norm_list = list()
for dirpath, dnames, fnames in os.walk(ruta_normal):
for f in fnames:
filepath = os.path.join(ruta_normal, f)
norm_list.append(filepath)
else:
norm_list = list()
if os.path.isdir(ruta_secure):
sec_list = list()
for dirpath, dnames, fnames in os.walk(ruta_secure):
for f in fnames:
filepath = os.path.join(ruta_secure, f)
sec_list.append(filepath)
else:
sec_list = list()
else:
if args.ifolder_update:
ruta = args.ifolder_update
upd_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
upd_list.append(filepath)
else:
upd_list = list()
if args.ifolder_normal:
ruta = args.ifolder_normal
norm_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
norm_list.append(filepath)
else:
norm_list = list()
if args.ifolder_secure:
ruta = args.ifolder_secure
sec_list = list()
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
sec_list.append(filepath)
else:
sec_list = list()
#print (upd_list)
#print (norm_list)
#print (sec_list)
xci = Fs.Xci(None)
xci.path = args.create_xci
xci.pack(upd_list,norm_list,sec_list,buffer,fat)
Status.close()
# ...................................................
# Supertrim an xci
# ...................................................
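# "Supertrim" rebuilds an XCI keeping only the useful data; the optional
# second argument "keepupd" preserves the update partition, and
# nodecompress (for .xcz input) controls whether content is decompressed
# first (behaviour inferred from the supertrim() call sites below).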
if args.xci_super_trim:
try:
if str(args.xci_super_trim[1]).lower() == "keepupd":
keepupd=True
else:
keepupd=False
except Exception:
keepupd=False
try:
if str(args.nodecompress).lower() == "true":
nodecompress=True
else:
nodecompress=False
except Exception:
nodecompress=True
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
if args.xci_super_trim[0] !="":
filepath=args.xci_super_trim[0]
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder =os.path.join(dir, 'output')
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if filepath.endswith('.xci'):
try:
f = Fs.factory(filepath)
filename=os.path.basename(os.path.abspath(filepath))
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.supertrim(buffer,outfile,ofolder,fat,keepupd)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif filepath.endswith('.xcz'):
f = Fs.Xci(filepath)
filename=os.path.basename(os.path.abspath(filepath))
outfile = os.path.join(ofolder, filename)
f.supertrim(buffer,outfile,ofolder,keepupd,nodecompress=nodecompress)
f.flush()
f.close()
Status.close()
# ...................................................
# Normal trimming for xci files
# ...................................................
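# Normal trim: drops the empty padding at the end of a cartridge dump
# without rebuilding its partitions.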
if args.xci_trim:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
dir=os.path.dirname(os.path.abspath(filename))
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
ofolder =os.path.join(dir, 'output')
else:
for filename in args.xci_trim:
dir=os.path.dirname(os.path.abspath(filename))
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
ofolder =os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if not args.text_file:
for filepath in args.xci_trim:
if filepath.endswith('.xci'):
try:
f = Fs.factory(filepath)
filename=os.path.basename(os.path.abspath(filepath))
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.trim(buffer,outfile,ofolder,fat)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
filepath=filename
if filepath.endswith('.xci'):
try:
f = Fs.factory(filepath)
filename=os.path.basename(os.path.abspath(filepath))
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.trim(buffer,outfile,ofolder,fat)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Untrimming for xci files
# ...................................................
#parser.add_argument('-xci_untr', '--xci_untrim', nargs='+', help='Untrims xci')
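# Untrim: restores a trimmed dump back to full cartridge size by
# re-appending the padding removed by the trim operation above.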
if args.xci_untrim:
filename=None
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
if not args.ofolder:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
elif not args.ofolder:
for filename in args.xci_untrim:
if filename.endswith('.xci'):
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
break
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if filename is None:
for filepath in args.xci_untrim:
if filepath.endswith('.xci'):
filename=filepath
filepath=filename
try:
f = Fs.factory(filepath)
filename=os.path.basename(os.path.abspath(filepath))
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.untrim(buffer,outfile,ofolder,fat)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Strip delta fragments
# ...................................................
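# Rebuilds an NSP/NSZ without its delta-fragment NCAs (patch-download
# leftovers); xml_gen optionally regenerates the XML metadata entries.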
if args.erase_deltas:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.erase_deltas:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder = os.path.join(dir, 'output')
if args.xml_gen:
for input in args.xml_gen:
try:
if input == "true" or input == "True" or input == "TRUE":
xml_gen=True
elif input == "false" or input == "False" or input == "FALSE":
xml_gen=False
else:
xml_gen=False
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.erase_deltas:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
for filepath in args.erase_deltas:
filepath=filepath
endfile=os.path.basename(os.path.abspath(filepath))
endfile=os.path.join(ofolder,endfile)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if filepath.endswith(".nsp") or filepath.endswith(".nsz"):
try:
print('Processing: '+filepath)
f = Fs.Nsp(filepath)
f.rebuild(buffer,endfile,False,True,xml_gen)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Rebuild
# ...................................................
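# Rebuild: rewrites an NSP (or decompresses an NSZ when type is nsp) into
# a clean NSP; nodelta drops delta fragments, xml_gen regenerates XML, and
# v_organize sorts results into base/updates/dlcs subfolders.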
if args.rebuild_nsp:
skipper=False
Damage=False
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.type:
for input in args.type:
if input == "nsp":
export='nsp'
elif input == "nsz":
export='nsz'
else:
export='nsp'
else:
export='nsp'
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
elif args.ifolder:
filepath=args.ifolder
else:
for filepath in args.rebuild_nsp:
filepath=filepath
if args.nodelta:
for input in args.nodelta:
try:
if input == "true" or input == "True" or input == "TRUE":
delta=False
elif input == "false" or input == "False" or input == "FALSE":
delta=True
else:
delta=False
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
delta=True
if args.xml_gen:
for input in args.xml_gen:
try:
if input == "true" or input == "True" or input == "TRUE":
xml_gen=True
elif input == "false" or input == "False" or input == "FALSE":
xml_gen=False
else:
xml_gen=False
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
xml_gen=False
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.rebuild_nsp:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder = os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
endfile=os.path.basename(os.path.abspath(filepath))
endfile=os.path.join(ofolder,endfile)
if args.v_organize:
if args.v_organize != 'false':
base_folder=os.path.join(ofolder,'base')
update_folder=os.path.join(ofolder,'updates')
dlc_folder=os.path.join(ofolder,'dlcs')
if not os.path.exists(base_folder):
os.makedirs(base_folder)
if not os.path.exists(update_folder):
os.makedirs(update_folder)
if not os.path.exists(dlc_folder):
os.makedirs(dlc_folder)
try:
f = Fs.Nsp(filepath)
ctype=f.nsptype()
#print(ctype)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Damage=True
skipper=True
print('Content seems to be damaged')
if Damage==False:
if ctype=='BASE':
endfile=os.path.basename(os.path.abspath(filepath))
endfile=os.path.join(base_folder,endfile)
elif ctype=='UPDATE':
endfile=os.path.basename(os.path.abspath(filepath))
endfile=os.path.join(update_folder,endfile)
elif ctype=='DLC':
endfile=os.path.basename(os.path.abspath(filepath))
endfile=os.path.join(dlc_folder,endfile)
else:
print("Content can't be identified")
skipper=True
print('Final destination:')
print(' > '+endfile)
if os.path.exists(endfile):
skipper=True
print("Content exists in final destination. Skipping...")
if not args.ifolder:
if args.rebuild_nsp and skipper==False:
if filepath.endswith(".nsp"):
try:
print('Processing: '+filepath)
f = Fs.Nsp(filepath)
f.rebuild(buffer,endfile,delta,False,xml_gen)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif filepath.endswith(".nsz"):
if export == 'nsp':
try:
import decompressor
basename=os.path.basename(os.path.abspath(filepath))
endname=basename[:-1]+'p'
endname =os.path.join(ofolder,endname)
decompressor.decompress_nsz(filepath,endname,buffer,delta,xml_gen)
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
import batchprocess
batchprocess.rebuild_nsp(filepath,ofolder,buffer,delta,xml_gen,export)
Status.close()
# ...................................................
# Direct NSP OR XCI
# ...................................................
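# Direct creation: converts one file between NSP/NSZ and XCI in a single
# pass. metapatch, RSV_cap and vkeypatch drive the firmware-requirement
# patching applied to the copied metadata; defaults below are no patching
# and an RSV cap of 268435656.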
if args.direct_creation:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.nodelta:
for input in args.nodelta:
try:
if input == "true" or input == "True" or input == "TRUE":
delta=False
elif input == "false" or input == "False" or input == "FALSE":
delta=True
else:
delta=False
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
delta=True
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.direct_creation:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder =os.path.join(dir, 'output')
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if args.fexport:
for input in args.fexport:
try:
if input == "files":
fx="files"
else:
fx="folder"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fx="files"
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
if args.direct_creation:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
for filepath in args.direct_creation:
filepath=filepath
if args.type:
for input in args.type:
if input == "xci" or input == "XCI":
export='xci'
elif input == "nsp" or input == "NSP":
export='nsp'
elif input == "both" or input == "BOTH":
export='both'
else:
print ("Wrong Type!!!")
else:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
export='nsp'
elif filepath.endswith('.xci') or filepath.endswith('.xcz'):
export='xci'
else:
print ("Wrong Type!!!")
if args.rename:
for newname in args.rename:
newname=newname+'.xxx'
endfile = os.path.join(ofolder, newname)
else:
endfile=os.path.basename(os.path.abspath(filepath))
if args.cskip=='False':
cskip=False
else:
cskip=True
if filepath.endswith(".nsp") or filepath.endswith('.nsz'):
f = Fs.Nsp(filepath)
TYPE=f.nsptype()
f.flush()
f.close()
if cskip:
if TYPE=='DLC' or TYPE=='UPDATE':
export='nsp'
if export=='nsp':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'nsp'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_nsp_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif export=='xci':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'xci'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_xci_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif export=='both':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'nsp'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_nsp_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'xci'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_xci_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith(".xci") or filepath.endswith('.xcz'):
if export=='nsp':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'nsp'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_nsp_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif export=='xci':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'xci'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
temp=f.c_xci_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
elif export=='both':
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'nsp'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_nsp_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
try:
print("Processing: " + filepath)
f = Fs.factory(filepath)
filename=endfile[:-3]+'xci'
#print(filename)
outfile = os.path.join(ofolder, filename)
#print(f.path)
f.open(filepath, 'rb')
f.c_xci_direct(buffer,outfile,ofolder,fat,fx,delta,metapatch,RSV_cap,vkeypatch)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Direct MULTI NSP OR XCI
# ...................................................
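# Direct multi: merges several inputs into one multi-content NSP/XCI.
# It first builds a content list per title, keeps only the highest
# version of each, derives an output name from the base (or update/DLC)
# title, then streams the selected NCAs straight into the new container.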
if args.direct_multi:
indent = 1
index = 0
tabs = '\t' * indent
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.romanize:
for input in args.romanize:
roman=str(input).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
if not os.path.exists(ofolder):
os.makedirs(ofolder)
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.direct_multi:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder = os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if args.fexport:
for input in args.fexport:
try:
if input == "files":
fx="files"
else:
fx="folder"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fx="files"
if args.nodelta:
for input in args.nodelta:
try:
if input == "true" or input == "True" or input == "TRUE":
delta=False
elif input == "false" or input == "False" or input == "FALSE":
delta=True
else:
delta=False
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
delta=True
if args.patchversion:
for input in args.patchversion:
try:
metapatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
metapatch = 'false'
if args.RSVcap:
for input in args.RSVcap:
try:
RSV_cap = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
RSV_cap = 268435656
if args.keypatch:
for input in args.keypatch:
try:
vkeypatch = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
vkeypatch = 'false'
export=list()
if args.type:
for input in args.type:
if input == "xci" or input == "XCI":
export.append('xci')
elif input == "nsp" or input == "NSP":
export.append('nsp')
elif input == "cnsp" or input == "CNSP":
export.append('cnsp')
else:
print ("Wrong Type!!!")
if args.direct_multi:
if args.text_file:
tfile=args.text_file
filelist=list()
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
'''
for file in filelist:
print(file)
pass
'''
prlist=list()
print ('Calculating final content:')
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
#print(filepath)
try:
c=list()
f = Fs.Nsp(filepath)
if 'nsp' in export or 'cnsp' in export:
afolder=False
if fat=="fat32" and fx=="folder":
afolder=os.path.join(ofolder,"archfolder")
if not os.path.exists(afolder):
os.makedirs(afolder)
contentlist=f.get_content(afolder,vkeypatch,delta)
else:
contentlist=f.get_content(ofolder,vkeypatch,delta)
else:
contentlist=f.get_content(False,False,delta)
# print(contentlist)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (prlist[i][1])
#print (contentlist[j][6])
#print (prlist[i][6])
#pass
if contentlist[j][1] == prlist[i][1]:
#print('true')
#print(contentlist[j][6])
#print(prlist[i][6])
if int(contentlist[j][6]) > int(prlist[i][6]):
del prlist[i]
#print(prlist[i])
prlist.append(contentlist[j])
notinlist=False
break
elif int(contentlist[j][6]) <= int(prlist[i][6]):
notinlist=False
break
else:
notinlist=True
if notinlist:
prlist.append(contentlist[j])
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
#print(filepath)
try:
c=list()
f = Fs.Xci(filepath)
if 'nsp' in export or 'cnsp' in export:
contentlist=f.get_content(ofolder,vkeypatch,delta)
else:
contentlist=f.get_content(False,False,delta)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (prlist[i][1])
#print (contentlist[j][6])
#print (prlist[i][6])
#pass
if contentlist[j][1] == prlist[i][1]:
#print('true')
#print(contentlist[j][6])
#print(prlist[i][6])
if int(contentlist[j][6]) > int(prlist[i][6]):
del prlist[i]
#print(prlist[i])
prlist.append(contentlist[j])
notinlist=False
break
elif int(contentlist[j][6]) <= int(prlist[i][6]):
notinlist=False
break
else:
notinlist=True
if notinlist:
prlist.append(contentlist[j])
except BaseException as e:
Print.error('Exception: ' + str(e))
'''
for i in range(len(prlist)):
print (prlist[i][0])
print (prlist[i][1]+' v'+prlist[i][6])
for j in prlist[i][4]:
print (j[0])
print (j[1])
print('////////////////////////////////////////////////////////////')
'''
tnamefile=False
for f in args.direct_multi:
if f == 'calculate':
#BASE
basecount=0; basename='';basever='';baseid='';basefile=''
updcount=0; updname='';updver='';updid='';updfile=''
dlccount=0; dlcname='';dlcver='';dlcid='';dlcfile=''
ccount='';bctag='';updtag='';dctag=''
for i in range(len(prlist)):
if prlist[i][5] == 'BASE':
basecount+=1
if baseid == "":
basefile=str(prlist[i][0])
baseid=str(prlist[i][1])
basever='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'UPDATE':
updcount+=1
endver=str(prlist[i][6])
if updid == "":
updfile=str(prlist[i][0])
updid=str(prlist[i][1])
updver='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'DLC':
dlccount+=1
if dlcid == "":
dlcfile=str(prlist[i][0])
dlcid=str(prlist[i][1])
dlcver='[v'+str(prlist[i][6])+']'
if basecount !=0:
bctag=str(basecount)+'G'
else:
bctag=''
if updcount !=0:
if bctag != '':
updtag='+'+str(updcount)+'U'
else:
updtag=str(updcount)+'U'
else:
updtag=''
if dlccount !=0:
if bctag != '' or updtag != '':
dctag='+'+str(dlccount)+'D'
else:
dctag=str(dlccount)+'D'
else:
dctag=''
ccount='('+bctag+updtag+dctag+')'
if baseid != "":
try:
if basefile.endswith('.xci') or basefile.endswith('.xcz') :
f = Fs.Xci(basefile)
elif basefile.endswith('.nsp') or basefile.endswith('.nsz') :
f = Fs.Nsp(basefile)
ctitl=f.get_title(baseid,roman)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
tnamefile=True
except Exception:
tnamefile=True
if tnamefile==True:
ctitl=str(os.path.basename(os.path.abspath(basefile)))
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(basefile) if char == '[']
tid2=[pos for pos, char in enumerate(basefile) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
t=basefile[i1:i2]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid3=list()
tid4=list()
tid3=[pos for pos, char in enumerate(ctitl) if char == '(']
tid4=[pos for pos, char in enumerate(ctitl) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
t=ctitl[i3:i4]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid5=list()
tid5=[pos for pos, char in enumerate(ctitl) if char == '-']
lentlist=len(tid5)
for i in range(lentlist):
i5=tid5[i]+1
ctitl=ctitl[i5:]
break
ctitl=ctitl[:-4]
if ctitl.endswith(' '):
ctitl=ctitl[:-1]
if ctitl.startswith(' '):
ctitl=ctitl[1:]
elif updid !="":
try:
if updfile.endswith('.xci') or updfile.endswith('.xcz') :
f = Fs.Xci(updfile)
elif updfile.endswith('.nsp') or updfile.endswith('.nsz') :
f = Fs.Nsp(updfile)
ctitl=f.get_title(updid,roman)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
tnamefile=True
except Exception:
tnamefile=True
if tnamefile==True:
ctitl=str(os.path.basename(os.path.abspath(updfile)))
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(updfile) if char == '[']
tid2=[pos for pos, char in enumerate(updfile) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
t=updfile[i1:i2]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid3=list()
tid4=list()
tid3=[pos for pos, char in enumerate(ctitl) if char == '(']
tid4=[pos for pos, char in enumerate(ctitl) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
t=ctitl[i3:i4]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid5=list()
tid5=[pos for pos, char in enumerate(ctitl) if char == '-']
lentlist=len(tid5)
for i in range(lentlist):
i5=tid5[i]+1
ctitl=ctitl[i5:]
break
ctitl=ctitl[:-4]
if ctitl.endswith(' '):
ctitl=ctitl[:-1]
if ctitl.startswith(' '):
ctitl=ctitl[1:]
elif dlcid !="":
try:
ctitl=str(os.path.basename(os.path.abspath(dlcfile)))
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(dlcfile) if char == '[']
tid2=[pos for pos, char in enumerate(dlcfile) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
t=dlcfile[i1:i2]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid3=list()
tid4=list()
tid3=[pos for pos, char in enumerate(ctitl) if char == '(']
tid4=[pos for pos, char in enumerate(ctitl) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
t=ctitl[i3:i4]
ctitl=ctitl.replace(t,'')
ctitl=ctitl.replace('  ',' ')
tid5=list()
tid5=[pos for pos, char in enumerate(ctitl) if char == '-']
lentlist=len(tid5)
for i in range(lentlist):
i5=tid5[i]+1
ctitl=ctitl[i5:]
break
ctitl=ctitl[:-4]
if ctitl.endswith(' '):
ctitl=ctitl[:-1]
if ctitl.startswith(' '):
ctitl=ctitl[1:]
except Exception:
if dlcfile.endswith('.xci') or dlcfile.endswith('.xcz'):
f = Fs.Xci(dlcfile)
elif dlcfile.endswith('.nsp') or dlcfile.endswith('.nsz') :
f = Fs.Nsp(dlcfile)
ctitl=f.get_title(dlcid,roman)
f.flush()
f.close()
else:
ctitl='UNKNOWN'
baseid='['+baseid.upper()+']'
updid='['+updid.upper()+']'
dlcid='['+dlcid.upper()+']'
if ccount == '(1G)' or ccount == '(1U)' or ccount == '(1D)':
ccount=''
targetnormal=list()
if baseid != "[]":
if updver != "":
endname=ctitl+' '+baseid+' '+updver+' '+ccount
targetnormal.append([baseid[1:-1],updver[2:-1]])
else:
endname=ctitl+' '+baseid+' '+basever+' '+ccount
targetnormal.append([baseid[1:-1],basever[2:-1]])
elif updid != "[]":
endname=ctitl+' '+updid+' '+updver+' '+ccount
targetnormal.append([updid[1:-1],updver[2:-1]])
else:
endname=ctitl+' '+dlcid+' '+dlcver+' '+ccount
targetnormal.append([dlcid[1:-1],dlcver[2:-1]])
#print('Filename: '+endname)
else:
endname=str(f)
endname = (re.sub(r'[\/\\\:\*\?]+', '', endname))
endname = re.sub(r'[™©®`~^´ªº¢#£€¥$ƒ±¬½¼♡«»±•²‰œæÆ³☆<<>>|]', '', endname)
endname = re.sub(r'[Ⅰ]', 'I', endname);endname = re.sub(r'[Ⅱ]', 'II', endname)
endname = re.sub(r'[Ⅲ]', 'III', endname);endname = re.sub(r'[Ⅳ]', 'IV', endname)
endname = re.sub(r'[Ⅴ]', 'V', endname);endname = re.sub(r'[Ⅵ]', 'VI', endname)
endname = re.sub(r'[Ⅶ]', 'VII', endname);endname = re.sub(r'[Ⅷ]', 'VIII', endname)
endname = re.sub(r'[Ⅸ]', 'IX', endname);endname = re.sub(r'[Ⅹ]', 'X', endname)
endname = re.sub(r'[Ⅺ]', 'XI', endname);endname = re.sub(r'[Ⅻ]', 'XII', endname)
endname = re.sub(r'[Ⅼ]', 'L', endname);endname = re.sub(r'[Ⅽ]', 'C', endname)
endname = re.sub(r'[Ⅾ]', 'D', endname);endname = re.sub(r'[Ⅿ]', 'M', endname)
endname = re.sub(r'[—]', '-', endname);endname = re.sub(r'[√]', 'Root', endname)
endname = re.sub(r'[àâá@äå]', 'a', endname);endname = re.sub(r'[ÀÂÁÄÅ]', 'A', endname)
endname = re.sub(r'[èêéë]', 'e', endname);endname = re.sub(r'[ÈÊÉË]', 'E', endname)
endname = re.sub(r'[ìîíï]', 'i', endname);endname = re.sub(r'[ÌÎÍÏ]', 'I', endname)
endname = re.sub(r'[òôóöø]', 'o', endname);endname = re.sub(r'[ÒÔÓÖØ]', 'O', endname)
endname = re.sub(r'[ùûúü]', 'u', endname);endname = re.sub(r'[ÙÛÚÜ]', 'U', endname)
endname = re.sub(r'[’]', "'", endname);endname = re.sub(r'[“”]', '"', endname)
endname = re.sub(' {3,}', ' ',endname);endname = re.sub(' {2,}', ' ',endname)
try:
endname = endname.replace("( ", "(");endname = endname.replace(" )", ")")
endname = endname.replace("[ ", "[");endname = endname.replace(" ]", "]")
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace("[]", "");endname = endname.replace("()", "")
endname = endname.replace('" ','"');endname = endname.replace(' "','"')
endname = endname.replace(" !", "!");endname = endname.replace(" ?", "?")
endname = endname.replace("  ", " ");endname = endname.replace("  ", " ")
endname = endname.replace('"', '');
endname = endname.replace(')', ') ');endname = endname.replace(']', '] ')
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace("  ", " ")
except Exception: pass
if endname[-1]==' ':
endname=endname[:-1]
if fat=="fat32" and fx=="folder":
tfname='filename.txt'
tfname = os.path.join(ofolder, tfname)
with open(tfname,"w", encoding='utf8') as tfile:
tfile.write(endname)
if 'nsp' in export:
oflist=list()
osizelist=list()
totSize=0
c=0
# print(prlist)
for i in range(len(prlist)):
for j in prlist[i][4]:
oflist.append(j[0])
osizelist.append(j[1])
totSize = totSize+j[1]
nspheader=sq_tools.gen_nsp_header(oflist,osizelist)
endname_x=endname+'.nsp'
endfile = os.path.join(ofolder, str(endname_x))
print('Filename: '+endname_x)
#print(hx(nspheader))
totSize = len(nspheader) + totSize
#print(str(totSize))
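# 4294901760 = 0xFFFF0000 (4 GiB - 64 KiB): largest NSP part written on
# FAT32; totals at or under it are written unsplit (forced "exfat" path).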
if totSize <= 4294901760:
fat="exfat"
if fat=="fat32":
splitnumb=math.ceil(totSize/4294901760)
index=0
endfile=endfile[:-1]+str(index)
if fx=="folder" and fat=="fat32":
output_folder = os.path.join(ofolder, "archfolder")
endfile = os.path.join(output_folder, "00")
if not os.path.exists(output_folder):
os.makedirs(output_folder)
elif fx=="folder" and fat=="exfat":
ext='.xml'
if os.path.exists(afolder) and os.path.isdir(afolder):
for dirpath, dirnames, filenames in os.walk(afolder):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
filename= os.path.join(afolder,filename)
shutil.move(filename,ofolder)
shutil.rmtree(afolder, ignore_errors=True)
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(endfile)
else:
v_drive = os.path.dirname(os.path.abspath(endfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
outf = open(endfile, 'w+b')
t.write(tabs+'- Writing NSP header...')
outf.write(nspheader)
t.update(len(nspheader))
c=c+len(nspheader)
outf.close()
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
try:
f = Fs.Nsp(filepath)
for file in oflist:
if not file.endswith('xml'):
endfile,index,c = f.append_content(endfile,file,buffer,t,fat,fx,c,index)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
t.close()
if 'xci' in export:
endname_x=endname+'.xci'
print('Filename: '+endname_x)
endfile = os.path.join(ofolder, endname_x)
oflist=list()
osizelist=list()
ototlist=list()
totSize=0
for i in range(len(prlist)):
for j in prlist[i][4]:
el=j[0]
if el.endswith('.nca'):
oflist.append(j[0])
#print(j[0])
totSize = totSize+j[1]
#print(j[1])
ototlist.append(j[0])
sec_hashlist=list()
GClist=list()
# print(filelist)
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
try:
f = Fs.Nsp(filepath)
for file in oflist:
sha,size,gamecard=f.file_hash(file)
if sha != False:
sec_hashlist.append(sha)
osizelist.append(size)
GClist.append([file,gamecard])
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
try:
f = Fs.Xci(filepath)
for file in oflist:
sha,size,gamecard=f.file_hash(file)
if sha != False:
sec_hashlist.append(sha)
osizelist.append(size)
GClist.append([file,gamecard])
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
# print(oflist)
# print(osizelist)
# print(sec_hashlist)
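# 4294934528 = 0xFFFF8000 (4 GiB - 32 KiB): the equivalent split
# threshold used for XCI parts.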
if totSize <= 4294934528:
fat="exfat"
if fat=="fat32":
splitnumb=math.ceil(totSize/4294934528)
index=0
endfile=endfile[:-1]+str(index)
xci_header,game_info,sig_padding,xci_certificate,root_header,upd_header,norm_header,sec_header,rootSize,upd_multiplier,norm_multiplier,sec_multiplier=sq_tools.get_xciheader(oflist,osizelist,sec_hashlist)
totSize=len(xci_header)+len(game_info)+len(sig_padding)+len(xci_certificate)+rootSize
#print(hx(xci_header))
#print(str(totSize))
c=0
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(endfile)
else:
v_drive = os.path.dirname(os.path.abspath(endfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
t.write(tabs+'- Writing XCI header...')
outf = open(endfile, 'w+b')
outf.write(xci_header)
t.update(len(xci_header))
c=c+len(xci_header)
t.write(tabs+'- Writing XCI game info...')
outf.write(game_info)
t.update(len(game_info))
c=c+len(game_info)
t.write(tabs+'- Generating padding...')
outf.write(sig_padding)
t.update(len(sig_padding))
c=c+len(sig_padding)
t.write(tabs+'- Writing XCI certificate...')
outf.write(xci_certificate)
t.update(len(xci_certificate))
c=c+len(xci_certificate)
t.write(tabs+'- Writing ROOT HFS0 header...')
outf.write(root_header)
t.update(len(root_header))
c=c+len(root_header)
t.write(tabs+'- Writing UPDATE partition header...')
t.write(tabs+' Calculated multiplier: '+str(upd_multiplier))
outf.write(upd_header)
t.update(len(upd_header))
c=c+len(upd_header)
t.write(tabs+'- Writing NORMAL partition header...')
t.write(tabs+' Calculated multiplier: '+str(norm_multiplier))
outf.write(norm_header)
t.update(len(norm_header))
c=c+len(norm_header)
t.write(tabs+'- Writing SECURE partition header...')
t.write(tabs+' Calculated multiplier: '+str(sec_multiplier))
outf.write(sec_header)
t.update(len(sec_header))
c=c+len(sec_header)
outf.close()
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
try:
GC=False
f = Fs.Nsp(filepath)
for file in oflist:
if not file.endswith('xml'):
for i in range(len(GClist)):
if GClist[i][0] == file:
GC=GClist[i][1]
endfile,index,c = f.append_clean_content(endfile,file,buffer,t,GC,vkeypatch,metapatch,RSV_cap,fat,fx,c,index,block=4294934528)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
try:
GC=False
f = Fs.Xci(filepath)
for file in oflist:
if not file.endswith('xml'):
for i in range(len(GClist)):
if GClist[i][0] == file:
GC=GClist[i][1]
endfile,index,c = f.append_clean_content(endfile,file,buffer,t,GC,vkeypatch,metapatch,RSV_cap,fat,fx,c,index,block=4294934528)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
t.close()
if 'cnsp' in export:
oflist=list()
osizelist=list()
ototlist=list()
totSize=0
c=0
for i in range(len(prlist)):
for j in prlist[i][4]:
el=j[0]
if el.endswith('.nca') or el.endswith('.xml'):
oflist.append(j[0])
#print(j[0])
osizelist.append(j[1])
totSize = totSize+j[1]
#print(j[1])
ototlist.append(j[0])
nspheader=sq_tools.gen_nsp_header(oflist,osizelist)
endname_x=endname+'[rr].nsp'
print('Filename: '+endname_x)
endfile = os.path.join(ofolder, endname_x)
#print(endfile)
#print(hx(nspheader))
totSize = len(nspheader) + totSize
if totSize <= 4294901760:
fat="exfat"
if fat=="fat32":
splitnumb=math.ceil(totSize/4294901760)
index=0
endfile=endfile[:-1]+str(index)
if fx=="folder" and fat=="fat32":
output_folder = os.path.join(ofolder, "archfolder")
endfile = os.path.join(output_folder, "00")
if not os.path.exists(output_folder):
os.makedirs(output_folder)
elif fx=="folder" and fat=="exfat":
ext='.xml'
if os.path.exists(afolder) and os.path.isdir(afolder):
for dirpath, dirnames, filenames in os.walk(afolder):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
filename= os.path.join(afolder,filename)
shutil.move(filename,ofolder)
shutil.rmtree(afolder, ignore_errors=True)
#print(str(totSize))
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(endfile)
else:
v_drive = os.path.dirname(os.path.abspath(endfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
outf = open(endfile, 'w+b')
t.write(tabs+'- Writing NSP header...')
outf.write(nspheader)
t.update(len(nspheader))
c=c+len(nspheader)
outf.close()
for filepath in filelist:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
try:
f = Fs.Nsp(filepath)
for file in oflist:
if not file.endswith('xml'):
endfile,index,c = f.append_clean_content(endfile,file,buffer,t,False,vkeypatch,metapatch,RSV_cap,fat,fx,c,index)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
try:
f = Fs.Xci(filepath)
for file in oflist:
if not file.endswith('xml'):
endfile,index,c = f.append_clean_content(endfile,file,buffer,t,False,vkeypatch,metapatch,RSV_cap,fat,fx,c,index)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
t.close()
Status.close()
# ...................................................
# Direct Splitter
# ...................................................
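# Direct splitter: regroups a multi-content file's NCAs by title id and
# writes each group as its own NSP/XCI (see Fs.Nsp/Fs.Xci.sp_groupncabyid).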
if args.direct_splitter:
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.direct_splitter:
dir=os.path.dirname(os.path.abspath(filepath))
ofolder = os.path.join(dir, 'output')
if args.fat:
for input in args.fat:
try:
if input == "fat32":
fat="fat32"
else:
fat="exfat"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fat="exfat"
if args.fexport:
for input in args.fexport:
try:
if input == "files":
fx="files"
else:
fx="folder"
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
fx="files"
if args.direct_splitter:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
for filepath in args.direct_splitter:
filepath=filepath
try:
if str(args.nodecompress).lower() == "true":
nodecompress=True
else:
nodecompress=False
except Exception:
nodecompress=False
if nodecompress:
fat="exfat"
if args.type:
for input in args.type:
if input == "xci" or input == "XCI":
export='xci'
elif input == "nsp" or input == "NSP":
export='nsp'
elif input == "both" or input == "BOTH":
export='both'
else:
print ("Wrong Type!!!")
else:
if filepath.endswith('.nsp') or filepath.endswith('.nsz'):
export='nsp'
elif filepath.endswith('.xci') or filepath.endswith('.xcz'):
export='xci'
else:
print ("Wrong Type!!!")
if args.rename:
for newname in args.rename:
newname=newname+'.xxx'
endfile = os.path.join(ofolder, newname)
else:
endfile=os.path.basename(os.path.abspath(filepath))
if args.cskip=='False':
cskip=False
else:
cskip=True
if filepath.endswith(".nsp") or filepath.endswith('.nsz'):
try:
f = Fs.Nsp(filepath)
f.sp_groupncabyid(buffer,ofolder,fat,fx,export,nodecompress)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith(".xci") or filepath.endswith('.xcz'):
try:
f = Fs.Xci(filepath)
f.sp_groupncabyid(buffer,ofolder,fat,fx,export,nodecompress)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Archive to nsp
# ...................................................
if sys.platform == 'win32':
if args.archive and args.ifolder:
indent = 1
tabs = '\t' * indent
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as tname:
name = tname.readline()
name=name+'.nsp'
endfolder=args.archive
endfolder = os.path.join(endfolder, name)
else:
endfolder=args.archive
try:
ruta = args.ifolder
if not os.path.exists(endfolder):
os.makedirs(endfolder)
#print (ruta)
#print (os.path.isdir(ruta))
print (tabs+"Archiving to output folder...")
if os.path.isdir(ruta) == True:
for dirpath, dnames, fnames in os.walk(ruta):
#print (fnames)
for f in fnames:
filepath = os.path.join(ruta, f)
#print (f)
#win32api.SetFileAttributes(filepath,win32con.FILE_ATTRIBUTE_NORMAL)
shutil.move(filepath,endfolder)
win32api.SetFileAttributes(endfolder,win32con.FILE_ATTRIBUTE_ARCHIVE)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Join split files
# ...................................................
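# Joins split dumps back into one file. Accepted inputs: <name>.xc0
# (-> .xci), <name>.ns0 (-> .nsp), or an archive-style part named '00'
# (-> output.nsp); sibling parts are matched by basename and ascending
# trailing digit before being concatenated.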
if args.joinfile:
indent = 1
tabs = '\t' * indent
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
filepath = args.joinfile
dir=os.path.dirname(os.path.abspath(filepath))
ofolder = os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.buffer:
for input in args.buffer:
try:
buffer = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filepath = filelist.readline()
filepath=os.path.abspath(filepath.rstrip('\n'))
else:
for filepath in args.joinfile:
filepath=filepath
print(filepath)
file_list=list()
try:
bname=os.path.basename(os.path.abspath(filepath))
bn=''
if bname != '00':
bn=bname[:-4]
if filepath.endswith(".xc0"):
outname = bn+".xci"
ender=".xc"
elif filepath.endswith(".ns0"):
outname = bn+".nsp"
ender=".ns"
elif filepath[-2:]=="00":
outname = "output.nsp"
ender="0"
else:
print ("Not valid file")
outfile = os.path.join(ofolder, outname)
#print (outfile)
ruta=os.path.dirname(os.path.abspath(filepath))
#print(ruta)
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
check=f[-4:-1]
#print(check)
#print(ender)
#print(bname[:-1])
#print(f[:-1])
if check==ender and bname[:-1]==f[:-1]:
try:
int(f[-1])  # only accept parts whose names end in a digit
fp = os.path.join(ruta, f)
file_list.append(fp)
except Exception: continue
file_list.sort()
#print(file_list)
except BaseException as e:
Print.error('Exception: ' + str(e))
totSize = sum(os.path.getsize(file) for file in file_list)
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(outfile)
else:
v_drive = os.path.dirname(os.path.abspath(outfile))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(totSize):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=totSize, unit='B', unit_scale=True, leave=False)
t.write(tabs+'- Joining files...')
index=0
outf = open(outfile, 'wb')
#print(file_list)
for file in file_list:
t.write(tabs+'- Appending: '+ file)
partfile=file[:-1]+str(index)
with open(partfile, 'rb') as inf:
while True:
data = inf.read(int(buffer))
outf.write(data)
t.update(len(data))
outf.flush()
if not data:
break
index+=1
t.close()
outf.close()
Status.close()
# ...................................................
# ZIP
# ...................................................
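# Packs every file found in args.ifolder into a single zip, storing each
# entry under its bare filename (no directory prefix).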
if args.zippy and args.ifolder:
indent = 1
tabs = '\t' * indent
try:
outfile=args.zippy
ruta = args.ifolder
endfolder=os.path.dirname(os.path.abspath(outfile))
if not os.path.exists(endfolder):
os.makedirs(endfolder)
print (tabs+"Packing zip...")
if os.path.isdir(ruta):
for dirpath, dnames, fnames in os.walk(ruta):
for f in fnames:
filepath = os.path.join(ruta, f)
with ZipFile(outfile, 'a') as zippy:
fp = os.path.join(ruta, f)
zippy.write(fp,f)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# INFORMATION
# ...................................................
# Show file list
# ...................................................
if args.filelist:
for filename in args.filelist:
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'rb')
f.print_file_list()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.print_file_list()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Show advanced file list
# ...................................................
if args.ADVfilelist:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.ADVfilelist:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.ADVfilelist:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-Fcontent.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
feed=f.adv_file_list()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
feed=f.adv_file_list()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Show advanced content list
# ...................................................
if args.ADVcontentlist:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.ADVcontentlist:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.ADVcontentlist:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'_ID_content.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
feed=f.adv_content_list()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
feed=f.adv_content_list()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# FW REQ INFO
# ...................................................
if args.fw_req:
if args.translate:
if str(args.translate).lower()=="true":
trans=True
else:
trans=False
if args.romanize:
for val_ in args.romanize:
roman=str(val_).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.fw_req:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.fw_req:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-fwinfo.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
feed=f.print_fw_req(trans,roma=roman)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
feed=f.print_fw_req(trans,roma=roman)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file?')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# XCI HEADER
# ...................................................
if args.Read_xci_head:
for filename in args.Read_xci_head:
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.print_head()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# ADD CONTENT TO DATABASE
# ...................................................
if args.addtodb:
if args.romanize:
for input in args.romanize:
roman=str(input).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.db_file:
outfile=args.db_file
dir=os.path.dirname(os.path.abspath(outfile))
err='errorlog.txt'
errfile = os.path.join(dir, err)
else:
for filename in args.addtodb:
dir=os.path.dirname(os.path.abspath(filename))
ofolder = os.path.join(dir, 'output')
outname='nutdb.txt'
outfile = os.path.join(ofolder, outname)
err='errorlog.txt'
errfile = os.path.join(ofolder, err)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.dbformat:
for val_ in args.dbformat:
val_=str(val_).lower()
if val_ in ("nutdb", "keyless", "simple", "extended"):
outdb = val_
else:
outdb = "all"
else:
outdb = "extended"
if args.addtodb:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.addtodb:
filename=filename
if (filename.lower()).endswith('.nsp') or (filename.lower()).endswith('.nsx') or (filename.lower()).endswith('.nsz'):
try:
infile=r''
infile+=filename
f = Fs.Nsp(filename, 'rb')
f.addtodb(outfile,outdb,roman)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
with open(errfile, 'a') as errfile:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
errfile.write(date+' Error in "ADD TO DATABASE" function:'+'\n')
errfile.write("Route "+str(filename)+'\n')
errfile.write('- Exception: ' + str(e)+ '\n')
if (filename.lower()).endswith('.xci') or (filename.lower()).endswith('.xcz'):
try:
infile=r''
infile+=filename
f = Fs.factory(filename)
f.open(filename, 'rb')
f.addtodb(outfile,outdb,roman)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
with open(errfile, 'a') as errfile:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
errfile.write(date+' Error in "ADD TO DATABASE" function:'+'\n')
errfile.write("Route "+str(filename)+'\n')
errfile.write('- Exception: ' + str(e)+ '\n')
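# The timestamped error-log block above is duplicated for nsp and xci
# inputs. A minimal sketch of a shared logger (hypothetical helper; it uses
# the same datetime calls as the inline blocks):
def log_db_error(errpath, filename, e):
    from datetime import datetime
    date = datetime.now().strftime("%x") + ". " + datetime.now().strftime("%X")
    with open(errpath, 'a') as ef:
        ef.write(date + ' Error in "ADD TO DATABASE" function:' + '\n')
        ef.write("Route " + str(filename) + '\n')
        ef.write('- Exception: ' + str(e) + '\n')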
#parser.add_argument('-nscdb_new', '--addtodb_new', nargs='+', help='Adds content to database')
if args.addtodb_new:
if args.translate:
if str(args.translate).lower()=="true":
trans=True
else:
trans=False
if args.db_file:
DBfile=args.db_file
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.addtodb_new:
filename=filename
if (filename.lower()).endswith('.nsp') or (filename.lower()).endswith('.nsx') or (filename.lower()).endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
f.Incorporate_to_permaDB(DBfile,trans)
except BaseException as e:
Print.error('Exception: ' + str(e))
if (filename.lower()).endswith('.xci') or (filename.lower()).endswith('.xcz'):
try:
f = Fs.Xci(filename)
f.Incorporate_to_permaDB(DBfile,trans)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Show info
# ...................................................
if args.info:
#print(str(len(args.info))) # leftover debug output
if re.search(r'^[A-Fa-f0-9]+$', args.info.strip(), re.I | re.M | re.S):
Print.info('%s version = %s' % (args.info.upper(), CDNSP.get_version(args.info.lower())))
else:
f = Fs.factory(args.info)
f.open(args.info, 'r+b')
f.printInfo()
'''
for i in f.cnmt():
for j in i:
Print.info(j._path)
j.rewind()
buf = j.read()
Hex.dump(buf)
j.seek(0x28)
#j.writeInt64(0)
Print.info('min: ' + str(j.readInt64()))
#f.flush()
#f.close()
'''
Status.close()
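# The --info branch dispatches on its argument: a pure hex string is treated
# as a titleid for the CDNSP version lookup, anything else as a file path.
# A standalone illustration of that test (same regex as above; hypothetical
# helper name):
def looks_like_titleid(arg):
    import re
    # 16-digit ids such as '0123456789ABCDEF' match; paths and filenames
    # containing dots, slashes or non-hex letters do not.
    return re.search(r'^[A-Fa-f0-9]+$', arg.strip()) is not None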
# ...................................................
# Read nacp inside nsp or xci
# ...................................................
if args.Read_nacp:
if args.romanize:
for val_ in args.romanize:
roman=str(val_).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.Read_nacp:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.Read_nacp:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-nacp.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
feed=f.read_nacp(roma=roman)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
feed=f.read_nacp(roma=roman)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'rb')
if str(f.header.contentType) == 'Content.CONTROL':
feed=f.read_nacp()
f.flush()
f.close()
else:
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename.lower()
feed=''
message=basename+' is not a TYPE CONTROL NCA'
print(message)
feed+=message+'\n'
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Read icon inside nsp or xci
# ...................................................
if args.Read_icon:
for filename in args.Read_icon:
filename=filename
if filename.endswith('.nsp') or filename.endswith('.nsx'):
try:
files_list=sq_tools.ret_nsp_offsets(filename)
f = Fs.Nsp(filename, 'rb')
f.icon_info(files_list)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
files_list=sq_tools.ret_xci_offsets(filename)
f = Fs.Xci(filename)
f.icon_info(files_list)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ......................................................................
# Raw extraction. For cases when a file is bad and triggers an exception
# ......................................................................
if args.raw_extraction:
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
ofolder=False
if args.ofolder:
for val_ in args.ofolder:
try:
ofolder = val_
except BaseException as e:
Print.error('Exception: ' + str(e))
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
else:
for filename in args.raw_extraction:
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
test=filename.lower()
if test.endswith('.nsp') or test.endswith('.nsx') or test.endswith('.nsz'):
try:
files_list=sq_tools.ret_nsp_offsets(filename,32)
for i in range(len(files_list)):
#print(files_list[i][0])
#print(files_list[i][1])
#print(files_list[i][2])
off1=files_list[i][1]
off2=files_list[i][2]
filepath = os.path.join(ofolder, files_list[i][0])
fp = open(filepath, 'w+b')
s=files_list[i][3]
if int(buffer)>s:
buf=s
else:
buf=buffer
#print(filepath)
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(filepath)
else:
v_drive = os.path.dirname(os.path.abspath(filepath))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(s):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=s, unit='B', unit_scale=True, leave=False)
with open(filename, 'r+b') as f:
f.seek(off1)
c=0
t.write(tabs+'Copying: ' + str(files_list[i][0]))
for data in iter(lambda: f.read(int(buf)), b''): # binary reads return b'' at EOF, so the sentinel must be bytes, not ''
fp.write(data)
fp.flush()
c=len(data)+c
t.update(len(data))
if c+int(buf)>s:
if (s-c)<0:
t.close()
fp.close()
break
data=f.read(s-c)
fp.write(data)
t.update(len(data))
t.close()
fp.close()
break
if not data:
t.close()
fp.close()
break
except BaseException as e:
Print.error('Exception: ' + str(e))
elif test.endswith('.xci') or test.endswith('.xcz'):
try:
files_list=sq_tools.ret_xci_offsets(filename,32)
#print(files_list)
for i in range(len(files_list)):
#print(files_list[i][0])
#print(files_list[i][1])
#print(files_list[i][2])
off1=files_list[i][1]
off2=files_list[i][2]
filepath = os.path.join(ofolder, files_list[i][0])
fp = open(filepath, 'w+b')
s=files_list[i][3]
if int(buffer)>s:
buf=s
else:
buf=buffer
#print(filepath)
if sys.platform == 'win32':
v_drive, v_path = os.path.splitdrive(filepath)
else:
v_drive = os.path.dirname(os.path.abspath(filepath))
dsktotal, dskused, dskfree=disk_usage(str(v_drive))
if int(dskfree)<int(s):
sys.exit("Warning disk space lower than required size. Program will exit")
t = tqdm(total=s, unit='B', unit_scale=True, leave=False)
with open(filename, 'r+b') as f:
f.seek(off1)
c=0
t.write(tabs+'Copying: ' + str(files_list[i][0]))
for data in iter(lambda: f.read(int(buf)), b''): # binary reads return b'' at EOF, so the sentinel must be bytes, not ''
fp.write(data)
fp.flush()
c=len(data)+c
t.update(len(data))
if c+int(buf)>s:
if (s-c)<0:
t.close()
fp.close()
break
data=f.read(s-c)
fp.write(data)
t.update(len(data))
t.close()
fp.close()
break
if not data:
t.close()
fp.close()
break
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
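# A minimal sketch of the chunked region copy performed above, clamping the
# final read instead of patching the tail after the fact (hypothetical
# helper, assuming the (name, start, end, size) tuples that
# sq_tools.ret_nsp_offsets / ret_xci_offsets appear to return):
def copy_region(src_path, dst_path, offset, size, buf=65536):
    with open(src_path, 'rb') as src, open(dst_path, 'wb') as dst:
        src.seek(offset)
        remaining = size
        while remaining > 0:
            data = src.read(min(int(buf), remaining))  # clamp the last chunk
            if not data:
                break  # unexpected EOF in a truncated or bad file
            dst.write(data)
            remaining -= len(data)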
# ..........................................................................
# NCA_FILE_EXTRACTION. EXTRACT FILES PACKED IN NCA FROM NSP\XCI\NCA
# ..........................................................................
if args.nca_file_extraction:
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
ofolder=False
if args.ofolder:
for val_ in args.ofolder:
try:
ofolder = val_
except BaseException as e:
Print.error('Exception: ' + str(e))
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
else:
for filename in args.nca_file_extraction:
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if filename.endswith('.nsp'):
try:
files_list=sq_tools.ret_nsp_offsets(filename)
f = Fs.Nsp(filename, 'rb')
f.extract_nca(ofolder,files_list,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
files_list=sq_tools.ret_xci_offsets(filename)
f = Fs.Xci(filename)
f.extract_nca(ofolder,files_list,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
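# The raw-extraction branch above guards each copy by comparing free disk
# space against the target size (splitdrive on Windows, dirname elsewhere).
# A minimal cross-platform sketch of that guard using the standard library
# (hypothetical helper; the script's own disk_usage import may differ):
def enough_space(target_path, required_bytes):
    import os, shutil
    drive = os.path.dirname(os.path.abspath(target_path))
    total, used, free = shutil.disk_usage(drive)
    return free >= required_bytes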
# ...........................................................................
# NCA_2_PLAINTEXT. EXTRACT OR CONVERT NCA FILES TO PLAINTEXT FROM NSP\XCI\NCA
# ...........................................................................
if args.extract_plain_nca:
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
ofolder=False
if args.ofolder:
for val_ in args.ofolder:
try:
ofolder = val_
except BaseException as e:
Print.error('Exception: ' + str(e))
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
else:
for filename in args.extract_plain_nca:
if ofolder != False:
dir=ofolder
else:
dir=os.path.dirname(os.path.abspath(filename))
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename[:-4]
ofolder =os.path.join(dir, basename)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if filename.endswith('.nsp'):
try:
files_list=sq_tools.ret_nsp_offsets(filename)
f = Fs.Nsp(filename, 'rb')
f.copy_as_plaintext(ofolder,files_list,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
files_list=sq_tools.ret_xci_offsets(filename)
#print(files_list)
f = Fs.Xci(filename)
f.copy_as_plaintext(ofolder,files_list,buffer)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...........................................................................
# Read npdm from inside nsp or xci
# ...........................................................................
if args.Read_npdm:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.Read_npdm:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.Read_npdm:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-npdm.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith(".nsp"):
try:
files_list=sq_tools.ret_nsp_offsets(filename)
f = Fs.Nsp(filename, 'rb')
feed=f.read_npdm(files_list)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith(".xci"):
try:
files_list=sq_tools.ret_xci_offsets(filename)
f = Fs.Xci(filename)
feed=f.read_npdm(files_list)
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Read cnmt inside nsp or xci
# ...................................................
if args.Read_cnmt:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.Read_cnmt:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.Read_cnmt:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-meta.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx') or filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
feed=f.read_cnmt()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci') or filename.endswith('.xcz'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
feed=f.read_cnmt()
f.flush()
f.close()
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'rb')
if str(f.header.contentType) == 'Content.META':
feed=f.read_cnmt()
f.flush()
f.close()
else:
basename=str(os.path.basename(os.path.abspath(filename)))
basename=basename.lower()
feed=''
message=basename+' is not a TYPE META NCA'
print(message)
feed+=message+'\n'
if not args.text_file:
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Change Required System Version in an nca file
# ...................................................
if args.patchversion:
for val_ in args.patchversion:
try:
number = int(val_)
break
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
number = 336592896
if args.set_cnmt_RSV:
for filename in args.set_cnmt_RSV:
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'r+b')
f.write_req_system(number)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha=f.calc_pfs0_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.set_pfs0_hash(sha)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha2=f.calc_htable_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_htable_hash(sha2)
f.flush()
f.close()
########################
f = Fs.Nca(filename, 'r+b')
sha3=f.header.calculate_hblock_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_hblock_hash(sha3)
f.flush()
f.close()
########################
with open(filename, 'r+b') as file:
nsha=sha256(file.read()).hexdigest()
newname=nsha[:32] + '.cnmt.nca'
Print.info('New name: ' + newname )
dir=os.path.dirname(os.path.abspath(filename))
newpath =os.path.join(dir, newname)
os.rename(filename, newpath)
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.nsp'):
try:
f = Fs.Nsp(filename, 'r+b')
f.metapatcher(number)
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
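# Every cnmt.nca patch in this script repeats the same repair sequence:
# rewrite the payload, refresh the pfs0 hash, the hash table hash and the
# header block hash, then rename the file to the first 32 hex digits of its
# sha256. A minimal sketch of that sequence as one function (hypothetical
# helper built from the Fs.Nca calls already used above):
def rehash_cnmt_nca(filename):
    from hashlib import sha256
    import os
    f = Fs.Nca(filename, 'r+b'); sha = f.calc_pfs0_hash(); f.flush(); f.close()
    f = Fs.Nca(filename, 'r+b'); f.set_pfs0_hash(sha); f.flush(); f.close()
    f = Fs.Nca(filename, 'r+b'); sha2 = f.calc_htable_hash(); f.flush(); f.close()
    f = Fs.Nca(filename, 'r+b'); f.header.set_htable_hash(sha2); f.flush(); f.close()
    f = Fs.Nca(filename, 'r+b'); sha3 = f.header.calculate_hblock_hash(); f.flush(); f.close()
    f = Fs.Nca(filename, 'r+b'); f.header.set_hblock_hash(sha3); f.flush(); f.close()
    with open(filename, 'rb') as file:
        nsha = sha256(file.read()).hexdigest()
    newpath = os.path.join(os.path.dirname(os.path.abspath(filename)), nsha[:32] + '.cnmt.nca')
    os.rename(filename, newpath)
    return newpath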
#parser.add_argument('--set_cnmt_titleid', nargs='+', help='Changes cnmt.nca titleid')
if args.set_cnmt_titleid:
filename=args.set_cnmt_titleid[0]
value=args.set_cnmt_titleid[1]
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'r+b')
f.header.setTitleID(value)
print(hx(f.header.getTitleID()))
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
f.write_cnmt_titleid(value)
f.write_cnmt_updid(value[:-3]+'800') # update titleid = base titleid with the low 3 hex digits set to 800 (the old slice produced a 15-char id)
#print(hx(f.get_cnmt_titleid()))
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
#f.write_cnmt_titleid(value)
print(hx(f.get_cnmt_titleid()))
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha=f.calc_pfs0_hash()
Print.info(tabs + '- Calculated hash from pfs0: ')
Print.info(tabs +' + '+ str(hx(sha)))
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.set_pfs0_hash(sha)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha2=f.calc_htable_hash()
Print.info(tabs + '- Calculated hash table hash: ')
Print.info(tabs +' + '+ str(hx(sha2)))
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_htable_hash(sha2)
f.flush()
f.close()
########################
f = Fs.Nca(filename, 'r+b')
sha3=f.header.calculate_hblock_hash()
Print.info(tabs + '- Calculated header block hash: ')
Print.info(tabs +' + '+ str(hx(sha3)))
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_hblock_hash(sha3)
f.flush()
f.close()
########################
with open(filename, 'r+b') as file:
nsha=sha256(file.read()).hexdigest()
newname=nsha[:32] + '.cnmt.nca'
Print.info('New name: ' + newname )
dir=os.path.dirname(os.path.abspath(filename))
newpath =os.path.join(dir, newname)
os.rename(filename, newpath)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Change version number from nca
# ...................................................
if args.set_cnmt_version:
if args.patchversion:
for val_ in args.patchversion:
try:
number = val_
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
number = 65536
for filename in args.set_cnmt_version:
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'r+b')
f.write_version(number)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha=f.calc_pfs0_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.set_pfs0_hash(sha)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha2=f.calc_htable_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_htable_hash(sha2)
f.flush()
f.close()
########################
f = Fs.Nca(filename, 'r+b')
sha3=f.header.calculate_hblock_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_hblock_hash(sha3)
f.flush()
f.close()
########################
with open(filename, 'r+b') as file:
nsha=sha256(file.read()).hexdigest()
newname=nsha[:32] + '.cnmt.nca'
Print.info('New name: ' + newname )
dir=os.path.dirname(os.path.abspath(filename))
newpath =os.path.join(dir, newname)
os.rename(filename, newpath)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ..................
# Read hfs0
# ..................
if args.Read_hfs0:
for filename in args.Read_hfs0:
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
f.readhfs0()
#f.printInfo()
f.flush()
f.close()
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Update hashes in cnmt file
# ...................................................
if args.update_hash:
for filename in args.update_hash:
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'r+b')
pfs0_size,block_size,multiplier=f.get_pfs0_hash_data()
Print.info('block size in bytes: ' + str(hx(block_size.to_bytes(8, byteorder='big'))))
Print.info('Pfs0 size: ' + str(hx(pfs0_size.to_bytes(8, byteorder='big'))))
Print.info('Multiplier: ' + str(multiplier))
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha=f.calc_pfs0_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.set_pfs0_hash(sha)
f.flush()
f.close()
############################
f = Fs.Nca(filename, 'r+b')
sha2=f.calc_htable_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_htable_hash(sha2)
f.flush()
f.close()
########################
f = Fs.Nca(filename, 'r+b')
sha3=f.header.calculate_hblock_hash()
f.flush()
f.close()
f = Fs.Nca(filename, 'r+b')
f.header.set_hblock_hash(sha3)
f.flush()
f.close()
########################
with open(filename, 'r+b') as file:
nsha=sha256(file.read()).hexdigest()
newname=nsha[:32] + '.cnmt.nca'
Print.info('New name: ' + newname )
dir=os.path.dirname(os.path.abspath(filename))
newpath =os.path.join(dir, newname)
os.rename(filename, newpath)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# LISTMANAGER
# ..................
# Generate cnmt.xml
# ..................
if args.xml_gen:
if args.ofolder:
for val_ in args.ofolder:
try:
ofolder = val_
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.xml_gen:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
for filename in args.xml_gen:
if filename.endswith('.nca'):
try:
with open(filename, 'r+b') as file:
nsha=sha256(file.read()).hexdigest()
f = Fs.Nca(filename, 'r+b')
f.xml_gen(ofolder,nsha)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
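# xml_gen (and the rename steps above) hash a file with a single
# file.read(), which pulls multi-gigabyte NCAs fully into memory. A minimal
# sketch of an equivalent streaming digest (hypothetical helper; standard
# hashlib API):
def sha256_of_file(path, buf=1024 * 1024):
    from hashlib import sha256
    h = sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(buf), b''):
            h.update(chunk)
    return h.hexdigest()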
# ...................................................
# Change line in text file
# ...................................................
if args.change_line:
if args.line_number:
try:
line_number = int(args.line_number)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.new_line:
try:
new_line = str(args.new_line)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.change_line:
try:
config_file=os.path.abspath(str(args.change_line))
lines = open(str(config_file)).read().splitlines()
lines[line_number] = str(new_line)
open(str(config_file),'w').write('\n'.join(lines))
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Read line in text file
# ...................................................
if args.read_line:
if args.new_line:
try:
write_line = str(args.new_line)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.line_number:
try:
line_number = int(args.line_number)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.read_line:
try:
indent = 4
tabs = '\t' * indent
config_file=os.path.abspath(str(args.read_line))
lines = open(str(config_file)).read().splitlines()
line2read= str(lines[line_number])
Print.info(write_line + line2read)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Strip line in text file
# ...................................................
#parser.add_argument('-stripl', '--strip_lines', nargs='+', help='Strips lines from a text file')
if args.strip_lines:
if args.strip_lines[0]:
textfile=args.strip_lines[0]
try:
if args.strip_lines[1]:
number=args.strip_lines[1]
else:
number=1
except:
number=1
try:
if args.strip_lines[2]:
uinput=args.strip_lines[2]
if str(uinput).upper() == 'TRUE':
counter=True
else:
counter=False
except:
counter=False
try:
listmanager.striplines(textfile,number,counter)
except BaseException as e:
Print.error('Exception: ' + str(e))
#parser.add_argument('-showcline', '--show_current_line', nargs='+', help='Shows current line')
if args.show_current_line:
if args.show_current_line[0]:
textfile=args.show_current_line[0]
try:
number=args.show_current_line[1]
except:
number=1
try:
listmanager.printcurrent(textfile,number)
except BaseException as e:
Print.error('Exception: ' + str(e))
#parser.add_argument('-countlines', '--count_n_lines', nargs='+', help='Count the number of lines')
if args.count_n_lines:
if args.count_n_lines[0]:
textfile=args.count_n_lines[0]
try:
c=listmanager.counter(textfile)
print('STILL '+str(c)+' FILES TO PROCESS')
except BaseException as e:
Print.error('Exception: ' + str(e))
#parser.add_argument('-dff', '--delete_item', nargs='+', help='Deletes a os item listed in text file, a file or a folder')
if args.delete_item:
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
ruta = filelist.readline()
ruta=os.path.abspath(ruta.rstrip('\n'))
ruta = os.path.abspath(ruta)
try:
os.remove(ruta)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
else:
ruta = os.path.abspath(args.delete_item[0])
if os.path.isdir(ruta):
try:
shutil.rmtree(ruta, ignore_errors=True)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
elif os.path.isfile(ruta):
try:
os.remove(ruta)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
else:
print('Input is not a system file or folder')
Status.close()
# ...................................................
# Generate list of files
# ...................................................
#parser.add_argument('-tid', '--titleid', nargs='+', help='Filter with titleid')
#parser.add_argument('-bid', '--baseid', nargs='+', help='Filter with base titleid')
if args.findfile:
raised_error=False
if args.findfile == 'uinput':
ruta=input("PLEASE DRAG A FILE OR FOLDER OVER THE WINDOW AND PRESS ENTER: ")
if '&' in ruta:
varout='999'
elif len(ruta)<2:
varout=ruta
else:
varout='999'
if args.userinput:
userfile=args.userinput
else:
userfile='uinput'
with open(userfile,"w", encoding='utf8') as userinput:
userinput.write(varout)
else:
ruta=args.findfile
try:
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
except:
raised_error=True
if not os.path.exists(ruta):
raised_error=True
if raised_error==False:
extlist=list()
if args.type:
for t in args.type:
if t=="all":
extlist.append('all')
continue
x='.'+t
extlist.append(x)
if x[-1]=='*':
x=x[:-1]
extlist.append(x)
elif x==".00":
extlist.append('00')
#print(extlist)
if args.filter:
for f in args.filter:
filter=f
filelist=list()
try:
fname=""
binbin='RECYCLE.BIN'
if not 'all' in extlist:
for ext in extlist:
#print (ext)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
else:
# print(ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
filename = ruta
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
if args.text_file:
tfile=args.text_file
with open(tfile,"a", encoding='utf8') as tfile:
for line in filelist:
try:
tfile.write(line+"\n")
except:
continue
else:
for line in filelist:
try:
print (line)
except:
continue
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.nint_keys:
try:
sq_tools.verify_nkeys(args.nint_keys)
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
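# findfile, cleantags and renamef all rebuild the same os.walk scan:
# case-insensitive extension match, optional substring filter, and a skip
# for RECYCLE.BIN entries. A minimal sketch of that scan as one function
# (hypothetical helper; 'exts' mirrors the extlist built above):
def collect_files(ruta, exts, name_filter=None):
    import os
    out = []
    for dirpath, dirnames, filenames in os.walk(ruta):
        if 'recycle.bin' in dirpath.lower():
            continue
        for fn in filenames:
            low = fn.lower()
            if 'all' not in exts and not any(low.endswith(e.lower()) for e in exts):
                continue
            if name_filter and name_filter.lower() not in low:
                continue
            out.append(os.path.join(dirpath, fn))
    return out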
# ...................................................
# Clean tags in filenames
# ...................................................
#parser.add_argument('-tgtype','--tagtype', help="Type of tag to remove")
if args.cleantags:
if args.tagtype:
if args.tagtype=="[]":
tagtype='brackets'
elif args.tagtype=="()":
tagtype='parenthesis'
elif args.tagtype=="(":
tagtype='('
elif args.tagtype=="[":
tagtype='['
else:
tagtype=False
else:
tagtype=False
if args.text_file and args.cleantags == 'single':
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
ruta = filelist.readline()
ruta=os.path.abspath(ruta.rstrip('\n'))
ruta = os.path.abspath(ruta)
else:
ruta=args.cleantags
#print(ruta)
indent = 1
tabs = '\t' * indent
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
if args.type:
for t in args.type:
x='.'+t
extlist.append(x)
if x[-1]=='*':
x=x[:-1]
extlist.append(x)
#print(extlist)
if args.filter:
for f in args.filter:
filter=f
filelist=list()
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
'''
for f in filelist:
print(f)
'''
print('Items to process: '+str(len(filelist)))
counter=len(filelist)
for filepath in filelist:
basename=str(os.path.basename(os.path.abspath(filepath)))
endname=basename
dir=os.path.dirname(os.path.abspath(filepath))
print('Filename: '+basename)
if tagtype==False or tagtype=='brackets':
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
t=filepath[i1:i2]
endname=endname.replace(t,'')
endname=endname.replace('  ',' ')
if tagtype=='[':
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
endname=filepath[:i1]+filepath[-4:]
break
if tagtype=='(':
tid3=list()
tid4=list()
tid3=[pos for pos, char in enumerate(endname) if char == '(']
tid4=[pos for pos, char in enumerate(endname) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
endname=endname[:i3]+endname[-4:] # i3/i4 were computed on endname, so slice endname, not filepath
break
if tagtype==False or tagtype=='parenthesis':
tid3=list()
tid4=list()
tid3=[pos for pos, char in enumerate(endname) if char == '(']
tid4=[pos for pos, char in enumerate(endname) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
t=endname[i3:i4]
#print('t is '+t)
endname=endname.replace(t,'')
endname=endname.replace('  ',' ')
endname=endname.replace(' .','.')
dir=os.path.dirname(os.path.abspath(filepath))
newpath=os.path.join(dir,endname)
print('New name: '+endname)
if os.path.exists(newpath) and newpath != filepath:
if filepath.endswith('.xci'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.xci'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsp'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsp'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.xcz'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.xcz'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsx'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsx'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsz'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsz'
newpath=os.path.join(dir,endname)
try:
os.rename(filepath, newpath)
print(tabs+'> File was renamed to: '+endname)
except BaseException as e:
pass
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
Status.close()
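# The cleantags branch above strips '[...]' and '(...)' tags by pairing
# bracket positions manually. A minimal regex sketch of the same removal
# (hypothetical helper; assumes tags never nest, which the index-pairing
# code above assumes as well):
def strip_tags(name):
    import re
    name = re.sub(r'\[[^\]]*\]', '', name)   # drop [titleid], [v65536], ...
    name = re.sub(r'\([^)]*\)', '', name)    # drop (1G+1U), (region), ...
    return re.sub(r' {2,}', ' ', name).strip()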
# ...................................................
# Rename file with proper name
# ...................................................
#parser.add_argument('-oaid','--onlyaddid', help='Rename file with proper name')
#parser.add_argument('-renm','--renmode', help='Rename mode (force,skip_corr_tid,skip_if_tid)')
#parser.add_argument('-addl','--addlangue', help='Add language string')
#parser.add_argument('-nover','--noversion', help="Don't add version (false,true,xci_no_v0)")
#parser.add_argument('-dlcrn','--dlcrname', help="If false keeps base name in dlcs")
if args.renamef:
import nutdb
languetag=''
if args.romanize:
for val_ in args.romanize:
roman=str(val_).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.onlyaddid:
if args.onlyaddid=="true" or args.onlyaddid == "True" or args.onlyaddid == "TRUE":
onaddid=True
elif args.onlyaddid=="idtag":
onaddid='idtag'
else:
onaddid=False
else:
onaddid=False
if args.addlangue:
if args.addlangue=="true" or args.addlangue == "True" or args.addlangue == "TRUE":
addlangue=True
else:
addlangue=False
else:
addlangue=False
if args.renmode:
if args.renmode=="skip_if_tid":
renmode="skip_if_tid"
elif args.renmode=="force":
renmode="force"
else:
renmode="skip_corr_tid"
else:
renmode="skip_corr_tid"
if args.noversion:
if args.noversion=="true" or args.noversion == "True" or args.noversion == "TRUE":
nover=True
elif args.noversion=="xci_no_v0":
nover="xci_no_v0"
else:
nover=False
else:
nover=False
if args.dlcrname:
if args.dlcrname=="true" or args.dlcrname == "True" or args.dlcrname == "TRUE":
dlcrname=True
elif args.dlcrname=="tag" or args.dlcrname == "Tag" or args.dlcrname == "TAG":
dlcrname='tag'
else:
dlcrname=False
else:
dlcrname=False
if args.text_file and args.renamef == 'single':
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
ruta = filelist.readline()
ruta=os.path.abspath(ruta.rstrip('\n'))
ruta = os.path.abspath(ruta)
else:
ruta=args.renamef
#print(ruta)
indent = 1
tabs = '\t' * indent
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
if args.type:
for t in args.type:
x='.'+t
extlist.append(x)
if x[-1]=='*':
x=x[:-1]
extlist.append(x)
#print(extlist)
if args.filter:
for f in args.filter:
filter=f
filelist=list()
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
'''
for f in filelist:
print(f)
'''
if args.text_file:
print('Items to process: '+str(len(filelist)))
counter=len(filelist)
for filepath in filelist:
setskip=False
if renmode == "skip_if_tid":
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
if len(t)==16:
try:
int(filepath[i1:i2], 16)
basename=str(os.path.basename(os.path.abspath(filepath)))
print('Filename: '+basename)
print(tabs+'> File already has id: '+filepath[i1:i2])
setskip=True
except:
pass
if setskip == True:
counter=int(counter)
counter-=1
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
continue
if filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
try:
prlist=list()
f = Fs.Nsp(filepath)
contentlist=f.get_content(False,False,True)
#print(contentlist)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (contentlist[j][6])
#pass
if contentlist[j][1] == prlist[i][1]:
if contentlist[j][6] > prlist[i][6]:
del prlist[i]
prlist.append(contentlist[j])
notinlist=False
elif contentlist[j][6] == prlist[i][6]:
notinlist=False
else:
notinlist=True
if notinlist == True:
prlist.append(contentlist[j])
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
continue
#print(prlist)
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
filepath=filepath.strip() # str.strip() returns a new string; the result was previously discarded
print("Processing "+filepath)
#print(filepath)
try:
prlist=list()
#f = Fs.Xci(filepath)
f = Fs.factory(filepath)
f.open(filepath, 'rb')
contentlist=f.get_content(False,False,True)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (contentlist[j][6])
#pass
if contentlist[j][1] == prlist[i][1]:
if contentlist[j][6] > prlist[i][6]:
del prlist[i]
prlist.append(contentlist[j])
notinlist=False
elif contentlist[j][6] == prlist[i][6]:
notinlist=False
else:
notinlist=True
if notinlist == True:
prlist.append(contentlist[j])
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
continue
if filepath.endswith('.xci') or filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz') or filepath.endswith('.xcz'):
basecount=0; basename='';basever='';baseid='';basefile=''
updcount=0; updname='';updver='';updid='';updfile=''
dlccount=0; dlcname='';dlcver='';dlcid='';dlcfile=''
endname=0; mgame=''
ccount='';bctag='';updtag='';dctag=''
for i in range(len(prlist)):
#print(prlist[i][5])
if prlist[i][5] == 'BASE':
basecount+=1
if baseid == "":
basefile=str(prlist[i][0])
baseid=str(prlist[i][1])
basever='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'UPDATE':
updcount+=1
endver=str(prlist[i][6])
if updid == "":
#print(str(prlist))
updfile=str(prlist[i][0])
updid=str(prlist[i][1])
updver='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'DLC':
dlccount+=1
if dlcid == "":
dlcfile=str(prlist[i][0])
dlcid=str(prlist[i][1])
dlcver='[v'+str(prlist[i][6])+']'
if basecount !=0:
bctag=str(basecount)+'G'
else:
bctag=''
if updcount !=0:
if bctag != '':
updtag='+'+str(updcount)+'U'
else:
updtag=str(updcount)+'U'
else:
updtag=''
if dlccount !=0:
if bctag != '' or updtag != '':
dctag='+'+str(dlccount)+'D'
else:
dctag=str(dlccount)+'D'
else:
dctag=''
ccount='('+bctag+updtag+dctag+')'
if baseid != "":
basename=str(os.path.basename(os.path.abspath(filepath)))
basename2=basename.upper()
check=str('['+baseid+']').upper()
#print(basename)
#print(check)
if renmode != "force":
if basename2.find(check) != -1:
print('Filename: '+basename)
print(tabs+"> File already has correct id: "+baseid)
counter=int(counter)
counter-=1
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
continue
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
f = Fs.Xci(basefile)
elif filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
f = Fs.Nsp(basefile)
ctitl=f.get_title(baseid,roman)
if addlangue==True:
languetag=f.get_lang_tag(baseid)
if languetag != False:
ctitl=ctitl+' '+languetag
#print(ctitl)
#print(baseid)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
ctitl=''
elif updid !="":
basename=str(os.path.basename(os.path.abspath(filepath)))
basename2=basename.upper()
check=str('['+updid+']').upper()
if renmode != "force":
if basename2.find(check) != -1:
basename=os.path.basename(os.path.abspath(filepath))
print('Filename: '+basename)
print(tabs+"> File already has correct id: "+updid)
counter=int(counter)
counter-=1
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
continue
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
f = Fs.Xci(updfile)
elif filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
f = Fs.Nsp(updfile)
ctitl=f.get_title(updid,roman)
if addlangue==True:
languetag=f.get_lang_tag(updid) # this branch handles updates, so look up updid, not the (empty) baseid
if languetag != False:
ctitl=ctitl+' '+languetag
#print(ctitl)
#print(updid)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
ctitl=''
elif dlcid !="":
basename=str(os.path.basename(os.path.abspath(filepath)))
basename2=basename.upper()
check=str('['+dlcid+']').upper()
if renmode != "force":
if basename2.find(check) != -1:
print('Filename: '+basename)
print(tabs+"> File already has correct id: "+dlcid)
counter=int(counter)
counter-=1
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
continue
else:
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
f = Fs.Xci(dlcfile)
elif filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
f = Fs.Nsp(dlcfile)
ctitl=f.get_title(dlcid,roman)
f.flush()
f.close()
elif dlcrname == False or dlcrname == 'tag':
nutdbname=nutdb.get_dlcname(dlcid)
if nutdbname!=False:
dlcname=nutdbname
else:
dlcname=str(os.path.basename(os.path.abspath(filepath)))
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
i1=tid1[i]
i2=tid2[i]+1
t=filepath[i1:i2]
dlcname=dlcname.replace(t,'')
dlcname=dlcname.replace('  ',' ')
tid3=[pos for pos, char in enumerate(dlcname) if char == '(']
tid4=[pos for pos, char in enumerate(dlcname) if char == ')']
if len(tid3)>=len(tid4):
lentlist=len(tid3)
elif len(tid3)<len(tid4):
lentlist=len(tid4)
for i in range(lentlist):
i3=tid3[i]
i4=tid4[i]+1
t=dlcname[i3:i4]
dlcname=dlcname.replace(t,'')
dlcname=dlcname.replace('  ',' ')
if dlcname.endswith('.xci') or dlcname.endswith('.nsp') or dlcname.endswith('.xcz') or dlcname.endswith('.nsz'):
dlcname=dlcname[:-4]
if dlcname.endswith(' '):
dlcname=dlcname[:-1]
ctitl=dlcname
if dlcrname == 'tag':
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
f = Fs.Xci(dlcfile)
elif filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
f = Fs.Nsp(dlcfile)
dlctag=f.get_title(dlcid,tag=True)
dlctag='['+dlctag+']'
ctitl=ctitl+' '+dlctag
f.flush()
f.close()
else:
if filepath.endswith('.xci') or filepath.endswith('.xcz'):
f = Fs.Xci(dlcfile)
elif filepath.endswith('.nsp') or filepath.endswith('.nsx') or filepath.endswith('.nsz'):
f = Fs.Nsp(dlcfile)
ctitl=f.get_title(dlcid)
f.flush()
f.close()
else:
ctitl='UNKNOWN'
baseid='['+baseid.upper()+']'
updid='['+updid.upper()+']'
dlcid='['+dlcid.upper()+']'
if basecount>1:
mgame='(mgame)'
if ccount == '(1G)' or ccount == '(1U)' or ccount == '(1D)':
ccount=''
basename=str(os.path.basename(os.path.abspath(filepath)))
if onaddid=='idtag':
from pathlib import Path
g=Path(basename).stem
try:
g0=[pos for pos, char in enumerate(g) if char == '[']
ctitl=(g[0:g0[0]]).strip()
except:
ctitl=g.strip()
if languetag!='':
ctitl+=' '+languetag
renmode="force"
onaddid=False
if baseid != "" and baseid != "[]":
if updver != "":
if onaddid==True:
endname=basename[:-4]+' '+baseid
elif nover == True and (ccount==''):
endname=ctitl+' '+baseid
else:
endname=ctitl+' '+baseid+' '+updver+' '+ccount+' '+mgame
else:
if onaddid==True:
endname=basename[:-4]+' '+baseid
elif nover == True and (ccount==''):
endname=ctitl+' '+baseid
elif (filepath.endswith('.xci') or filepath.endswith('.xcz')) and nover=="xci_no_v0" and ccount=='':
if renmode=="force":
endname=ctitl+' '+baseid+' '+ccount+' '+mgame
elif onaddid==True:
endname=basename[:-4]+' '+baseid
else:
endname=ctitl+' '+baseid+' '+ccount+' '+mgame
else:
endname=ctitl+' '+baseid+' '+basever+' '+ccount+' '+mgame
elif updid !="" and updid != "[]":
if onaddid==True:
endname=basename[:-4]+' '+updid
elif nover == True and (ccount==''):
endname=ctitl+' '+updid
else:
endname=ctitl+' '+updid+' '+updver+' '+ccount+' '+mgame
else:
if onaddid==True:
endname=basename[:-4]+' '+dlcid
elif nover == True and (ccount==''):
endname=ctitl+' '+dlcid
else:
endname=ctitl+' '+dlcid+' '+dlcver+' '+ccount+' '+mgame
while endname[-1]==' ':
endname=endname[:-1]
#endname = re.sub(r'[\/\\\:\*\?\"\<\>\|\.\s™©®()\~]+', ' ', endname)
endname = (re.sub(r'[\/\\\:\*\?]+', '', endname))
endname = re.sub(r'[™©®`~^´ªº¢#£€¥$ƒ±¬½¼♡«»±•²‰œæÆ³☆<<>>|]', '', endname)
endname = re.sub(r'[Ⅰ]', 'I', endname);endname = re.sub(r'[Ⅱ]', 'II', endname)
endname = re.sub(r'[Ⅲ]', 'III', endname);endname = re.sub(r'[Ⅳ]', 'IV', endname)
endname = re.sub(r'[Ⅴ]', 'V', endname);endname = re.sub(r'[Ⅵ]', 'VI', endname)
endname = re.sub(r'[Ⅶ]', 'VII', endname);endname = re.sub(r'[Ⅷ]', 'VIII', endname)
endname = re.sub(r'[Ⅸ]', 'IX', endname);endname = re.sub(r'[Ⅹ]', 'X', endname)
endname = re.sub(r'[Ⅺ]', 'XI', endname);endname = re.sub(r'[Ⅻ]', 'XII', endname)
endname = re.sub(r'[Ⅼ]', 'L', endname);endname = re.sub(r'[Ⅽ]', 'C', endname)
endname = re.sub(r'[Ⅾ]', 'D', endname);endname = re.sub(r'[Ⅿ]', 'M', endname)
endname = re.sub(r'[—]', '-', endname);endname = re.sub(r'[√]', 'Root', endname)
endname = re.sub(r'[àâá@äå]', 'a', endname);endname = re.sub(r'[ÀÂÁÄÅ]', 'A', endname)
endname = re.sub(r'[èêéë]', 'e', endname);endname = re.sub(r'[ÈÊÉË]', 'E', endname)
endname = re.sub(r'[ìîíï]', 'i', endname);endname = re.sub(r'[ÌÎÍÏ]', 'I', endname)
endname = re.sub(r'[òôóöø]', 'o', endname);endname = re.sub(r'[ÒÔÓÖØ]', 'O', endname)
endname = re.sub(r'[ùûúü]', 'u', endname);endname = re.sub(r'[ÙÛÚÜ]', 'U', endname)
endname = re.sub(r'[’]', "'", endname);endname = re.sub(r'[“”]', '"', endname)
endname = re.sub(' {3,}', ' ',endname);endname = re.sub(' {2,}', ' ',endname) # the second result was previously discarded
try:
endname = endname.replace("( ", "(");endname = endname.replace(" )", ")")
endname = endname.replace("[ ", "[");endname = endname.replace(" ]", "]")
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace("[]", "");endname = endname.replace("()", "")
endname = endname.replace('" ','"');endname = endname.replace(' "','"')
endname = endname.replace(" !", "!");endname = endname.replace(" ?", "?")
endname = endname.replace(" ", " ");endname = endname.replace(" ", " ")
endname = endname.replace('"', '');
endname = endname.replace(')', ') ');endname = endname.replace(']', '] ')
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace(" ", " ")
if endname.endswith(' '):
endname=endname[:-1]
if dlcrname == 'tag':
endname = endname.replace('DLC number', 'DLC')
except:pass
if filepath.endswith('.xci'):
endname=endname+'.xci'
elif filepath.endswith('.xcz'):
endname=endname+'.xcz'
elif filepath.endswith('.nsp'):
endname=endname+'.nsp'
elif filepath.endswith('.nsx'):
endname=endname+'.nsx'
elif filepath.endswith('.nsz'):
endname=endname+'.nsz'
basename=str(os.path.basename(os.path.abspath(filepath)))
dir=os.path.dirname(os.path.abspath(filepath))
newpath=os.path.join(dir,endname)
if os.path.exists(newpath) and newpath != filepath:
if filepath.endswith('.xci'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.xci'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.xcz'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.xcz'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsp'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsp'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsx'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsx'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsz'):
endname=endname[:-4]+' (SeemsDuplicate)'+'.nsz'
newpath=os.path.join(dir,endname)
if ctitl=='UNKNOWN':
if filepath.endswith('.xci'):
endname=basename[:-4]+' (needscheck)'+'.xci'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.xcz'):
endname=basename[:-4]+' (needscheck)'+'.xcz'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsp'):
endname=basename[:-4]+' (needscheck)'+'.nsp'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsx'):
endname=basename[:-4]+' (needscheck)'+'.nsx'
newpath=os.path.join(dir,endname)
elif filepath.endswith('.nsz'):
endname=basename[:-4]+' (needscheck)'+'.nsz'
newpath=os.path.join(dir,endname)
print('Old Filename: '+basename)
print('Filename: '+endname)
os.rename(filepath, newpath)
counter=int(counter)
counter-=1
print(tabs+'File was renamed')
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
Status.close()
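# The prlist merge in renamef keeps, per titleid, only the entry with the
# highest version, via nested index loops. A minimal dict-based sketch of
# the same reduction (assuming entry[1] is the titleid and entry[6] the
# version, as in the get_content() tuples used above):
def keep_newest(entries):
    best = {}
    for entry in entries:
        tid, ver = entry[1], entry[6]
        if tid not in best or ver > best[tid][6]:
            best[tid] = entry
    return list(best.values())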
# **********************
# Rename using txt file
# **********************
if args.renameftxt:
ruta=args.renameftxt
if args.romanize:
for val_ in args.romanize:
roman=str(val_).upper()
if roman == "FALSE":
roman = False
else:
roman = True
else:
roman = True
if args.text_file:
tfile=args.text_file
filelist=list()
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
prlist=list()
print ('Calculating final name:')
for filepath in filelist:
if filepath.endswith('.nsp'):
#print(filepath)
try:
c=list()
f = Fs.Nsp(filepath)
contentlist=f.get_content(False,False,True)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (contentlist[j][6])
#pass
if contentlist[j][1] == prlist[i][1]:
if contentlist[j][6] > prlist[i][6]:
del prlist[i]
prlist.append(contentlist[j])
notinlist=False
elif contentlist[j][6] == prlist[i][6]:
notinlist=False
else:
notinlist=True
if notinlist == True:
prlist.append(contentlist[j])
except BaseException as e:
Print.error('Exception: ' + str(e))
if filepath.endswith('.xci'):
try:
c=list()
f = Fs.Xci(filepath)
contentlist=f.get_content(False,False,True)
f.flush()
f.close()
if len(prlist)==0:
for i in contentlist:
prlist.append(i)
#print (prlist)
else:
for j in range(len(contentlist)):
notinlist=False
for i in range(len(prlist)):
#print (contentlist[j][1])
#print (contentlist[j][6])
#pass
if contentlist[j][1] == prlist[i][1]:
if contentlist[j][6] > prlist[i][6]:
del prlist[i]
prlist.append(contentlist[j])
notinlist=False
elif contentlist[j][6] == prlist[i][6]:
notinlist=False
else:
notinlist=True
if notinlist == True:
prlist.append(contentlist[j])
except BaseException as e:
Print.error('Exception: ' + str(e))
basecount=0; basename='';basever='';baseid='';basefile=''
updcount=0; updname='';updver='';updid='';updfile=''
dlccount=0; dlcname='';dlcver='';dlcid='';dlcfile=''
ccount='';bctag='';updtag='';dctag=''
for i in range(len(prlist)):
if prlist[i][5] == 'BASE':
basecount+=1
if baseid == "":
basefile=str(prlist[i][0])
baseid=str(prlist[i][1])
basever='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'UPDATE':
updcount+=1
endver=str(prlist[i][6])
if updid == "":
updfile=str(prlist[i][0])
updid=str(prlist[i][1])
updver='[v'+str(prlist[i][6])+']'
if prlist[i][5] == 'DLC':
dlccount+=1
if dlcid == "":
dlcfile=str(prlist[i][0])
dlcid=str(prlist[i][1])
dlcver='[v'+str(prlist[i][6])+']'
if basecount !=0:
bctag=str(basecount)+'G'
else:
bctag=''
if updcount !=0:
if bctag != '':
updtag='+'+str(updcount)+'U'
else:
updtag=str(updcount)+'U'
else:
updtag=''
if dlccount !=0:
if bctag != '' or updtag != '':
dctag='+'+str(dlccount)+'D'
else:
dctag=str(dlccount)+'D'
else:
dctag=''
ccount='('+bctag+updtag+dctag+')'
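# Content-count tag: bctag/updtag/dctag encode how many base (G), update (U)
# and DLC (D) entries ended up in prlist, joined with '+' between non-empty
# parts. Illustrative examples (not from the source, shown for clarity):
#   1 base + 1 update + 2 DLC -> '(1G+1U+2D)'
#   2 DLC only                -> '(2D)'
# A single-content tag like '(1G)' is blanked out a few lines below.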
if baseid != "":
if basefile.endswith('.xci'):
f = Fs.Xci(basefile)
elif basefile.endswith('.nsp'):
f = Fs.Nsp(basefile)
ctitl=f.get_title(baseid)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
ctitl=''
elif updid !="":
if updfile.endswith('.xci'):
f = Fs.Xci(updfile)
elif updfile.endswith('.nsp'):
f = Fs.Nsp(updfile)
ctitl=f.get_title(updid)
f.flush()
f.close()
if ctitl=='DLC' or ctitl=='-':
ctitl=''
elif dlcid !="":
if dlcfile.endswith('.xci'):
f = Fs.Xci(dlcfile)
elif dlcfile.endswith('.nsp'):
f = Fs.Nsp(dlcfile)
ctitl=f.get_title(dlcid)
f.flush()
f.close()
else:
ctitl='UNKNOWN'
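# Title resolution order: prefer the base entry, then the update, then a
# DLC entry; if none of them is present the title falls back to 'UNKNOWN'.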
baseid='['+baseid.upper()+']'
updid='['+updid.upper()+']'
dlcid='['+dlcid.upper()+']'
if ccount == '(1G)' or ccount == '(1U)' or ccount == '(1D)':
ccount=''
if baseid != "[]":
if updver != "":
endname=ctitl+' '+baseid+' '+updver+' '+ccount
else:
endname=ctitl+' '+baseid+' '+basever+' '+ccount
elif updid != "[]":
endname=ctitl+' '+updid+' '+updver+' '+ccount
else:
endname=ctitl+' '+dlcid+' '+dlcver+' '+ccount
#print('Filename: '+endname)
else:
endname=str(f)
if roman == True:
kakasi = pykakasi.kakasi()
kakasi.setMode("H", "a")
kakasi.setMode("K", "a")
kakasi.setMode("J", "a")
kakasi.setMode("s", True)
kakasi.setMode("E", "a")
kakasi.setMode("a", None)
kakasi.setMode("C", False)
converter = kakasi.getConverter()
endname=converter.do(endname)
endname=endname[0].upper()+endname[1:]
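# Name sanitization: strip characters that are invalid or awkward in
# filenames, map Unicode Roman numerals to ASCII, fold accented vowels to
# their plain forms, and normalize quotes and spacing.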
endname = (re.sub(r'[\/\\\:\*\?]+', '', endname))
endname = re.sub(r'[™©®`~^´ªº¢#£€¥$ƒ±¬½¼♡«»±•²‰œæÆ³☆<<>>|]', '', endname)
endname = re.sub(r'[Ⅰ]', 'I', endname);endname = re.sub(r'[Ⅱ]', 'II', endname)
endname = re.sub(r'[Ⅲ]', 'III', endname);endname = re.sub(r'[Ⅳ]', 'IV', endname)
endname = re.sub(r'[Ⅴ]', 'V', endname);endname = re.sub(r'[Ⅵ]', 'VI', endname)
endname = re.sub(r'[Ⅶ]', 'VII', endname);endname = re.sub(r'[Ⅷ]', 'VIII', endname)
endname = re.sub(r'[Ⅸ]', 'IX', endname);endname = re.sub(r'[Ⅹ]', 'X', endname)
endname = re.sub(r'[Ⅺ]', 'XI', endname);endname = re.sub(r'[Ⅻ]', 'XII', endname)
endname = re.sub(r'[Ⅼ]', 'L', endname);endname = re.sub(r'[Ⅽ]', 'C', endname)
endname = re.sub(r'[Ⅾ]', 'D', endname);endname = re.sub(r'[Ⅿ]', 'M', endname)
endname = re.sub(r'[—]', '-', endname);endname = re.sub(r'[√]', 'Root', endname)
endname = re.sub(r'[àâá@äå]', 'a', endname);endname = re.sub(r'[ÀÂÁÄÅ]', 'A', endname)
endname = re.sub(r'[èêéë]', 'e', endname);endname = re.sub(r'[ÈÊÉË]', 'E', endname)
endname = re.sub(r'[ìîíï]', 'i', endname);endname = re.sub(r'[ÌÎÍÏ]', 'I', endname)
endname = re.sub(r'[òôóöø]', 'o', endname);endname = re.sub(r'[ÒÔÓÖØ]', 'O', endname)
endname = re.sub(r'[ùûúü]', 'u', endname);endname = re.sub(r'[ÙÛÚÜ]', 'U', endname)
endname = re.sub(r'[’]', "'", endname);endname = re.sub(r'[“”]', '"', endname)
endname = re.sub(' {2,}', ' ', endname)
try:
endname = endname.replace("( ", "(");endname = endname.replace(" )", ")")
endname = endname.replace("[ ", "[");endname = endname.replace(" ]", "]")
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace("[]", "");endname = endname.replace("()", "")
endname = endname.replace('" ','"');endname = endname.replace(' "','"')
endname = endname.replace(" !", "!");endname = endname.replace(" ?", "?")
endname = endname.replace(" ", " ");endname = endname.replace(" ", " ")
endname = endname.replace('"', '');
endname = endname.replace(')', ') ');endname = endname.replace(']', '] ')
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace(" ", " ")
except:pass
if endname[-1]==' ':
endname=endname[:-1]
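# The extension is recycled from the original path; this assumes a
# three-letter extension such as .nsp/.nsz/.xci/.xcz.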
ext=ruta[-4:]
endname=endname+ext
print('New name: '+endname)
basename=str(os.path.basename(os.path.abspath(ruta)))
dir=os.path.dirname(os.path.abspath(ruta))
newpath=os.path.join(dir,endname)
try:
os.rename(ruta, newpath)
print(tabs+'> File was renamed to: '+endname)
except BaseException as e:
pass
Status.close()
#parser.add_argument('-snz','--sanitize', help='Remove unreadable characters from names')
#parser.add_argument('-roma','--romanize', help='Translate kanji and extended kanna to romaji and sanitize name')
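# Standalone sanitize/romanize pass over existing filenames. It only runs
# when no other rename/read mode was requested on the same invocation.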
if not args.direct_multi and not args.fw_req and not args.renameftxt and not args.renamef and not args.Read_nacp and not args.addtodb and (args.sanitize or args.romanize):
if args.sanitize:
san=True; rom=False
route=args.sanitize[0]
elif args.romanize:
san=True; rom=True
route=args.romanize[0]
else:
route=False
if route != False:
if args.text_file and route == 'single':
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as filelist:
ruta = filelist.readline()
ruta = os.path.abspath(ruta.rstrip('\n'))
else:
ruta=route
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
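# A trailing '*' on a --type value is accepted as a wildcard marker; the
# bare extension (without the '*') is added to the match list as well.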
if args.type:
for t in args.type:
x='.'+t
extlist.append(x)
if x[-1]=='*':
x=x[:-1]
extlist.append(x)
if args.filter:
for f in args.filter:
filter=f
filelist=list()
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
print('Items to process: '+str(len(filelist)))
counter=len(filelist)
for filepath in filelist:
basename=str(os.path.basename(os.path.abspath(filepath)))
dir=os.path.dirname(os.path.abspath(filepath))
print('Processing: '+filepath)
endname=basename
if rom == True:
kakasi = pykakasi.kakasi()
kakasi.setMode("H", "a")
kakasi.setMode("K", "a")
kakasi.setMode("J", "a")
kakasi.setMode("s", True)
kakasi.setMode("E", "a")
kakasi.setMode("a", None)
kakasi.setMode("C", False)
converter = kakasi.getConverter()
endname=converter.do(endname)
endname=endname[0].upper()+endname[1:]
if san == True:
endname = (re.sub(r'[\/\\\:\*\?]+', '', endname))
endname = re.sub(r'[™©®`~^´ªº¢#£€¥$ƒ±¬½¼♡«»±•²‰œæÆ³☆<<>>|]', '', endname)
endname = re.sub(r'[Ⅰ]', 'I', endname);endname = re.sub(r'[Ⅱ]', 'II', endname)
endname = re.sub(r'[Ⅲ]', 'III', endname);endname = re.sub(r'[Ⅳ]', 'IV', endname)
endname = re.sub(r'[Ⅴ]', 'V', endname);endname = re.sub(r'[Ⅵ]', 'VI', endname)
endname = re.sub(r'[Ⅶ]', 'VII', endname);endname = re.sub(r'[Ⅷ]', 'VIII', endname)
endname = re.sub(r'[Ⅸ]', 'IX', endname);endname = re.sub(r'[Ⅹ]', 'X', endname)
endname = re.sub(r'[Ⅺ]', 'XI', endname);endname = re.sub(r'[Ⅻ]', 'XII', endname)
endname = re.sub(r'[Ⅼ]', 'L', endname);endname = re.sub(r'[Ⅽ]', 'C', endname)
endname = re.sub(r'[Ⅾ]', 'D', endname);endname = re.sub(r'[Ⅿ]', 'M', endname)
endname = re.sub(r'[—]', '-', endname);endname = re.sub(r'[√]', 'Root', endname)
endname = re.sub(r'[àâá@äå]', 'a', endname);endname = re.sub(r'[ÀÂÁÄÅ]', 'A', endname)
endname = re.sub(r'[èêéë]', 'e', endname);endname = re.sub(r'[ÈÊÉË]', 'E', endname)
endname = re.sub(r'[ìîíï]', 'i', endname);endname = re.sub(r'[ÌÎÍÏ]', 'I', endname)
endname = re.sub(r'[òôóöø]', 'o', endname);endname = re.sub(r'[ÒÔÓÖØ]', 'O', endname)
endname = re.sub(r'[ùûúü]', 'u', endname);endname = re.sub(r'[ÙÛÚÜ]', 'U', endname)
endname = re.sub(r'[’]', "'", endname);endname = re.sub(r'[“”]', '"', endname)
endname = re.sub(' {2,}', ' ', endname)
try:
endname = endname.replace("( ", "(");endname = endname.replace(" )", ")")
endname = endname.replace("[ ", "[");endname = endname.replace(" ]", "]")
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace("[]", "");endname = endname.replace("()", "")
endname = endname.replace('" ','"');endname = endname.replace(' "','"')
endname = endname.replace(" !", "!");endname = endname.replace(" ?", "?")
endname = endname.replace(" ", " ");endname = endname.replace(" ", " ")
endname = endname.replace('"', '');
endname = endname.replace(')', ') ');endname = endname.replace(']', '] ')
endname = endname.replace("[ (", "[(");endname = endname.replace(") ]", ")]")
endname = endname.replace(" ", " ")
except:pass
if endname[-5]==" ":
endname=endname[:-5]+endname[-4:]
newpath=os.path.join(dir,endname)
print('Old Filename: '+basename)
print('Filename: '+endname)
os.rename(filepath, newpath)
counter=int(counter)
counter-=1
print(tabs+'File was renamed')
if not args.text_file:
print(tabs+'> Still '+str(counter)+' to go')
except BaseException as e:
counter=int(counter)
counter-=1
Print.error('Exception: ' + str(e))
Status.close()
# ...................................................
# Verify: file and titlekey verification
# ...................................................
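# -verify_key expects positional values [filepath, titlekey, unlock, keygen]:
# the titlekey is checked against the file and, when unlock is "true", the
# file is unlocked with the verified key. keygen, when given, pins the
# original key generation for the check.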
if args.verify_key:
orig_kg=False
if isinstance(args.verify_key, list):
filepath=args.verify_key[0]
userkey=args.verify_key[1]
# print(args.verify_key[2])
try:
if args.verify_key[2]:
if str(args.verify_key[2]).lower()=="true":
unlock=True
else:
unlock=False
except:
unlock=False
try:
if args.verify_key[3]:
try:
orig_kg=int(args.verify_key[3])
except:
orig_kg=False
except:
orig_kg=False
userkey=str(userkey).upper()
if filepath.endswith('.nsp') or filepath.endswith('.nsx'):
basename=str(os.path.basename(os.path.abspath(filepath)))
try:
f = Fs.Nsp(filepath, 'rb')
if orig_kg==False:
check=f.verify_input_key(userkey)
else:
check,userkey=f.verify_input_key_m2(userkey,orig_kg)
f.flush()
f.close()
if check==True:
print(('\nTitlekey {} is correct for '.format(userkey)).upper()+('"{}"').format(basename))
print("-- YOU CAN UNLOCK AND ENJOY THE GAME --")
if unlock==True:
print("--> UNLOCKING...")
f = Fs.Nsp(filepath, 'r+b')
f.unlock(userkey)
try:
f.flush()
f.close()
except:pass
else:
print(('\nTitlekey {} is incorrect for '.format(userkey)).upper()+('"{}"').format(basename))
print("-- BETTER LUCK NEXT TIME --")
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
print('Missing arguments')
if args.verify:
feed=''
if args.vertype:
if args.vertype=="dec" or args.vertype=="lv1":
vertype="lv1"
elif args.vertype=="sig" or args.vertype=="lv2":
vertype="lv2"
elif args.vertype=="sig" or args.vertype=="lv3":
vertype="lv3"
else:
vertype="lv1"
else:
vertype="lv1"
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
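# buffer is the read size used for hashing; it defaults to 65536 bytes
# (64 KiB) when --buffer is not given.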
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
tmpfolder =os.path.join(ofolder,'tmp')
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.verify:
dir=os.path.dirname(os.path.abspath(filename))
info='INFO'
ofolder =os.path.join(dir,info)
tmpfolder =os.path.join(dir,'tmp')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.verify:
filename=filename
basename=str(os.path.basename(os.path.abspath(filename)))
ofile=basename[:-4]+'-verify.txt'
infotext=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx'):
try:
f = Fs.Nsp(filename, 'rb')
check,feed=f.verify()
f.flush()
f.close()
if not args.text_file:
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
i=0
print('\n********************************************************')
print('Do you want to verify the hash of the nca files?')
print('********************************************************')
while i==0:
print('Input "1" to VERIFY hash of files')
print('Input "2" to NOT verify hash of files\n')
ck=input('Input your answer: ')
if ck ==str(1):
print('')
f = Fs.Nsp(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,headerlist,verdict,feed)
f.flush()
f.close()
i=1
elif ck ==str(2):
f.flush()
f.close()
i=1
else:
print('WRONG CHOICE\n')
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
elif args.text_file:
if vertype == "lv2":
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
elif vertype == "lv3":
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
f = Fs.Nsp(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,headerlist,verdict,feed)
f.flush()
f.close()
if check == True:
check=verdict
if check == False:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write("IS INCORRECT"+'\n')
dir=os.path.dirname(os.path.abspath(tfile))
info='INFO'
subf='MASSVERIFY'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
ofolder =os.path.join(ofolder,subf)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
infotext=os.path.join(ofolder, ofile)
with open(infotext, 'w') as info:
info.write(feed)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write('Exception: ' + str(e)+'\n')
if filename.endswith('.xci'):
try:
f = Fs.factory(filename)
f.open(filename, 'rb')
check,feed=f.verify()
f.flush()
f.close()
if not args.text_file:
f = Fs.factory(filename)
f.open(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
i=0
print('\n********************************************************')
print('Do you want to verify the hash of the nca files?')
print('********************************************************')
while i==0:
print('Input "1" to VERIFY hash of files')
print('Input "2" to NOT verify hash of files\n')
ck=input('Input your answer: ')
if ck ==str(1):
print('')
f = Fs.factory(filename)
f.open(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,headerlist,verdict,feed)
f.flush()
f.close()
i=1
elif ck ==str(2):
f.flush()
f.close()
i=1
else:
print('WRONG CHOICE\n')
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
elif args.text_file:
if vertype == "lv2":
f = Fs.factory(filename)
f.open(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
elif vertype == "lv3":
f = Fs.factory(filename)
f.open(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
f = Fs.factory(filename)
f.open(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,headerlist,verdict,feed)
f.flush()
f.close()
if check == True:
check=verdict
if check == False:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write("IS INCORRECT"+'\n')
dir=os.path.dirname(os.path.abspath(tfile))
info='INFO'
subf='MASSVERIFY'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
ofolder =os.path.join(ofolder,subf)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
infotext=os.path.join(ofolder, ofile)
with open(infotext, 'w') as info:
info.write(feed)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write('Exception: ' + str(e)+'\n')
if filename.endswith('.nsz'):
try:
f = Fs.Nsp(filename, 'rb')
check,feed=f.verify()
f.flush()
f.close()
if not args.text_file:
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
i=0
print('\n********************************************************')
print('Do you want to verify the hash of the nca files?')
print('********************************************************')
while i==0:
print('Input "1" to VERIFY hash of files')
print('Input "2" to NOT verify hash of files\n')
ck=input('Input your answer: ')
if ck ==str(1):
print('')
f = Fs.Nsp(filename, 'rb')
verdict,feed=f.nsz_hasher(buffer,headerlist,verdict,feed)
f.flush()
f.close()
i=1
elif ck ==str(2):
f.flush()
f.close()
i=1
else:
print('WRONG CHOICE\n')
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
elif args.text_file:
if vertype == "lv2":
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
elif vertype == "lv3":
f = Fs.Nsp(filename, 'rb')
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
f = Fs.Nsp(filename, 'rb')
verdict,feed=f.nsz_hasher(buffer,headerlist,verdict,feed)
f.flush()
f.close()
if check == True:
check=verdict
if check == False:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write("IS INCORRECT"+'\n')
dir=os.path.dirname(os.path.abspath(tfile))
info='INFO'
subf='MASSVERIFY'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
ofolder =os.path.join(ofolder,subf)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
infotext=os.path.join(ofolder, ofile)
with open(infotext, 'w') as info:
info.write(feed)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write('Exception: ' + str(e)+'\n')
if filename.endswith('.xcz'):
try:
f = Fs.Xci(filename)
check,feed=f.verify()
f.flush()
f.close()
if not args.text_file:
f = Fs.Xci(filename)
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
i=0
print('\n********************************************************')
print('Do you want to verify the hash of the nca files?')
print('********************************************************')
while i==0:
print('Input "1" to VERIFY hash of files')
print('Input "2" to NOT verify hash of files\n')
ck=input('Input your answer: ')
if ck ==str(1):
print('')
f = Fs.Xci(filename)
verdict,feed=f.xcz_hasher(buffer,headerlist,verdict,feed)
f.flush()
f.close()
i=1
elif ck ==str(2):
f.flush()
f.close()
i=1
else:
print('WRONG CHOICE\n')
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
elif args.text_file:
if vertype == "lv2":
f = Fs.Xci(filename)
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
elif vertype == "lv3":
f = Fs.Xci(filename)
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
f.flush()
f.close()
if check == True:
check=verdict
f = Fs.Xci(filename)
verdict,feed=f.xcz_hasher(buffer,headerlist,verdict,feed)
f.flush()
f.close()
if check == True:
check=verdict
if check == False:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write("IS INCORRECT"+'\n')
dir=os.path.dirname(os.path.abspath(tfile))
info='INFO'
subf='MASSVERIFY'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
ofolder =os.path.join(ofolder,subf)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
infotext=os.path.join(ofolder, ofile)
with open(infotext, 'w') as info:
info.write(feed)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write('Exception: ' + str(e)+'\n')
if filename.endswith('.nca'):
try:
f = Fs.Nca(filename, 'rb')
ver_,origheader,ncaname,feed,currkg,tr,tkey,iGC=f.verify(False)
f.flush()
f.close()
if not args.text_file:
i=0
print('\n********************************************************')
print('Do you want to verify the hash of the nca files?')
print('********************************************************')
while i==0:
print('Input "1" to VERIFY hash of files')
print('Input "2" to NOT verify hash of files\n')
ck=input('Input your answer: ')
if ck ==str(1):
print('')
f = Fs.Nca(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,origheader,ver_,feed)
f.flush()
f.close()
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
print('\n********************************************************')
print('Do you want to print the information to a text file')
print('********************************************************')
i=0
while i==0:
print('Input "1" to print to text file')
print('Input "2" to NOT print to text file\n')
ck=input('Input your answer: ')
if ck ==str(1):
with open(infotext, 'w') as info:
info.write(feed)
i=1
elif ck ==str(2):
i=1
else:
print('WRONG CHOICE\n')
if args.text_file:
f = Fs.Nca(filename, 'rb')
verdict,feed=f.verify_hash_nca(buffer,origheader,ver_,feed)
f.flush()
f.close()
if ver_ == True:
ver_=verdict
if ver_ == False:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write("IS INCORRECT"+'\n')
dir=os.path.dirname(os.path.abspath(tfile))
info='INFO'
subf='MASSVERIFY'
ofolder =os.path.join(dir,info)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
ofolder =os.path.join(ofolder,subf)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
infotext=os.path.join(ofolder, ofile)
with open(infotext, 'w') as info:
info.write(feed)
except BaseException as e:
Print.error('Exception: ' + str(e))
if args.text_file:
with open(errfile, 'a') as ef:
now=datetime.now()
date=now.strftime("%x")+". "+now.strftime("%X")
ef.write(date+'\n')
ef.write("Filename: "+str(filename)+'\n')
ef.write('Exception: ' + str(e)+'\n')
Status.close()
# ...................................................
# Split list by base id
# ...................................................
if args.split_list_by_id:
for filepath in args.split_list_by_id:
ofolder=os.path.abspath(filepath)
if not os.path.exists(ofolder):
os.makedirs(ofolder)
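# For every base id found, a .txt list is written to the output folder
# grouping the base with its matching updates and DLC; update-only and
# DLC-only groups are handled by the elif branches below.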
baselist=list()
addonlist=list()
updlist=list()
if args.text_file:
tfile=args.text_file
filelist=list()
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
'''
for file in filelist:
print(file)
pass
'''
print('- Calculating base-ids for:')
for filepath in filelist:
try:
if filepath.endswith('.nsp') or filepath.endswith('.nsz') or filepath.endswith('.nsx') :
f = Fs.Nsp(filepath)
elif filepath.endswith('.xci') or filepath.endswith('.xcz') :
f = Fs.factory(filepath)
f.open(filepath, 'rb')
print(tabs+filepath)
validator,contentlist=f.cnmt_get_baseids()
f.flush()
f.close()
if validator=='base':
baselist.append([filepath,contentlist])
elif validator=='update':
updlist.append([filepath,contentlist])
else:
addonlist.append([filepath,contentlist])
except BaseException as e:
Print.error('Exception: ' + str(e))
'''
print('Baselist')
for i in baselist:
print(i)
print(str(len(baselist)))
print('Updlist')
for i in updlist:
print(i)
print(str(len(updlist)))
print('Addonlist')
for i in addonlist:
print(i)
print(str(len(addonlist)))
'''
print('')
print('- Generating lists:')
if len(baselist)>0:
for i in range(len(baselist)):
lname=''
fileslist=list()
idlist=baselist[i][1]
for k in idlist:
lname+='['+k+']'
lname=lname.upper()
lname+='.txt'
fileslist.append(baselist[i][0])
for j in range(len(updlist)):
addid=updlist[j][1]
addid=addid[0]
if addid in idlist:
if updlist[j][0] not in fileslist:
fileslist.append(updlist[j][0])
for j in range(len(addonlist)):
addid=addonlist[j][1]
addid=addid[0]
if addid in idlist:
if addonlist[j][0] not in fileslist:
fileslist.append(addonlist[j][0])
endfile=os.path.join(ofolder, lname)
print(' > '+endfile)
with open(endfile,"w", encoding='utf8') as tfile:
for line in fileslist:
try:
print(tabs+line)
tfile.write(line+"\n")
except:
continue
elif len(updlist)>0:
for i in range(len(updlist)):
lname=''
fileslist=list()
idlist=updlist[i][1]
for k in idlist:
k=k[:-3]+'800'
lname+='['+k+']'
lname=lname.upper()
lname+='.txt'
fileslist.append(updlist[i][0])
for j in range(len(addonlist)):
addid=addonlist[j][1]
addid=addid[0]
if addid in idlist:
if addonlist[j][0] not in fileslist:
fileslist.append(addonlist[j][0])
endfile=os.path.join(ofolder, lname)
print(' > '+endfile)
with open(endfile,"w", encoding='utf8') as tfile:
for line in fileslist:
try:
print(tabs+line)
tfile.write(line+"\n")
except:
continue
elif len(addonlist)>0:
for i in range(len(addonlist)):
lname=''
fileslist=list()
idlist=addonlist[i][1]
for k in idlist:
lname+='['+k+']'
lname=lname.upper()
lname+='.txt'
fileslist.append(addonlist[i][0])
endfile=os.path.join(ofolder, lname)
print(' > '+endfile)
with open(endfile,"w", encoding='utf8') as tfile:
for line in fileslist:
try:
print(tabs+line)
tfile.write(line+"\n")
except:
continue
Status.close()
#--------------------------
#Move old updates to another folder
#--------------------------
#parser.add_argument('-mv_oupd', '--mv_old_updates', nargs='+', help='Moves old updates to another folder')
if args.mv_old_updates:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.mv_old_updates:
ofolder=os.path.abspath(filepath)
ofolder=os.path.join(ofolder, 'old')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
duplicates_f=os.path.join(ofolder, 'duplicates')
if not os.path.exists(duplicates_f):
os.makedirs(duplicates_f)
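# Superseded updates are moved into '<ofolder>/old'; files that share the
# same id and version as an already-seen file are treated as duplicates
# and moved into '<ofolder>/old/duplicates' instead.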
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
else:
ruta=args.mv_old_updates[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
if args.filter:
for f in args.filter:
filter=f
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
'''
for file in filelist:
print(file)
pass
'''
Datashelve = dbmodule.Dict('File01.dshlv');c=0
test1="";test2=""
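# Filenames are scanned for 16-character hex ids and version tags between
# square brackets. A hypothetical example of the expected layout:
#   'Some Game [0100000000000000] [v65536].nsp'
# -> fileid '0100000000000000', fileversion 65536, cctag 'BASE'
# (ids ending in 800 are tagged UPD, other suffixes DLC). The shelve is
# keyed by id and keeps the entry with the highest version.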
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
if fileversion !='unknown':
break
except:
continue
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>int(fileversion):
print(str(os.path.basename(os.path.abspath(filepath))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(filepath))))
if not os.path.isfile(checker):
shutil.move(filepath,ofolder)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=shelvedfile
elif int(shelvedfile[2])== int(fileversion):
print(str(os.path.basename(os.path.abspath(filepath))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(filepath))))
if not os.path.isfile(checker):
shutil.move(filepath,duplicates_f)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=shelvedfile
else:
print(str(os.path.basename(os.path.abspath(shelvedfile[0]))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(shelvedfile[0]))))
if not os.path.isfile(checker):
shutil.move(shelvedfile[0],ofolder)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
#parser.add_argument('-mv_odlc', '--mv_old_dlcs', nargs='+', help='Moves old dlcs to another folder')
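# Same mechanism as mv_old_updates above, but keyed on DLC ids: older DLC
# revisions go to 'old', same-version copies to 'old/duplicates'.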
if args.mv_old_dlcs:
if args.ofolder:
for var in args.ofolder:
try:
ofolder = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filepath in args.mv_old_dlcs:
ofolder=os.path.abspath(filepath)
ofolder=os.path.join(ofolder, 'old')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
duplicates_f=os.path.join(ofolder, 'duplicates')
if not os.path.exists(duplicates_f):
os.makedirs(duplicates_f)
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
else:
ruta=args.mv_old_dlcs[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
if args.filter:
for f in args.filter:
filter=f
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
'''
for file in filelist:
print(file)
pass
'''
Datashelve = dbmodule.Dict('File01.dshlv');c=0
test1="";test2=""
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000':
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
if fileversion !='unknown':
break
except:
continue
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>int(fileversion):
print(str(os.path.basename(os.path.abspath(filepath))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(filepath))))
if not os.path.isfile(checker):
shutil.move(filepath,ofolder)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=shelvedfile
elif int(shelvedfile[2])== int(fileversion):
print(str(os.path.basename(os.path.abspath(filepath))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(filepath))))
if not os.path.isfile(checker):
shutil.move(filepath,duplicates_f)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=shelvedfile
else:
print(str(os.path.basename(os.path.abspath(shelvedfile[0]))))
checker=os.path.join(ofolder, str(os.path.basename(os.path.abspath(shelvedfile[0]))))
if not os.path.isfile(checker):
shutil.move(shelvedfile[0],ofolder)
else:
try:
os.remove(filepath)
except:pass
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
#parser.add_argument('-cr_ilist', '--cr_incl_list', nargs='+', help='Creates a include list from a textfile and a folder')
#parser.add_argument('-tfile_aux', '--text_file_aux', help='Auxiliary text file')
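# cr_incl_list builds an include list: entries from --text_file whose id is
# missing from the reference set (built from --text_file_aux or a folder
# scan), or whose version is newer than the shelved one, are appended to
# the --fexport file.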
if args.cr_incl_list:
# if args.ofolder:
# for input in args.ofolder:
# try:
# ofolder = input
# except BaseException as e:
# Print.error('Exception: ' + str(e))
# else:
# for filepath in args.cr_incl_list:
# ofolder=os.path.abspath(filepath)
# ofolder=os.path.join(ofolder, 'old')
# if not os.path.exists(ofolder):
# os.makedirs(ofolder)
# duplicates_f=os.path.join(ofolder, 'duplicates')
# if not os.path.exists(duplicates_f):
# os.makedirs(duplicates_f)
if args.fexport:
for var in args.fexport:
try:
exportlist = var
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
if args.text_file_aux:
filelist2=list()
tfile2=args.text_file_aux
with open(tfile2,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist2.append(fp)
else:
filelist2=list()
ruta=args.cr_incl_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
'''
for file in filelist2:
print(file)
pass
'''
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000':
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
#print(fileversion)
if fileversion !='unknown':
break
except:
continue
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>int(fileversion):
Datashelve[str(fileid)]=shelvedfile
elif int(shelvedfile[2])== int(fileversion):
Datashelve[str(fileid)]=shelvedfile
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,str(fileid),fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
del filelist2
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000':
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
if fileversion !='unknown':
break
except:
continue
#print(fileid+' '+str(fileversion)+' '+cctag)
#print(filepath)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
if int(shelvedfile[2])<int(fileversion):
print(fileid +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
else:
print(fileid +' v'+str(fileversion))
#print(filepath)
#tfname='testmissdlc.txt'
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
# ...................................................
# Create exclude list
# ...................................................
#parser.add_argument('-cr_elist', '--cr_excl_list', nargs='+', help='Creates a exclude list from a textfile and a folder or 2 textfiles')
#parser.add_argument('-tfile_aux', '--text_file_aux', help='Auxiliary text file')
if args.cr_excl_list:
from listmanager import read_lines_to_list,folder_to_list,parsetags
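# This section delegates to listmanager helpers rather than the inline
# bracket parsing used by the sections above: read_lines_to_list loads the
# text files, folder_to_list walks the folder, and parsetags extracts the
# id/version/type tags from each filename.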
if args.fexport:
for var in args.fexport:
try:
exportlist = var
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
filelist=read_lines_to_list(tfile,all=True)
if args.text_file_aux:
filelist2=list()
tfile2=args.text_file_aux
filelist2=read_lines_to_list(tfile2,all=True)
else:
filelist2=list()
ruta=args.cr_excl_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
else:
filter=False
#print(ruta)
filelist2=folder_to_list(ruta,extlist,filter)
'''
for file in filelist2:
print(file)
pass
'''
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
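# parsetags is unpacked below as (fileid, fileversion, cctag, nG, nU, nD,
# baseid); nD (the DLC count) is what breaks ties between files with the
# same id and version.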
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown';baseid='unknown'
nG=0;nU=0;nD=0
try:
fileid,fileversion,cctag,nG,nU,nD,baseid=parsetags(filepath)
except:pass
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
x=parsetags(filepath)
print(str(x))
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag,nG,nU,nD,baseid]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>int(fileversion):
Datashelve[str(fileid)]=shelvedfile
elif int(shelvedfile[2])== int(fileversion):
if int(shelvedfile[6])>=int(nD):
Datashelve[str(fileid)]=shelvedfile
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag,nG,nU,nD,baseid]
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag,nG,nU,nD,baseid]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag,nG,nU,nD,baseid]
except BaseException as e:
Print.error('Exception: ' + str(e))
del filelist2
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown';baseid='unknown'
nG=0;nU=0;nD=0
try:
fileid,fileversion,cctag,nG,nU,nD,baseid=parsetags(filepath)
except:pass
#print(fileid+' '+str(fileversion)+' '+cctag)
#print(filepath)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
x=parsetags(filepath)
print(str(x))
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
if str(filepath) != str(shelvedfile[0]):
if int(shelvedfile[2])>int(fileversion):
print(fileid +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
elif int(shelvedfile[2])==int(fileversion):
if int(shelvedfile[6])>int(nD):
print(fileid +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
else:
pass
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
# ...................................................
# OUTDATED XCI LIST
# ...................................................
#parser.add_argument('-cr_xcioutlist', '--cr_outdated_xci_list', nargs='+', help='Creates a include list from a textfile and a folder')
#parser.add_argument('-tfile_aux', '--text_file_aux', help='Auxiliary text file')
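# cr_outdated_xci_list writes entries from --text_file that are outdated
# relative to the reference set (same id, lower version); base titles are
# written unconditionally (see the isbase branch below).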
if args.cr_outdated_xci_list:
# if args.ofolder:
# for input in args.ofolder:
# try:
# ofolder = input
# except BaseException as e:
# Print.error('Exception: ' + str(e))
# else:
# for filepath in args.cr_outdated_xci_list:
# ofolder=os.path.abspath(filepath)
# ofolder=os.path.join(ofolder, 'old')
# if not os.path.exists(ofolder):
# os.makedirs(ofolder)
# duplicates_f=os.path.join(ofolder, 'duplicates')
# if not os.path.exists(duplicates_f):
# os.makedirs(duplicates_f)
if args.fexport:
for var in args.fexport:
try:
exportlist = var
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
if args.text_file_aux:
filelist2=list()
tfile2=args.text_file_aux
with open(tfile2,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist2.append(fp)
else:
filelist2=list()
ruta=args.cr_outdated_xci_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
'''
for file in filelist2:
print(file)
pass
'''
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000':
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
#print(fileversion)
if fileversion !='unknown':
break
except:
continue
if cctag=="BASE" and fileversion == 'unknown':
fileversion=0
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>int(fileversion):
Datashelve[str(fileid)]=shelvedfile
elif int(shelvedfile[2])== int(fileversion):
Datashelve[str(fileid)]=shelvedfile
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,str(fileid),fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
del filelist2
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=list()
tid2=list()
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0:
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000':
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
if fileversion !='unknown':
break
except:
continue
#print(fileid+' '+str(fileversion)+' '+cctag)
#print(filepath)
if cctag=="BASE" and fileversion == 'unknown':
fileversion=0
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
isbase=False
if str(fileid[-3:])=='000':
isbase=True
elif str(fileid[-3:])=='800':
fileid=str(fileid[:-3])+'000'
else:
pass
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
if int(shelvedfile[2])<int(fileversion):
print(fileid +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
elif isbase==True:
print(fileid +' v'+str(fileversion))
#print(filepath)
#tfname='testmissdlc.txt'
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(filepath+'\n')
else:
pass
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
# ...................................................
# EXPAND LIST
# ...................................................
#parser.add_argument('-cr_xexplist', '--cr_expand_list', nargs='+', help='Expands the list with games by baseid')
#parser.add_argument('-tfile_aux', '--text_file_aux', help='Auxiliary text file')
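# cr_expand_list, per the parser hint above, expands the list with games by
# base id, using the same bracket-tag parsing and shelve dedupe as the
# sections above.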
if args.cr_expand_list:
# if args.ofolder:
# for input in args.ofolder:
# try:
# ofolder = input
# except BaseException as e:
# Print.error('Exception: ' + str(e))
# else:
# for filepath in args.cr_expand_list:
# ofolder=os.path.abspath(filepath)
# ofolder=os.path.join(ofolder, 'old')
# if not os.path.exists(ofolder):
# os.makedirs(ofolder)
# duplicates_f=os.path.join(ofolder, 'duplicates')
# if not os.path.exists(duplicates_f):
# os.makedirs(duplicates_f)
if args.fexport:
for var in args.fexport:
try:
exportlist = var
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
filelist=list()
if args.text_file:
tfile=args.text_file
with open(tfile,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist.append(fp)
if args.text_file_aux:
filelist2=list()
tfile2=args.text_file_aux
with open(tfile2,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist2.append(fp)
else:
filelist2=list()
ruta=args.cr_expand_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
'''
for file in filelist2:
print(file)
pass
'''
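# Illustrative sketch (hypothetical helper, not used by the tool): the walk
# above gathers .nsp/.nsz/.xci/.xcz files, honors the optional name filter,
# and skips anything inside the recycle bin.
def scan_games(root, exts=('.nsp', '.nsz', '.xci', '.xcz'), name_filter=None):
    found = []
    for dirpath, _, filenames in os.walk(root):
        for fn in filenames:
            if fn.lower().endswith(exts) and 'recycle.bin' not in fn.lower():
                if name_filter is None or name_filter.lower() in fn.lower():
                    found.append(os.path.join(dirpath, fn))
    return found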
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0: # '000' suffix -> base title
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000': # neither update nor base suffix -> DLC
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
#print(fileversion)
if fileversion !='unknown':
break
except:
continue
if cctag=="BASE" and fileversion == 'unknown':
fileversion=0
#print(fileid+' '+str(fileversion)+' '+cctag)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if c==0:
c+=1
try:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
#print(shelvedfile[2])
if shelvedfile[1]==fileid:
if int(shelvedfile[2])>=int(fileversion):
pass # keep the stored entry; it already has the same or a newer version
else:
Datashelve[str(fileid)]=[filepath,fileid,fileversion,cctag]
else:
pass
else:
Datashelve[str(fileid)]=[filepath,str(fileid),fileversion,cctag]
except BaseException as e:
Print.error('Exception: ' + str(e))
del filelist2
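# The pass above keeps, per title id, the highest-version entry in the
# on-disk dict. In-memory equivalent (illustrative sketch; parsed_entries
# is a hypothetical iterable of [filepath, fileid, fileversion, cctag]):
#   best = {}
#   for fp, tid, ver, tag in parsed_entries:
#       if tid not in best or int(ver) > int(best[tid][2]):
#           best[tid] = [fp, tid, ver, tag]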
for filepath in filelist:
fileid='unknown';fileversion='unknown';cctag='unknown'
tid1=[pos for pos, char in enumerate(filepath) if char == '[']
tid2=[pos for pos, char in enumerate(filepath) if char == ']']
if len(tid1)>=len(tid2):
lentlist=len(tid1)
elif len(tid1)<len(tid2):
lentlist=len(tid2)
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
t=filepath[i1:i2]
#print(t)
if len(t)==16:
try:
test1=filepath[i1:i2]
int(filepath[i1:i2], 16)
fileid=str(filepath[i1:i2]).upper()
if fileid !='unknown':
if int(fileid[-3:])==800:
cctag='UPD'
elif int(fileid[-3:])==0: # '000' suffix -> base title
cctag='BASE'
else:
try:
int(fileid[-3:])
cctag='DLC'
except:pass
break
except:
try:
fileid=str(filepath[i1:i2]).upper()
if str(fileid[-3:])!='800' and str(fileid[-3:])!='000': # neither update nor base suffix -> DLC
DLCnumb=str(fileid)
DLCnumb="0000000000000"+DLCnumb[-3:]
DLCnumb=bytes.fromhex(DLCnumb)
DLCnumb=str(int.from_bytes(DLCnumb, byteorder='big'))
DLCnumb=int(DLCnumb)
cctag='DLC'
except:continue
except:pass
for i in range(lentlist):
try:
i1=tid1[i]+1
i2=tid2[i]
except:pass
if (str(filepath[(i1)]).upper())=='V':
try:
test2=filepath[(i1+1):i2]
fileversion=int(filepath[(i1+1):i2])
if fileversion !='unknown':
break
except:
continue
if cctag=="BASE" and fileversion == 'unknown':
fileversion=0
#print(fileid+' '+str(fileversion)+' '+cctag)
#print(filepath)
if fileid == 'unknown' or fileversion == 'unknown':
print(fileid+' '+str(fileversion))
print(str(os.path.basename(os.path.abspath(filepath))))
print(test1)
print(test2)
if cctag!="DLC" and cctag!="BASE" and cctag!="UPD":
print(str(os.path.basename(os.path.abspath(filepath))))
if str(fileid[-3:])=='800':
fileid=str(fileid[:-3])+'000'
elif str(fileid[-3:])=='000':
fileid=str(fileid)
else:
#print(str(fileid))
DLCnumb=str(fileid)
#print(hx(b''+bytes.fromhex('0'+DLCnumb[-4:-3])))
token=int(hx(bytes.fromhex('0'+DLCnumb[-4:-3])),16)-int('1',16)
token=str(hex(token))[-1]
token=token.upper()
#print(token)
fileid=fileid[:-4]+token+'000'
#print(fileid)
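# Background for the token arithmetic above: Switch DLC title ids are
# conventionally allocated past base_id + 0x1000, so the base id is
# recovered by decrementing the fourth hex digit from the end and zeroing
# the last three, e.g. ...5001 -> ...4000. The hx/bytes.fromhex round trip
# is equivalent to: token = hex(int(DLCnumb[-4], 16) - 1)[-1].upper()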
try:
if str(fileid) in Datashelve:
shelvedfile=Datashelve[str(fileid)]
if str(shelvedfile[0])!=str(filepath):
print(str(fileid) +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(str(filepath)+'\n')
elif str(fileid[:-3]+'800') in Datashelve:
fileid=str(fileid[:-3]+'800')
shelvedfile=Datashelve[str(fileid)]
if str(shelvedfile[0])!=str(filepath):
print(str(fileid) +' v'+str(fileversion))
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(str(filepath)+'\n')
else:
pass
except BaseException as e:
Print.error('Exception: ' + str(e))
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
Status.close()
#parser.add_argument('-blckl', '--black_list', nargs='+', help='Deletes blacklisted files from a list')
if args.black_list:
try:
if args.fexport:
for input in args.fexport:
try:
exportlist = input
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
blacklist=list()
if args.black_list:
t_blacklist=args.black_list[0]
if len(args.black_list)>1:
if str(args.black_list[1]).lower()=='true':
blacklistbaseid=True
else:
blacklistbaseid=False
else:
blacklistbaseid=False
with open(t_blacklist,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
blacklist.append(fp)
if args.text_file:
filelist2=list()
tfile2=args.text_file
with open(tfile2,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist2.append(fp)
else:
filelist2=list()
ruta=args.cr_incl_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown'
try:
fileid,fileversion,cctag,nG,nU,nD,baseid=listmanager.parsetags(filepath)
except:pass
if cctag !='unknown':
try:
Datashelve[str(fileid)]=[filepath,str(fileid),fileversion,cctag,nG,nU,nD,baseid]
except: pass
del filelist2
tfile=open(exportlist,"w", encoding='utf8')
tfile.close()
for filepath in blacklist:
fileid='unknown';fileversion='unknown';cctag='unknown'
try:
fileid,fileversion,cctag,nG,nU,nD,baseid=listmanager.parsetags(filepath)
#print(baseid)
except:pass
if cctag !='unknown':
try:
if str(fileid) in Datashelve:
del Datashelve[str(fileid)]
else:
keylist=list()
for k in Datashelve.keys():
keylist.append(k)
for k in keylist:
if k in Datashelve:
entry=Datashelve[k]
test=str(entry[0]).lower()
fp=str(filepath).lower()
if test==fp:
del Datashelve[k]
if blacklistbaseid==True:
keylist=list()
for k in Datashelve.keys():
keylist.append(k)
for k in keylist:
if k in Datashelve:
entry=Datashelve[k]
test=str(entry[-1]).lower()
baseid=str(baseid).lower()
if test==baseid:
del Datashelve[k]
except BaseException as e:
Print.error('Exception: ' + str(e))
continue
del blacklist
for k in Datashelve.keys():
with open(exportlist,"a", encoding='utf8') as tfile:
entry=Datashelve[k]
fp=str(entry[0])
tfile.write(fp+'\n')
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
except:pass
Status.close()
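# Net effect of the blacklist pass above, as set arithmetic (illustrative
# sketch using listmanager.parsetags, whose tuple ends with the base id):
#   blocked_ids   = {str(listmanager.parsetags(fp)[0]) for fp in blacklist}
#   blocked_bases = {str(listmanager.parsetags(fp)[6]) for fp in blacklist}
# Shelved entries whose id, stored path, or (when the second -blckl argument
# is 'true') base id is blocked are dropped; surviving paths are exported.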
#parser.add_argument('-chdlcn', '--chck_dlc_numb', nargs='+', help='Checks if xci has correct number of dlcs')
if args.chck_dlc_numb:
try:
if args.fexport:
for input in args.fexport:
try:
exportlist = input
except BaseException as e:
Print.error('Exception: ' + str(e))
baselist=list()
addonlist=list()
updlist=list();updtomove=list()
dlclist=list()
if args.chck_dlc_numb:
t_dlc_list=args.chck_dlc_numb[0]
with open(t_dlc_list,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
dlclist.append(fp)
if args.text_file:
filelist2=list()
tfile2=args.text_file
with open(tfile2,"r+", encoding='utf8') as f:
for line in f:
fp=line.strip()
filelist2.append(fp)
else:
filelist2=list()
ruta=args.cr_incl_list[0]
if ruta[-1]=='"':
ruta=ruta[:-1]
if ruta[0]=='"':
ruta=ruta[1:]
extlist=list()
extlist.append('.nsp')
extlist.append('.nsz')
extlist.append('.xci')
extlist.append('.xcz')
if args.filter:
for f in args.filter:
filter=f
#print(ruta)
try:
fname=""
binbin='RECYCLE.BIN'
for ext in extlist:
#print (ext)
#print (ruta)
if os.path.isdir(ruta):
for dirpath, dirnames, filenames in os.walk(ruta):
for filename in [f for f in filenames if f.endswith(ext.lower()) or f.endswith(ext.upper()) or f[:-1].endswith(ext.lower()) or f[:-1].endswith(ext.upper())]:
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
#print(fname)
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(os.path.join(dirpath, filename))
else:
if ruta.endswith(ext.lower()) or ruta.endswith(ext.upper()) or ruta[:-1].endswith(ext.lower()) or ruta[:-1].endswith(ext.upper()):
filename = ruta
#print(ruta)
fname=""
if args.filter:
if filter.lower() in filename.lower():
fname=filename
else:
fname=filename
if fname != "":
if binbin.lower() not in filename.lower():
filelist2.append(filename)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
test2="";test=""
Datashelve = dbmodule.Dict('File01.dshlv');c=0
for filepath in filelist2:
fileid='unknown';fileversion='unknown';cctag='unknown'
try:
fileid,fileversion,cctag,nG,nU,nD,baseid=listmanager.parsetags(filepath)
except:pass
if cctag !='unknown':
try:
Datashelve[str(fileid)]=[filepath,str(fileid),fileversion,cctag,nG,nU,nD,baseid]
except: pass
del filelist2
tfile=open(exportlist,"w", encoding='utf8')
tfile.close()
keylist=list()
for k in Datashelve.keys():
keylist.append(k)
for k in keylist:
if k in Datashelve:
entry=Datashelve[k]
numbDLC=entry[6]
test=str(entry[1]).lower()
count=0
dlcpaths=list()
# test2='['+test[:-4]
for filepath in list(dlclist): # iterate a copy: dlclist is mutated inside the loop
fileid='unknown';fileversion='unknown';cctag='unknown'
# print(test2)
# if test2 in str(filepath).lower():
try:
# print(filepath)
fileid,fileversion,cctag,nG,nU,nD,baseid=listmanager.parsetags(filepath)
# print(baseid)
# print(test)
baseid=baseid.lower()
if (str(baseid).lower())==test:
if not filepath in dlcpaths:
count+=1
dlcpaths.append(filepath)
dlclist.remove(filepath)
except BaseException as e:
Print.error('Exception: ' + str(e))
pass
# print(str(count))
# print(str(numbDLC))
if count>int(numbDLC):
with open(exportlist,"a", encoding='utf8') as tfile:
tfile.write(str(entry[0])+'\n')
Datashelve.close()
try:os.remove('File01.dshlv')
except:pass
except:pass
Status.close()
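# The check above counts, for each shelved multi-content file, how many
# files in the dlc list share its base id and flags files whose stored DLC
# count (entry[6], i.e. nD from parsetags) is exceeded. A Counter expresses
# the same tally in one pass (illustrative sketch):
#   from collections import Counter
#   dlc_per_base = Counter(str(listmanager.parsetags(fp)[6]).lower() for fp in dlclist)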
# ...................................................
# Restore. File Restoration
# ...................................................
if args.restore:
feed='';cnmt_is_patched=False
if args.buffer:
for var in args.buffer:
try:
buffer = var
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
buffer = 65536
if args.ofolder:
for input in args.ofolder:
try:
ofolder = input
except BaseException as e:
Print.error('Exception: ' + str(e))
else:
for filename in args.restore:
dir=os.path.dirname(os.path.abspath(filename))
ofolder =os.path.join(dir, 'output')
if not os.path.exists(ofolder):
os.makedirs(ofolder)
tmpfolder =os.path.join(ofolder, 'tmp')
if args.text_file:
tfile=args.text_file
dir=os.path.dirname(os.path.abspath(tfile))
if not os.path.exists(dir):
os.makedirs(dir)
err='badfiles.txt'
errfile = os.path.join(dir, err)
with open(tfile,"r+", encoding='utf8') as filelist:
filename = filelist.readline()
filename=os.path.abspath(filename.rstrip('\n'))
else:
for filename in args.restore:
filename=filename
ofile=str(os.path.basename(os.path.abspath(filename)))
ofile=os.path.join(ofolder, ofile)
if filename.endswith('.nsp') or filename.endswith('.nsx'):
try:
f = Fs.Nsp(filename, 'rb')
check,feed=f.verify()
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder,cnmt='nocheck')
output_type='nsp';multi=False;cnmtcount=0
if verdict == True:
isrestored=True
for i in range(len(headerlist)):
entry=headerlist[i]
if str(entry[0]).endswith('.cnmt.nca'):
cnmtcount+=1
if cnmt_is_patched==False:
status=entry[2]
if status=='patched':
cnmt_is_patched=True
if entry[1]!=False:
if int(entry[-1])==1:
output_type='xci'
isrestored=False
else:
pass
if isrestored == False:
if cnmt_is_patched !=True:
print('\nFILE WAS MODIFIED. FILE IS RESTORABLE')
else:
print('\nFILE WAS MODIFIED AND CNMT PATCHED. FILE MAY BE RESTORABLE')
if cnmtcount<2:
if not os.path.exists(ofolder):
os.makedirs(ofolder)
f.restore_ncas(buffer,headerlist,verdict,ofile,feed,output_type)
else:
print(" -> Current Implementation doesn't support multicontent files")
print(" Please use the multicontent splitter first")
else:
print("\nFILE WASN'T MODIFIED. SKIPPING RESTORATION")
if verdict == False:
print("\nFILE WAS MODIFIED. FILE ISN'T RESTORABLE")
except BaseException as e:
Print.error('Exception: ' + str(e))
if filename.endswith('.xci'):
try:
f = Fs.Xci(filename)
check,feed=f.verify()
verdict,headerlist,feed=f.verify_sig(feed,tmpfolder)
output_type='nsp';multi=False;cnmtcount=0
if verdict == True:
isrestored=True
for i in range(len(headerlist)):
entry=headerlist[i]
if str(entry[0]).endswith('.cnmt.nca'):
cnmtcount+=1
if entry[1]!=False:
if int(entry[-1])==1:
output_type='xci'
isrestored=False
else:
pass
if isrestored == False:
print('\nFILE WAS MODIFIED. FILE IS RESTORABLE')
if cnmtcount<2:
if not os.path.exists(ofolder):
os.makedirs(ofolder)
f.restore_ncas(buffer,headerlist,verdict,ofile,feed,output_type)
else:
print(" -> Current Implementation doesn't support multicontent files")
print(" Please use the multicontent splitter first")
else:
print("\nFILE WASN'T MODIFIED. SKIPPING RESTORATION")
elif verdict == False:
print("\nFILE WAS MODIFIED. FILE ISN'T RESTORABLE")
except BaseException as e:
Print.error('Exception: ' + str(e))
Status.close()
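# Summary of the restore flow above: verify() checks the container, then
# verify_sig() returns a per-NCA header list; any entry flagged as modified
# makes the file a restoration candidate, and restore_ncas() attempts to
# rebuild the unmodified NCAs. Files holding more than one .cnmt.nca are
# skipped, as the printed messages state, and must be split first.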
def init_interface():
import secondary
parameters=["Interface","start"]
vret=secondary.call_library(parameters)
#init_interface()
except KeyboardInterrupt:
Config.isRunning = False
Status.close()
except BaseException as e:
Config.isRunning = False
Status.close()
raise
# app=init_interface()
| 33.37739 | 210 | 0.548189 | 40,974 | 335,109 | 4.435667 | 0.026993 | 0.026047 | 0.025893 | 0.043335 | 0.875959 | 0.854507 | 0.837466 | 0.817774 | 0.804459 | 0.788008 | 0 | 0.011184 | 0.26999 | 335,109 | 10,039 | 211 | 33.380715 | 0.730821 | 0.074797 | 0 | 0.875138 | 0 | 0.011855 | 0.106476 | 0.011103 | 0 | 0 | 0 | 0 | 0 | 1 | 0.000111 | false | 0.009971 | 0.005096 | 0 | 0.005207 | 0.036783 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4a7df426c3b66ee94e9b621e9cfcebe09fccf8b6 | 23,725 | py | Python | mwt/models/models.py | JinY0ung-Shin/PDNO | 7a16ae04fb2fbdc1fc7be095683d1dffd0d0e863 | ["MIT"] | 2 | 2022-03-16T22:10:02.000Z | 2022-03-28T16:15:14.000Z | mwt/models/models.py | JinY0ung-Shin/PDNO | 7a16ae04fb2fbdc1fc7be095683d1dffd0d0e863 | ["MIT"] | null | null | null | mwt/models/models.py | JinY0ung-Shin/PDNO | 7a16ae04fb2fbdc1fc7be095683d1dffd0d0e863 | ["MIT"] | null | null | null |
import torch
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor
from typing import List, Tuple
import math
from .utils import get_filter
class sparseKernel1d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernel1d,self).__init__()
self.k = k
self.Li = nn.Linear(c*k, 128) # note: not used in forward
self.conv = self.convBlock(c*k, 128)
self.Lo = nn.Linear(128, c*k)
def forward(self, x):
B, N, c, ich = x.shape # (B, N, c, k)
x = x.view(B, N, -1)
x = x.permute(0, 2, 1)
x = self.conv(x)
x = x.permute(0, 2, 1)
x = self.Lo(x)
x = x.view(B, N, c, ich)
return x
def convBlock(self, ich, och):
net = nn.Sequential(
nn.Conv1d(ich, och, 3, 1, 1),
nn.ReLU(inplace=True),
)
return net
def compl_mul1d(x, weights):
# (batch, in_channel, x ), (in_channel, out_channel, x) -> (batch, out_channel, x)
return torch.einsum("bix,iox->box", x, weights)
class sparseKernelFT1d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernelFT1d, self).__init__()
self.modes1 = alpha
self.scale = (1 / (c*k*c*k))
self.weights1 = nn.Parameter(self.scale * torch.rand(c*k, c*k, self.modes1, dtype=torch.cfloat))
self.weights1.requires_grad = True
self.k = k
def forward(self, x):
B, N, c, k = x.shape # (B, N, c, k)
x = x.view(B, N, -1)
x = x.permute(0, 2, 1)
x_fft = torch.fft.rfft(x)
# Multiply relevant Fourier modes
l = min(self.modes1, N//2+1)
out_ft = torch.zeros(B, c*k, N//2 + 1, device=x.device, dtype=torch.cfloat)
out_ft[:, :, :l] = compl_mul1d(x_fft[:, :, :l], self.weights1[:, :, :l])
#Return to physical space
x = torch.fft.irfft(out_ft, n=N)
x = x.permute(0, 2, 1).view(B, N, c, k)
return x
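# Note: sparseKernelFT1d is an FNO-style spectral convolution - only the
# lowest min(alpha, N//2+1) rFFT modes are mixed by learned weights and the
# remaining modes are zeroed before the inverse transform, so the parameter
# count is independent of the grid resolution N.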
class MWT_CZ1d(nn.Module):
def __init__(self,
k = 3, alpha = 5,
L = 0, c = 1,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT_CZ1d, self).__init__()
self.k = k
self.L = L
H0, H1, G0, G1, PHI0, PHI1 = get_filter(base, k)
H0r = H0@PHI0
G0r = G0@PHI0
H1r = H1@PHI1
G1r = G1@PHI1
H0r[np.abs(H0r)<1e-8]=0
H1r[np.abs(H1r)<1e-8]=0
G0r[np.abs(G0r)<1e-8]=0
G1r[np.abs(G1r)<1e-8]=0
self.A = sparseKernelFT1d(k, alpha, c)
self.B = sparseKernelFT1d(k, alpha, c)
self.C = sparseKernelFT1d(k, alpha, c)
self.T0 = nn.Linear(k, k)
self.register_buffer('ec_s', torch.Tensor(
np.concatenate((H0.T, H1.T), axis=0)))
self.register_buffer('ec_d', torch.Tensor(
np.concatenate((G0.T, G1.T), axis=0)))
self.register_buffer('rc_e', torch.Tensor(
np.concatenate((H0r, G0r), axis=0)))
self.register_buffer('rc_o', torch.Tensor(
np.concatenate((H1r, G1r), axis=0)))
def forward(self, x):
B, N, c, ich = x.shape # (B, N, k)
ns = math.floor(np.log2(N))
Ud = torch.jit.annotate(List[Tensor], [])
Us = torch.jit.annotate(List[Tensor], [])
# decompose
for i in range(ns-self.L):
d, x = self.wavelet_transform(x)
Ud += [self.A(d) + self.B(x)]
Us += [self.C(d)]
x = self.T0(x) # coarsest scale transform
# reconstruct
for i in range(ns-1-self.L,-1,-1):
x = x + Us[i]
x = torch.cat((x, Ud[i]), -1)
x = self.evenOdd(x)
return x
def wavelet_transform(self, x):
xa = torch.cat([x[:, ::2, :, :],
x[:, 1::2, :, :],
], -1)
d = torch.matmul(xa, self.ec_d)
s = torch.matmul(xa, self.ec_s)
return d, s
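# One decomposition step halves the resolution: even/odd samples are stacked
# along the last axis (size 2k) and multiplied by the analysis filter pairs
# (G0/G1 -> detail coefficients d, H0/H1 -> smooth coefficients s), the
# matrix form of one level of the discrete multiwavelet transform.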
def evenOdd(self, x):
B, N, c, ich = x.shape # (B, N, c, k)
assert ich == 2*self.k
x_e = torch.matmul(x, self.rc_e)
x_o = torch.matmul(x, self.rc_o)
x = torch.zeros(B, N*2, c, self.k,
device = x.device)
x[..., ::2, :, :] = x_e
x[..., 1::2, :, :] = x_o
return x
class MWT1d(nn.Module):
def __init__(self,
ich = 1, k = 3, alpha = 2, c = 1,
nCZ = 3,
L = 0,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT1d,self).__init__()
self.k = k
self.c = c
self.L = L
self.nCZ = nCZ
self.Lk = nn.Linear(ich, c*k)
self.MWT_CZ = nn.ModuleList(
[MWT_CZ1d(k, alpha, L, c, base,
initializer) for _ in range(nCZ)]
)
self.Lc0 = nn.Linear(c*k, 128)
self.Lc1 = nn.Linear(128, 1)
if initializer is not None:
self.reset_parameters(initializer)
def forward(self, x):
B, N, ich = x.shape # (B, N, d)
ns = math.floor(np.log2(N))
x = self.Lk(x)
x = x.view(B, N, self.c, self.k)
for i in range(self.nCZ):
x = self.MWT_CZ[i](x)
if i < self.nCZ-1:
x = F.relu(x)
x = x.view(B, N, -1) # collapse c and k
x = self.Lc0(x)
x = F.relu(x)
x = self.Lc1(x)
return x.squeeze()
def reset_parameters(self, initializer):
initializer(self.Lc0.weight)
initializer(self.Lc1.weight)
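# Illustrative usage (assumed shapes, not from the original file):
#   model = MWT1d(ich=2, k=3, alpha=8, c=4, nCZ=2)
#   u = model(torch.rand(4, 1024, 2))   # N=1024: the cascade needs a power of two
#   print(u.shape)                      # -> torch.Size([4, 1024])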
class sparseKernel2d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernel2d,self).__init__()
self.k = k
self.conv = self.convBlock(k, c*k**2, alpha)
self.Lo = nn.Linear(alpha*k**2, c*k**2)
def forward(self, x):
B, Nx, Ny, c, ich = x.shape # (B, Nx, Ny, c, k**2)
x = x.view(B, Nx, Ny, -1)
x = x.permute(0, 3, 1, 2)
x = self.conv(x)
x = x.permute(0, 2, 3, 1)
x = self.Lo(x)
x = x.view(B, Nx, Ny, c, ich)
return x
def convBlock(self, k, W, alpha):
och = alpha * k**2
net = nn.Sequential(
nn.Conv2d(W, och, 3, 1, 1),
nn.ReLU(inplace=True),
)
return net
def compl_mul2d(x, weights):
# (batch, in_channel, x,y ), (in_channel, out_channel, x,y) -> (batch, out_channel, x,y)
return torch.einsum("bixy,ioxy->boxy", x, weights)
class sparseKernelFT2d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernelFT2d, self).__init__()
self.modes = alpha
self.weights1 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, dtype=torch.cfloat))
self.weights2 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, dtype=torch.cfloat))
nn.init.xavier_normal_(self.weights1)
nn.init.xavier_normal_(self.weights2)
self.Lo = nn.Linear(c*k**2, c*k**2)
self.k = k
def forward(self, x):
B, Nx, Ny, c, ich = x.shape # (B, N, N, c, k^2)
x = x.view(B, Nx, Ny, -1)
x = x.permute(0, 3, 1, 2)
x_fft = torch.fft.rfft2(x)
# Multiply relevant Fourier modes
l1 = min(self.modes, Nx//2+1)
l2 = min(self.modes, Ny//2+1)
out_ft = torch.zeros(B, c*ich, Nx, Ny//2 + 1, device=x.device, dtype=torch.cfloat)
out_ft[:, :, :l1, :l2] = compl_mul2d(
x_fft[:, :, :l1, :l2], self.weights1[:, :, :l1, :l2])
out_ft[:, :, -l1:, :l2] = compl_mul2d(
x_fft[:, :, -l1:, :l2], self.weights2[:, :, :l1, :l2])
#Return to physical space
x = torch.fft.irfft2(out_ft, s = (Nx, Ny))
x = x.permute(0, 2, 3, 1)
x = F.relu(x)
x = self.Lo(x)
x = x.view(B, Nx, Ny, c, ich)
return x
class MWT_CZ2d(nn.Module):
def __init__(self,
k = 3, alpha = 5,
L = 0, c = 1,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT_CZ2d, self).__init__()
self.k = k
self.L = L
H0, H1, G0, G1, PHI0, PHI1 = get_filter(base, k)
H0r = H0@PHI0
G0r = G0@PHI0
H1r = H1@PHI1
G1r = G1@PHI1
H0r[np.abs(H0r)<1e-8]=0
H1r[np.abs(H1r)<1e-8]=0
G0r[np.abs(G0r)<1e-8]=0
G1r[np.abs(G1r)<1e-8]=0
self.A = sparseKernelFT2d(k, alpha, c)
self.B = sparseKernel2d(k, c, c)
self.C = sparseKernel2d(k, c, c)
self.T0 = nn.Linear(c*k**2, c*k**2)
if initializer is not None:
self.reset_parameters(initializer)
self.register_buffer('ec_s', torch.Tensor(
np.concatenate((np.kron(H0, H0).T,
np.kron(H0, H1).T,
np.kron(H1, H0).T,
np.kron(H1, H1).T,
), axis=0)))
self.register_buffer('ec_d', torch.Tensor(
np.concatenate((np.kron(G0, G0).T,
np.kron(G0, G1).T,
np.kron(G1, G0).T,
np.kron(G1, G1).T,
), axis=0)))
self.register_buffer('rc_ee', torch.Tensor(
np.concatenate((np.kron(H0r, H0r),
np.kron(G0r, G0r),
), axis=0)))
self.register_buffer('rc_eo', torch.Tensor(
np.concatenate((np.kron(H0r, H1r),
np.kron(G0r, G1r),
), axis=0)))
self.register_buffer('rc_oe', torch.Tensor(
np.concatenate((np.kron(H1r, H0r),
np.kron(G1r, G0r),
), axis=0)))
self.register_buffer('rc_oo', torch.Tensor(
np.concatenate((np.kron(H1r, H1r),
np.kron(G1r, G1r),
), axis=0)))
def forward(self, x):
B, Nx, Ny, c, ich = x.shape # (B, Nx, Ny, c, k**2)
ns = math.floor(np.log2(Nx))
Ud = torch.jit.annotate(List[Tensor], [])
Us = torch.jit.annotate(List[Tensor], [])
# decompose
for i in range(ns-self.L):
d, x = self.wavelet_transform(x)
Ud += [self.A(d) + self.B(x)]
Us += [self.C(d)]
x = self.T0(x.view(B, 2**self.L, 2**self.L, -1)).view(
B, 2**self.L, 2**self.L, c, ich) # coarsest scale transform
# reconstruct
for i in range(ns-1-self.L,-1,-1):
x = x + Us[i]
x = torch.cat((x, Ud[i]), -1)
x = self.evenOdd(x)
return x
def wavelet_transform(self, x):
xa = torch.cat([x[:, ::2 , ::2 , :, :],
x[:, ::2 , 1::2, :, :],
x[:, 1::2, ::2 , :, :],
x[:, 1::2, 1::2, :, :]
], -1)
d = torch.matmul(xa, self.ec_d)
s = torch.matmul(xa, self.ec_s)
return d, s
def evenOdd(self, x):
B, Nx, Ny, c, ich = x.shape # (B, Nx, Ny, c, k**2)
assert ich == 2*self.k**2
x_ee = torch.matmul(x, self.rc_ee)
x_eo = torch.matmul(x, self.rc_eo)
x_oe = torch.matmul(x, self.rc_oe)
x_oo = torch.matmul(x, self.rc_oo)
x = torch.zeros(B, Nx*2, Ny*2, c, self.k**2,
device = x.device)
x[:, ::2 , ::2 , :, :] = x_ee
x[:, ::2 , 1::2, :, :] = x_eo
x[:, 1::2, ::2 , :, :] = x_oe
x[:, 1::2, 1::2, :, :] = x_oo
return x
def reset_parameters(self, initializer):
initializer(self.T0.weight)
class MWT2d(nn.Module):
def __init__(self,
ich = 1, k = 3, alpha = 2, c = 1,
nCZ = 3,
L = 0,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT2d,self).__init__()
self.k = k
self.c = c
self.L = L
self.nCZ = nCZ
self.Lk = nn.Linear(ich, c*k**2)
self.MWT_CZ = nn.ModuleList(
[MWT_CZ2d(k, alpha, L, c, base,
initializer) for _ in range(nCZ)]
)
self.Lc0 = nn.Linear(c*k**2, 128)
self.Lc1 = nn.Linear(128, 1)
if initializer is not None:
self.reset_parameters(initializer)
def forward(self, x):
B, Nx, Ny, ich = x.shape # (B, Nx, Ny, d)
ns = math.floor(np.log2(Nx))
x = self.Lk(x)
x = x.view(B, Nx, Ny, self.c, self.k**2)
for i in range(self.nCZ):
x = self.MWT_CZ[i](x)
if i < self.nCZ-1:
x = F.relu(x)
x = x.view(B, Nx, Ny, -1) # collapse c and k**2
x = self.Lc0(x)
x = F.relu(x)
x = self.Lc1(x)
return x.squeeze()
def reset_parameters(self, initializer):
initializer(self.Lc0.weight)
initializer(self.Lc1.weight)
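# Illustrative usage (assumed shapes, not from the original file):
#   model2d = MWT2d(ich=3, k=3, alpha=5, c=4, nCZ=2)
#   u = model2d(torch.rand(2, 64, 64, 3))   # square power-of-two grids
#   print(u.shape)                          # -> torch.Size([2, 64, 64])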
class sparseKernel(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernel,self).__init__()
self.k = k
self.conv = self.convBlock(k, c*k**2, alpha)
self.Lo = nn.Linear(alpha*k**2, c*k**2)
def forward(self, x):
B, Nx, Ny, c, ich = x.shape # (B, Nx, Ny, c, k**2)
x = x.view(B, Nx, Ny, -1)
x = x.permute(0, 3, 1, 2)
x = self.conv(x)
x = x.permute(0, 2, 3, 1)
x = self.Lo(x)
x = x.view(B, Nx, Ny, c, ich)
return x
def convBlock(self, k, W, alpha):
och = alpha * k**2
net = nn.Sequential(
nn.Conv2d(W, och, 3, 1, 1),
nn.ReLU(inplace=True),
)
return net
class sparseKernel3d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernel3d,self).__init__()
self.k = k
self.conv = self.convBlock(alpha*k**2, alpha*k**2)
self.Lo = nn.Linear(alpha*k**2, c*k**2)
def forward(self, x):
B, Nx, Ny, T, c, ich = x.shape # (B, Nx, Ny, T, c, k**2)
x = x.view(B, Nx, Ny, T, -1)
x = x.permute(0, 4, 1, 2, 3)
x = self.conv(x)
x = x.permute(0, 2, 3, 4, 1)
x = self.Lo(x)
x = x.view(B, Nx, Ny, T, c, ich)
return x
def convBlock(self, ich, och):
net = nn.Sequential(
nn.Conv3d(och, och, 3, 1, 1),
nn.ReLU(inplace=True),
)
return net
def compl_mul3d(input, weights):
# (batch, in_channel, x,y,t ), (in_channel, out_channel, x,y,t) -> (batch, out_channel, x,y,t)
return torch.einsum("bixyz,ioxyz->boxyz", input, weights)
class sparseKernelFT3d(nn.Module):
def __init__(self,
k, alpha, c=1,
nl = 1,
initializer = None,
**kwargs):
super(sparseKernelFT3d, self).__init__()
self.modes = alpha
self.weights1 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, self.modes, dtype=torch.cfloat))
self.weights2 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, self.modes, dtype=torch.cfloat))
self.weights3 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, self.modes, dtype=torch.cfloat))
self.weights4 = nn.Parameter(torch.zeros(c*k**2, c*k**2, self.modes, self.modes, self.modes, dtype=torch.cfloat))
nn.init.xavier_normal_(self.weights1)
nn.init.xavier_normal_(self.weights2)
nn.init.xavier_normal_(self.weights3)
nn.init.xavier_normal_(self.weights4)
self.Lo = nn.Linear(c*k**2, c*k**2)
self.k = k
def forward(self, x):
B, Nx, Ny, T, c, ich = x.shape # (B, N, N, T, c, k^2)
x = x.view(B, Nx, Ny, T, -1)
x = x.permute(0, 4, 1, 2, 3)
x_fft = torch.fft.rfftn(x, dim = [-3, -2, -1])
# Multiply relevant Fourier modes
l1 = min(self.modes, Nx//2+1)
l2 = min(self.modes, Ny//2+1)
out_ft = torch.zeros(B, c*ich, Nx, Ny, T//2 +1, device=x.device, dtype=torch.cfloat)
out_ft[:, :, :l1, :l2, :self.modes] = compl_mul3d(
x_fft[:, :, :l1, :l2, :self.modes], self.weights1[:, :, :l1, :l2, :])
out_ft[:, :, -l1:, :l2, :self.modes] = compl_mul3d(
x_fft[:, :, -l1:, :l2, :self.modes], self.weights2[:, :, :l1, :l2, :])
out_ft[:, :, :l1, -l2:, :self.modes] = compl_mul3d(
x_fft[:, :, :l1, -l2:, :self.modes], self.weights3[:, :, :l1, :l2, :])
out_ft[:, :, -l1:, -l2:, :self.modes] = compl_mul3d(
x_fft[:, :, -l1:, -l2:, :self.modes], self.weights4[:, :, :l1, :l2, :])
#Return to physical space
x = torch.fft.irfftn(out_ft, s = (Nx, Ny, T))
x = x.permute(0, 2, 3, 4, 1)
x = F.relu(x)
x = self.Lo(x)
x = x.view(B, Nx, Ny, T, c, ich)
return x
class MWT_CZ3d(nn.Module):
def __init__(self,
k = 3, alpha = 5,
L = 0, c = 1,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT_CZ3d, self).__init__()
self.k = k
self.L = L
H0, H1, G0, G1, PHI0, PHI1 = get_filter(base, k)
H0r = H0@PHI0
G0r = G0@PHI0
H1r = H1@PHI1
G1r = G1@PHI1
H0r[np.abs(H0r)<1e-8]=0
H1r[np.abs(H1r)<1e-8]=0
G0r[np.abs(G0r)<1e-8]=0
G1r[np.abs(G1r)<1e-8]=0
self.A = sparseKernelFT3d(k, alpha, c)
self.B = sparseKernel3d(k, c, c)
self.C = sparseKernel3d(k, c, c)
self.T0 = nn.Linear(c*k**2, c*k**2)
if initializer is not None:
self.reset_parameters(initializer)
self.register_buffer('ec_s', torch.Tensor(
np.concatenate((np.kron(H0, H0).T,
np.kron(H0, H1).T,
np.kron(H1, H0).T,
np.kron(H1, H1).T,
), axis=0)))
self.register_buffer('ec_d', torch.Tensor(
np.concatenate((np.kron(G0, G0).T,
np.kron(G0, G1).T,
np.kron(G1, G0).T,
np.kron(G1, G1).T,
), axis=0)))
self.register_buffer('rc_ee', torch.Tensor(
np.concatenate((np.kron(H0r, H0r),
np.kron(G0r, G0r),
), axis=0)))
self.register_buffer('rc_eo', torch.Tensor(
np.concatenate((np.kron(H0r, H1r),
np.kron(G0r, G1r),
), axis=0)))
self.register_buffer('rc_oe', torch.Tensor(
np.concatenate((np.kron(H1r, H0r),
np.kron(G1r, G0r),
), axis=0)))
self.register_buffer('rc_oo', torch.Tensor(
np.concatenate((np.kron(H1r, H1r),
np.kron(G1r, G1r),
), axis=0)))
def forward(self, x):
B, Nx, Ny, T, c, ich = x.shape # (B, Nx, Ny, T, c, k**2)
ns = math.floor(np.log2(Nx))
Ud = torch.jit.annotate(List[Tensor], [])
Us = torch.jit.annotate(List[Tensor], [])
# decompose
for i in range(ns-self.L):
d, x = self.wavelet_transform(x)
Ud += [self.A(d) + self.B(x)]
Us += [self.C(d)]
x = self.T0(x.view(B, 2**self.L, 2**self.L, T, -1)).view(
B, 2**self.L, 2**self.L, T, c, ich) # coarsest scale transform
# reconstruct
for i in range(ns-1-self.L,-1,-1):
x = x + Us[i]
x = torch.cat((x, Ud[i]), -1)
x = self.evenOdd(x)
return x
def wavelet_transform(self, x):
xa = torch.cat([x[:, ::2 , ::2 , :, :, :],
x[:, ::2 , 1::2, :, :, :],
x[:, 1::2, ::2 , :, :, :],
x[:, 1::2, 1::2, :, :, :]
], -1)
d = torch.matmul(xa, self.ec_d)
s = torch.matmul(xa, self.ec_s)
return d, s
def evenOdd(self, x):
B, Nx, Ny, T, c, ich = x.shape # (B, Nx, Ny, c, k**2)
assert ich == 2*self.k**2
x_ee = torch.matmul(x, self.rc_ee)
x_eo = torch.matmul(x, self.rc_eo)
x_oe = torch.matmul(x, self.rc_oe)
x_oo = torch.matmul(x, self.rc_oo)
x = torch.zeros(B, Nx*2, Ny*2, T, c, self.k**2,
device = x.device)
x[:, ::2 , ::2 , :, :, :] = x_ee
x[:, ::2 , 1::2, :, :, :] = x_eo
x[:, 1::2, ::2 , :, :, :] = x_oe
x[:, 1::2, 1::2, :, :, :] = x_oo
return x
def reset_parameters(self, initializer):
initializer(self.T0.weight)
class MWT3d(nn.Module):
def __init__(self,
ich = 1, k = 3, alpha = 2, c = 1,
nCZ = 3,
L = 0,
base = 'legendre',
initializer = None,
**kwargs):
super(MWT3d,self).__init__()
self.k = k
self.c = c
self.L = L
self.nCZ = nCZ
self.Lk = nn.Linear(ich, c*k**2)
self.MWT_CZ = nn.ModuleList(
[MWT_CZ3d(k, alpha, L, c, base,
initializer) for _ in range(nCZ)]
)
self.Lc0 = nn.Linear(c*k**2, 128)
self.Lc1 = nn.Linear(128, 1)
if initializer is not None:
self.reset_parameters(initializer)
def forward(self, x):
B, Nx, Ny, T, ich = x.shape # (B, Nx, Ny, T, d)
ns = math.floor(np.log2(Nx))
x = self.Lk(x)
x = x.view(B, Nx, Ny, T, self.c, self.k**2)
for i in range(self.nCZ):
x = self.MWT_CZ[i](x)
if i < self.nCZ-1:
x = F.relu(x)
x = x.view(B, Nx, Ny, T, -1) # collapse c and k**2
x = self.Lc0(x)
x = F.relu(x)
x = self.Lc1(x)
return x.squeeze()
def reset_parameters(self, initializer):
initializer(self.Lc0.weight)
initializer(self.Lc1.weight)
| 31.549202 | 129 | 0.443203 | 3,259 | 23,725 | 3.144216 | 0.062289 | 0.011711 | 0.011125 | 0.012979 | 0.886796 | 0.843271 | 0.818386 | 0.810871 | 0.798185 | 0.775154 | 0 | 0.046825 | 0.398693 | 23,725 | 752 | 130 | 31.549202 | 0.671457 | 0.041728 | 0 | 0.741824 | 0 | 0 | 0.007269 | 0 | 0 | 0 | 0 | 0 | 0.005164 | 1 | 0.075732 | false | 0 | 0.013769 | 0.005164 | 0.156627 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
4a897a4d7f811fd6f25ed8dddffadbba25a3b4eb | 29,086 | py | Python | instructors/projects/decoding_fun/examples/eng_dict.py | mgadagin/PythonClass | 70b370362d75720b3fb0e1d6cc8158f9445e9708 | ["MIT"] | 46 | 2017-09-27T20:19:36.000Z | 2020-12-08T10:07:19.000Z | instructors/projects/decoding_fun/examples/eng_dict.py | mgadagin/PythonClass | 70b370362d75720b3fb0e1d6cc8158f9445e9708 | ["MIT"] | 6 | 2018-01-09T08:07:37.000Z | 2020-09-07T12:25:13.000Z | instructors/projects/decoding_fun/examples/eng_dict.py | mgadagin/PythonClass | 70b370362d75720b3fb0e1d6cc8158f9445e9708 | ["MIT"
] | 18 | 2017-10-10T02:06:51.000Z | 2019-12-01T10:18:13.000Z | eng_dc = ['the', 'be', 'and', 'of', 'to', 'a', 'in', 'have', 'you', 'it', 'he', 'for', 'they', 'not', 'that', 'we', 'I', 'on', 'with', 'do', 'this', 'as', 'she', 'at', 'but', 'from', 'by', 'will', 'or', 'say', 'go', 'so', 'all', 'about', 'if', 'one', 'would', 'know', 'there', 'which', 'can', 'get', 'think', 'like', 'more', 'who', 'when', 'what', 'make', 'time', 'see', 'up', 'people', 'some', 'out', 'good', 'other', 'year', 'well', 'because', 'very', 'just', 'no', 'take', 'come', 'could', 'use', 'work', 'then', 'now', 'also', 'than', 'into', 'only', 'want', 'look', 'new', 'give', 'first', 'way', 'thing', 'find', 'any', 'over', 'right', 'after', 'day', 'where', 'most', 'should', 'need', 'much', 'how', 'back', 'mean', 'may', 'such', 'here', 'really', 'even', 'company', 'many', 'child', 'tell', 'last', 'call', 'down', 'before', 'man', 'through', 'show', 'life', 'between', 'lot', 'feel', 'place', 'change', 'long', 'too', 'pause', 'still', 'write', 'problem', 'talk', 'try', 'something', 'unclear', 'same', 'great', 'number', 'leave', 'little', 'both', 'meet', 'help', 'own', 'ask', 'part', 'country', 'put', 'point', 'start', 'school', 'each', 'become', 'interest', 'old', 'off', 'another', 'different', 'high', 'next', 'include', 'late', 'why', 'live', 'end', 'world', 'week', 'must', 'while', 'never', 'study', 'kind', 'report', 'play', 'house', 'group', 'might', 'yes', 'home', 'course', 'let', 'case', 'system', 'again', 'woman', 'hear', 'family', 'book', 'seem', 'around', 'during', 'keep', 'big', 'follow', 'every', 'question', 'under', 'important', 'always', 'friend', 'however', 'set', 'hand', 'provide', 'small', 'turn', 'state', 'begin', 'run', 'since', 'early', 'money', 'few', 'bring', 'market', 'information', 'area', 'move', 'business', 'service', 'government', 'fact', 'issue', 'thank', 'large', 'result', 'order', 'read', 'month', 'increase', 'name', 'love', 'word', 'without', 'open', 'pay', 'offer', 'build', 'hold', 'happen', 'against', 'away', 'job', 'buy', 'though', 'today', 'example', 'believe', 'plan', 'second', 'program', 'student', 'form', 'young', 'lead', 'face', 'close', 'room', 'hope', 'cost', 'head', 'understand', 'hour', 'far', 'spend', 'car', 'actually', 'level', 'city', 'present', 'less', 'idea', 'reason', 'learn', 'until', 'member', 'process', 'person', 'experience', 'night', 'support', 'sure', 'sort', 'quite', 'bad', 'once', 'enough', 'although', 'within', 'age', 'term', 'whether', 'able', 'share', 'line', 'product', 'speak', 'side', 'train', 'soon', 'low', 'price', 'public', 'often', 'rate', 'possible', 'least', 'parent', 'consider', 'effect', 'rather', 'control', 'view', 'story', 'local', 'anything', 'together', 'value', 'hard', 'stand', 'visit', 'watch', 'color', 'party', 'continue', 'bit', 'ever', 'eye', 'base', 'concern', 'letter', 'center', 'lose', 'yet', 'almost', 'development', 'already', 'test', 'probably', 'sale', 'suggest', 'nothing', 'whole', 'care', 'deal', 'language', 'send', 'fall', 'expect', 'return', 'water', 'allow', 'per', 'cause', 'power', 'sit', 'walk', 'mother', 'subject', 'develop', 'stay', 'record', 'mind', 'remember', 'past', 'office', 'force', 'grow', 'town', 'light', 'stop', 'several', 'period', 'class', 'matter', 'food', 'social', 'require', 'political', 'win', 'decide', 'staff', 'figure', 'real', 'future', 'policy', 'answer', 'laugh', 'among', 'remain', 'ago', 'type', 'shop', 'security', 'receive', 'minute', 'note', 'fund', 'top', 'game', 'involve', 'account', 'half', 'history', 'create', 
'break', 'moment', 'individual', 'across', 'either', 'music', 'further', 'yeah', 'reach', 'clear', 'rule', 'computer', 'wait', 'sound', 'team', 'along', 'research', 'appear', 'drive', 'activity', 'black', 'produce', 'free', 'general', 'body', 'please', 'toward', 'sense', 'perhaps', 'everything', 'add', 'law', 'sell', 'easy', 'full', 'film', 'model', 'war', 'forward', 'himself', 'maybe', 'morning', 'design', 'pass', 'condition', 'near', 'door', 'human', 'above', 'available', 'position', 'agree', 'short', 'situation', 'paper', 'cover', 'major', 'customer', 'father', 'choose', 'bear', 'someone', 'describe', 'main', 'date', 'event', 'nice', 'special', 'certain', 'phone', 'join', 'else', 'girl', 'sometimes', 'table', 'community', 'carry', 'decision', 'president', 'role', 'particular', 'cut', 'difference', 'die', 'eat', 'enjoy', 'rise', 'especially', 'detail', 'data', 'charge', 'practice', 'cell', 'improve', 'kid', 'action', 'strong', 'happy', 'health', 'economic', 'difficult', 'regard', 'travel', 'approach', 'amount', 'investment', 'draw', 'white', 'site', 'round', 'behind', 'claim', 'step', 'patient', 'true', 'teacher', 'range', 'percent', 'themselves', 'organization', 'vote', 'front', 'measure', 'trade', 'therefore', 'finally', 'raise', 'wear', 'industry', 'explain', 'relationship', 'quality', 'accord', 'outside', 'wish', 'death', 'project', 'land', 'sign', 'boy', 'news', 'risk', 'total', 'couple', 'national', 'list', 'opportunity', 'act', 'sport', 'road', 'kill', 'serve', 'education', 'picture', 'likely', 'benefit', 'standard', 'stage', 'performance', 'rest', 'certainly', 'culture', 'focus', 'arrive', 'itself', 'employee', 'upon', 'voice', 'due', 'technology', 'field', 'air', 'material', 'current', 'teach', 'financial', 'century', 'society', 'analysis', 'limit', 'evidence', 'reduce', 'listen', 'usually', 'lie', 'foot', 'single', 'common', 'space', 'realize', 'former', 'animal', 'instead', 'similar', 'thus', 'address', 'leader', 'complete', 'arm', 'function', 'factor', 'chance', 'mention', 'contact', 'exist', 'response', 'demand', 'accept', 'save', 'opinion', 'pick', 'wrong', 'apply', 'compare', 'suppose', 'choice', 'structure', 'fight', 'relate', 'feature', 'firm', 'ground', 'effort', 'source', 'pretty', 'check', 'okay', 'campaign', 'street', 'foreign', 'attention', 'personal', 'park', 'particularly', 'knowledge', 'contain', 'official', 'court', 'bank', 'wife', 'article', 'management', 'manager', 'section', 'guy', 'finish', 'fine', 'store', 'attack', 'stock', 'discuss', 'prepare', 'fire', 'piece', 'heart', 'forget', 'police', 'recent', 'behavior', 'represent', 'growth', 'page', 'holiday', 'affect', 'establish', 'wonder', 'poor', 'manage', 'addition', 'bed', 'simply', 'recently', 'yesterday', 'sorry', 'surprise', 'art', 'method', 'fast', 'purchase', 'stuff', 'international', 'drink', 'myself', 'worry', 'whatever', 'private', 'determine', 'summer', 'evening', 'influence', 'exactly', 'average', 'everyone', 'drop', 'miss', 'significant', 'production', 'inside', 'tomorrow', 'region', 'attempt', 'cent', 'shall', 'contract', 'smile', 'skill', 'medium', 'necessary', 'economy', 'various', 'notice', 'key', 'nature', 'population', 'nation', 'hit', 'occur', 'plant', 'election', 'catch', 'director', 'review', 'military', 'statement', 'worker', 'respect', 'paint', 'player', 'capital', 'press', 'movie', 'tax', 'environment', 'son', 'hotel', 'size', 'item', 'image', 'drug', 'simple', 'indeed', 'series', 'window', 'final', 'purpose', 'treatment', 'club', 'file', 'department', 'bus', 'wall', 'direct', 
'character', 'race', 'gain', 'fit', 'enter', 'agreement', 'fail', 'season', 'college', 'seek', 'achieve', 'beautiful', 'station', 'alone', 'below', 'clothes', 'attend', 'argue', 'success', 'lack', 'comment', 'option', 'herself', 'pull', 'church', 'advantage', 'identify', 'link', 'indicate', 'aim', 'income', 'specific', 'floor', 'discussion', 'associate', 'recognize', 'tree', 'unit', 'loss', 'mark', 'challenge', 'depend', 'wide', 'anyway', 'mile', 'solution', 'board', 'clearly', 'anyone', 'machine', 'marry', 'relation', 'theory', 'despite', 'introduce', 'prove', 'ability', 'popular', 'modern', 'doctor', 'release', 'score', 'access', 'television', 'ready', 'strike', 'target', 'card', 'potential', 'organize', 'pattern', 'clock', 'village', 'nearly', 'movement', 'propose', 'guess', 'fear', 'operation', 'trip', 'hair', 'supply', 'quickly', 'application', 'sleep', 'network', 'strategy', 'interview', 'hospital', 'red', 'husband', 'degree', 'star', 'generally', 'restaurant', 'yourself', 'author', 'pressure', 'task', 'express', 'competition', 'serious', 'reference', 'treat', 'conclusion', 'brother', 'natural', 'everybody', 'touch', 'beyond', 'define', 'basis', 'trouble', 'deep', 'dark', 'energy', 'fish', 'sing', 'sample', 'refer', 'adult', 'positive', 'except', 'promise', 'disease', 'dress', 'throw', 'worth', 'clean', 'fill', 'somebody', 'property', 'operate', 'profit', 'goal', 'bar', 'advance', 'quarter', 'central', 'cold', 'object', 'style', 'obviously', 'push', 'tend', 'assume', 'normal', 'suffer', 'exchange', 'middle', 'blue', 'match', 'officer', 'avoid', 'reflect', 'useful', 'fun', 'huge', 'instance', 'seat', 'document', 'oil', 'message', 'net', 'argument', 'successful', 'box', 'resource', 'pound', 'facility', 'throughout', 'bill', 'debate', 'speech', 'separate', 'baby', 'male', 'prefer', 'earn', 'maintain', 'hot', 'career', 'doubt', 'exercise', 'previous', 'daily', 'search', 'suddenly', 'fly', 'basic', 'ring', 'dog', 'asset', 'science', 'perform', 'balance', 'song', 'weekend', 'dead', 'encourage', 'protect', 'damage', 'imagine', 'afternoon', 'estimate', 'photo', 'context', 'credit', 'newspaper', 'daughter', 'version', 'variety', 'extend', 'proposal', 'professional', 'sister', 'dollar', 'memory', 'mine', 'ahead', 'nor', 'request', 'post', 'original', 'female', 'green', 'dance', 'dream', 'observe', 'inform', 'communication', 'discover', 'garden', 'track', 'screen', 'agency', 'possibility', 'examine', 'legal', 'university', 'recommend', 'text', 'direction', 'responsibility', 'conversation', 'magazine', 'easily', 'favorite', 'rock', 'independent', 'additional', 'agent', 'complex', 'appropriate', 'invite', 'traditional', 'cross', 'sea', 'reply', 'famous', 'software', 'weight', 'shape', 'completely', 'trial', 'shoot', 'weather', 'administration', 'fix', 'judge', 'absolutely', 'user', 'element', 'welcome', 'announce', 'glass', 'stick', 'requirement', 'difficulty', 'laughter', 'effective', 'survey', 'invest', 'majority', 'primary', 'generation', 'federal', 'wind', 'replace', 'writer', 'stress', 'committee', 'principle', 'content', 'unless', 'immediately', 'percentage', 'equipment', 'telephone', 'title', 'budget', 'transfer', 'blood', 'scene', 'conduct', 'chair', 'sector', 'expensive', 'executive', 'beat', 'wonderful', 'warm', 'copy', 'none', 'negative', 'annual', 'prevent', 'rich', 'block', 'payment', 'collection', 'advice', 'politics', 'remove', 'ensure', 'medical', 'hang', 'relative', 'directly', 'count', 'transport', 'safe', 'email', 'mix', 'display', 'ride', 'flow', 'highly', 'flat', 'leg', 
'procedure', 'contrast', 'straight', 'correct', 'connection', 'institution', 'admit', 'consumer', 'video', 'reveal', 'radio', 'otherwise', 'nobody', 'aware', 'appeal', 'alternative', 'status', 'award', 'surface', 'heavy', 'handle', 'cry', 'sex', 'introduction', 'deliver', 'tour', 'pair', 'collect', 'extra', 'intend', 'reader', 'cheap', 'decade', 'sentence', 'farm', 'overall', 'moreover', 'expression', 'concert', 'dinner', 'print', 'responsible', 'decline', 'grant', 'physical', 'trust', 'ship', 'speed', 'south', 'truth', 'select', 'category', 'fair', 'attitude', 'peace', 'band', 'lay', 'importance', 'perfect', 'launch', 'wave', 'presence', 'crime', 'horse', 'advertise', 'progress', 'global', 'chief', 'slightly', 'scale', 'double', 'nuclear', 'warn', 'extent', 'library', 'labor', 'respond', 'edge', 'partner', 'experiment', 'pain', 'satisfy', 'taxi', 'slow', 'suit', 'spot', 'regular', 'excite', 'concept', 'guide', 'initial', 'speaker', 'dry', 'secretary', 'shake', 'photograph', 'scheme', 'technique', 'tonight', 'apart', 'rain', 'suggestion', 'cool', 'distance', 'defense', 'north', 'conflict', 'lift', 'river', 'excellent', 'expert', 'favor', 'funny', 'eventually', 'heat', 'mistake', 'dear', 'improvement', 'chapter', 'emerge', 'football', 'demonstrate', 'artist', 'reform', 'adopt', 'corner', 'audience', 'decrease', 'struggle', 'roll', 'island', 'feed', 'camp', 'surround', 'investor', 'fully', 'fee', 'crowd', 'senior', 'arrange', 'expense', 'cook', 'combine', 'cultural', 'map', 'meal', 'weapon', 'contribution', 'shift', 'ball', 'cash', 'entire', 'reality', 'lesson', 'solve', 'kitchen', 'failure', 'circumstance', 'confirm', 'mouth', 'busy', 'contribute', 'tool', 'objective', 'gas', 'lady', 'quick', 'currently', 'spread', 'driver', 'glad', 'beach', 'commercial', 'basically', 'pop', 'variable', 'brain', 'cancer', 'reaction', 'proceed', 'neither', 'crisis', 'hide', 'refuse', 'consequence', 'volume', 'bag', 'trend', 'traffic', 'mass', 'left', 'owner', 'length', 'vary', 'revenue', 'duty', 'repeat', 'mountain', 'unfortunately', 'survive', 'bedroom', 'schedule', 'marriage', 'employ', 'smoke', 'essential', 'ticket', 'critical', 'fan', 'flight', 'relatively', 'equal', 'egg', 'bottom', 'somewhere', 'plus', 'novel', 'coach', 'pleasure', 'promote', 'background', 'union', 'neighbor', 'provision', 'appreciate', 'plane', 'topic', 'code', 'secret', 'enable', 'package', 'manufacture', 'shareholder', 'investigation', 'attract', 'bird', 'path', 'swim', 'afraid', 'bond', 'environmental', 'finger', 'anybody', 'flower', 'colleague', 'insurance', 'consideration', 'settle', 'powerful', 'quiet', 'burn', 'engineer', 'component', 'waste', 'aid', 'earth', 'extremely', 'desire', 'tire', 'apparently', 'breath', 'strength', 'delay', 'connect', 'nurse', 'brief', 'sum', 'soldier', 'hardly', 'lunch', 'strange', 'religious', 'battle', 'whereas', 'construction', 'engage', 'district', 'hate', 'boat', 'stone', 'gather', 'advertisement', 'tourist', 'divide', 'expand', 'delivery', 'historical', 'tradition', 'museum', 'mostly', 'host', 'shoulder', 'broad', 'council', 'commit', 'spring', 'troop', 'jump', 'healthy', 'fresh', 'conclude', 'furthermore', 'finance', 'threat', 'studio', 'safety', 'bomb', 'active', 'winter', 'export', 'acquire', 'blow', 'sun', 'obvious', 'coffee', 'bind', 'visitor', 'generate', 'tape', 'cycle', 'assess', 'editor', 'spirit', 'scientist', 'tear', 'monitor', 'location', 'actual', 'actor', 'twice', 'corporate', 'minister', 'murder', 'comfortable', 'hurt', 'pool', 'assessment', 'wash', 'register', 'regulation', 
'temperature', 'violence', 'route', 'impossible', 'recall', 'army', 'sight', 'error', 'accident', 'usual', 'tough', 'opposite', 'wine', 'relax', 'noise', 'carefully', 'characteristic', 'possibly', 'camera', 'shock', 'convince', 'arrangement', 'oppose', 'climb', 'slowly', 'relevant', 'consist', 'principal', 'lawyer', 'manner', 'gun', 'onto', 'locate', 'domestic', 'pack', 'protein', 'kiss', 'branch', 'voter', 'vehicle', 'civil', 'literature', 'mainly', 'theater', 'stare', 'totally', 'freedom', 'quote', 'industrial', 'significantly', 'guest', 'commitment', 'capacity', 'description', 'skin', 'taste', 'perspective', 'belong', 'normally', 'ought', 'till', 'participant', 'comparison', 'belief', 'dangerous', 'representative', 'signal', 'fashion', 'technical', 'interaction', 'deny', 'friendly', 'previously', 'participate', 'danger', 'gold', 'occasion', 'square', 'leadership', 'gift', 'mobile', 'shoe', 'border', 'label', 'load', 'prison', 'wood', 'ad', 'suitable', 'internal', 'west', 'affair', 'cup', 'outcome', 'discount', 'ignore', 'suspect', 'citizen', 'definition', 'arrest', 'largely', 'desk', 'destroy', 'hall', 'investigate', 'familiar', 'loan', 'remind', 'explore', 'tea', 'index', 'recommendation', 'complain', 'hi', 'poll', 'wed', 'escape', 'switch', 'fairly', 'lovely', 'permit', 'dad', 'import', 'association', 'bright', 'predict', 'division', 'debt', 'shout', 'device', 'wake', 'proper', 'definitely', 'analyze', 'necessarily', 'victim', 'commission', 'amaze', 'employment', 'combination', 'conservative', 'guarantee', 'rank', 'protection', 'mouse', 'nevertheless', 'abuse', 'researcher', 'yield', 'root', 'secure', 'elect', 'chain', 'forest', 'arise', 'confidence', 'frame', 'shot', 'identity', 'afford', 'birth', 'tie', 'brand', 'instrument', 'hole', 'grade', 'threaten', 'hire', 'moral', 'phase', 'latter', 'typical', 'approve', 'strongly', 'factory', 'channel', 'judgment', 'proportion', 'concentration', 'resident', 'empty', 'opposition', 'selection', 'entirely', 'session', 'sexual', 'ice', 'master', 'narrow', 'graduate', 'increasingly', 'insist', 'license', 'bridge', 'concentrate', 'plenty', 'entry', 'reduction', 'farmer', 'respectively', 'notion', 'rent', 'odd', 'appearance', 'musical', 'bore', 'faithfully', 'reasonable', 'rely', 'presidential', 'sequence', 'soft', 'stretch', 'considerable', 'fuel', 'atmosphere', 'bottle', 'unique', 'practical', 'presentation', 'theme', 'hell', 'lock', 'prior', 'secondly', 'peak', 'mechanism', 'explanation', 'mail', 'nowadays', 'native', 'succeed', 'cast', 'wild', 'folk', 'ear', 'intelligence', 'sheet', 'journey', 'tiny', 'terrible', 'online', 'multiple', 'declare', 'engine', 'chairman', 'besides', 'mental', 'specifically', 'relief', 'professor', 'yard', 'celebrate', 'personality', 'construct', 'joint', 'row', 'via', 'capture', 'justice', 'constant', 'youth', 'coast', 'expectation', 'witness', 'blame', 'tone', 'seriously', 'honor', 'ourselves', 'electronic', 'dealer', 'disk', 'northern', 'chemical', 'somehow', 'hill', 'sky', 'fruit', 'fellow', 'guard', 'vision', 'impose', 'reserve', 'chart', 'surely', 'thin', 'minimum', 'variation', 'formal', 'frequently', 'verb', 'acquisition', 'retire', 'recover', 'seed', 'tip', 'instruction', 'mission', 'absence', 'fat', 'east', 'derive', 'ordinary', 'critic', 'helpful', 'gene', 'anywhere', 'lean', 'glance', 'ideal', 'neighborhood', 'smell', 'silence', 'disappear', 'lip', 'cat', 'passenger', 'compete', 'representation', 'rush', 'shut', 'disorder', 'maximum', 'complaint', 'careful', 'column', 'religion', 'legislation', 
'employer', 'widely', 'protest', 'ancient', 'illustrate', 'faith', 'observation', 'command', 'reject', 'imply', 'ban', 'implement', 'approximately', 'qualify', 'somewhat', 'regional', 'assumption', 'temporary', 'attractive', 'plate', 'sad', 'frequency', 'weak', 'slip', 'victory', 'circle', 'academic', 'joke', 'county', 'hat', 'steal', 'dozen', 'kick', 'unable', 'settlement', 'accommodation', 'symptom', 'reporter', 'household', 'tennis', 'merely', 'emotion', 'tall', 'sick', 'calculate', 'scientific', 'accuse', 'criminal', 'estate', 'unlike', 'underlie', 'interpretation', 'corporation', 'criterion', 'appoint', 'enemy', 'beside', 'firstly', 'assistant', 'sweet', 'competitive', 'consistent', 'closely', 'equally', 'advise', 'liberal', 'meanwhile', 'impression', 'existence', 'exhibition', 'accompany', 'output', 'salary', 'attach', 'acknowledge', 'snow', 'properly', 'breast', 'prime', 'ill', 'reward', 'bother', 'primarily', 'gray', 'childhood', 'expose', 'everywhere', 'intention', 'discipline', 'medicine', 'input', 'differ', 'politician', 'wage', 'implication', 'substantial', 'adviser', 'permanent', 'infant', 'rare', 'angry', 'remark', 'alive', 'injury', 'pocket', 'knock', 'producer', 'currency', 'manufacturer', 'king', 'obligation', 'initiative', 'talent', 'breakfast', 'resolution', 'emotional', 'enhance', 'core', 'framework', 'phrase', 'gap', 'southern', 'storm', 'undertake', 'distinguish', 'priority', 'draft', 'democracy', 'pursue', 'urge', 'pilot', 'shirt', 'coat', 'lake', 'habit', 'emphasize', 'neck', 'immediate', 'yellow', 'sir', 'communicate', 'breathe', 'vast', 'origin', 'stable', 'enormous', 'negotiation', 'resolve', 'terrorist', 'bloody', 'retain', 'bone', 'mathematics', 'supplier', 'milk', 'panel', 'passage', 'fundamental', 'pupil', 'publication', 'winner', 'gentleman', 'elsewhere', 'examination', 'soul', 'forth', 'urban', 'contemporary', 'incident', 'integrate', 'swing', 'ratio', 'borrow', 'sufficient', 'motion', 'exam', 'boss', 'governor', 'effectively', 'diet', 'install', 'premise', 'abroad', 'hence', 'metal', 'convention', 'layer', 'typically', 'grateful', 'crash', 'incorporate', 'formation', 'classic', 'aircraft', 'sharp', 'highlight', 'climate', 'disappoint', 'defeat', 'retirement', 'defend', 'truly', 'self', 'clinical', 'nearby', 'distribute', 'reputation', 'creation', 'exclude', 'exhibit', 'specify', 'extreme', 'appointment', 'slide', 'lucky', 'brown', 'journalist', 'occupy', 'soil', 'educational', 'upper', 'correspond', 'sudden', 'tooth', 'freeze', 'gay', 'plastic', 'peer', 'exception', 'bet', 'excuse', 'plain', 'crop', 'rural', 'equivalent', 'dish', 'complicate', 'meat', 'collapse', 'luck', 'enterprise', 'update', 'restrict', 'subsequent', 'perfectly', 'originally', 'invitation', 'thick', 'encounter', 'proud', 'chip', 'analyst', 'valuable', 'bike', 'retail', 'calm', 'unusual', 'criticism', 'personally', 'beauty', 'plot', 'preserve', 'emergency', 'comfort', 'deserve', 'repair', 'severe', 'recognition', 'secondary', 'proof', 'capable', 'outline', 'depression', 'evaluate', 'pension', 'external', 'cope', 'emphasis', 'restriction', 'partly', 'aside', 'massive', 'intellectual', 'minority', 'revolution', 'submit', 'prospect', 'equation', 'unemployment', 'intervention', 'delight', 'mom', 'smart', 'illness', 'anymore', 'numerous', 'abandon', 'confuse', 'wheel', 'trace', 'crucial', 'split', 'efficient', 'dominate', 'database', 'drama', 'isolate', 'nose', 'port', 'spell', 'rapidly', 'dispute', 'landscape', 'inch', 'ultimately', 'phenomenon', 'profile', 'entertainment', 'assistance', 
'boundary', 'gender', 'dramatic', 'educate', 'edition', 'wing', 'achievement', 'similarly', 'specialist', 'formula', 'innovation', 'festival', 'coverage', 'gate', 'pitch', 'unknown', 'slight', 'distinction', 'roof', 'scream', 'convert', 'minor', 'negotiate', 'pollution', 'era', 'episode', 'volunteer', 'infection', 'preparation', 'arrival', 'silver', 'electricity', 'unlikely', 'sink', 'grand', 'web', 'upset', 'transition', 'forecast', 'eliminate', 'agenda', 'prize', 'wire', 'crack', 'deeply', 'cable', 'pub', 'apparent', 'zone', 'fault', 'characterize', 'everyday', 'honest', 'supporter', 'inspire', 'whisper', 'hunt', 'welfare', 'toy', 'cloud', 'perceive', 'constraint', 'ease', 'solid', 'prisoner', 'expansion', 'agricultural', 'virtually', 'album', 'knee', 'bend', 'exposure', 'alter', 'pour', 'digital', 'satisfaction', 'tension', 'wet', 'perception', 'dimension', 'beer', 'tight', 'restore', 'sweep', 'interpret', 'anger', 'crew', 'assist', 'essay', 'assure', 'deposit', 'string', 'shower', 'elderly', 'extensive', 'truck', 'uniform', 'mood', 'detect', 'shadow', 'beneath', 'territory', 'mode', 'trail', 'sensitive', 'nervous', 'sail', 'parallel', 'hero', 'competitor', 'initially', 'transform', 'stream', 'breed', 'vital', 'attribute', 'awful', 'devote', 'stem', 'height', 'apologize', 'owe', 'alright', 'genetic', 'persuade', 'vice', 'recruit', 'steady', 'heavily', 'entrance', 'furniture', 'strain', 'random', 'justify', 'rarely', 'measurement', 'meter', 'pace', 'western', 'constitute', 'spare', 'designer', 'mature', 'evil', 'guilty', 'curve', 'jacket', 'false', 'demonstration', 'wound', 'frighten', 'muscle', 'scare', 'grass', 'substance', 'pink', 'symbol', 'foundation', 'tank', 'cite', 'extension', 'disaster', 'sigh', 'routine', 'cake', 'efficiency', 'membership', 'smooth', 'portion', 'mirror', 'tune', 'withdraw', 'resort', 'resistance', 'giant', 'bid', 'boot', 'naturally', 'summary', 'radical', 'van', 'mutual', 'entitle', 'fascinate', 'god', 'singer', 'broadcast', 'apartment', 'platform', 'whenever', 'conventional', 'independence', 'loose', 'reverse', 'illustration', 'loud', 'quantity', 'poem', 'damn', 'pose', 'depth', 'significance', 'planet', 'iron', 'gradually', 'approval', 'evaluation', 'wealth', 'visual', 'consult', 'sponsor', 'badly', 'trap', 'stupid', 'adjust', 'log', 'crazy', 'dirty', 'hesitate', 'gaze', 'creative', 'button', 'extraordinary', 'establishment', 'constantly', 'alcohol', 'throat', 'probability', 'vegetable', 'remarkable', 'dependent', 'steel', 'strip', 'sustain', 'ally', 'ethnic', 'pleasant', 'exceed', 'sugar', 'historian', 'brilliant', 'involvement', 'edit', 'philosophy', 'hypothesis', 'bread', 'drag', 'inner', 'statistic', 'liability', 'anticipate', 'league', 'seal', 'flood', 'grab', 'compensation', 'compound', 'segment', 'occasionally', 'spin', 'desert', 'operator', 'tower', 'newly', 'paragraph', 'advocate', 'bath', 'blind', 'confident', 'overcome', 'regularly', 'briefly', 'pure', 'counsel', 'disturb', 'silent', 'burden', 'behave', 'tap', 'valley', 'alarm', 'fantastic', 'preference', 'discovery', 'dare', 'skirt', 'eastern', 'cigarette', 'poverty', 'registration', 'offense', 'cousin', 'criticize', 'bowl', 'clause', 'impress', 'jury', 'venture', 'virus', 'anxiety', 'wrap', 'illegal', 'harm', 'overseas', 'survival', 'teenager', 'specialize', 'moderate', 'limitation', 'modify', 'accurate', 'chest', 'angle', 'comprehensive', 'rival', 'universal', 'adequate', 'tube', 'expenditure', 'tourism', 'mount', 'recovery', 'margin', 'mate', 'admire', 'gesture', 'stair', 'charm', 'musician', 
'rapid', 'slave', 'amendment', 'format', 'incentive', 'consultant', 'deficit', 'mortgage', 'abstract', 'dig', 'literary', 'experimental', 'architecture', 'possess', 'opponent', 'keen', 'evolution', 'versus', 'burst', 'lend', 'custom', 'translate', 'cough', 'distinct', 'rough', 'surgery', 'buyer', 'pen', 'quietly', 'laboratory', 'capability', 'province', 'twin', 'chicken', 'mess', 'adapt', 'scholar', 'mad', 'precisely', 'therapy', 'frequent', 'wealthy', 'journal', 'composition', 'tissue', 'stroke', 'flash', 'champion', 'sand', 'promotion', 'charity', 'bury', 'tendency', 'barrier', 'cream', 'rid', 'brush', 'dialog', 'publisher', 'consequently', 'democratic', 'hurry', 'exact', 'abortion', 'govern', 'creature', 'whilst', 'privilege', 'dismiss', 'cap', 'participation', 'visible', 'twist', 'narrative', 'classical', 'assign', 'regret', 'motor', 'impressive', 'prompt', 'ruin', 'density', 'resist', 'rescue', 'coal', 'implementation', 'lecture', 'awareness', 'maintenance', 'inflation', 'greatly', 'successfully', 'psychological', 'institutional', 'dust', 'cancel', 'functional', 'scope', 'species', 'float', 'absolute', 'passion', 'airline', 'motivate', 'module', 'fold', 'theoretical', 'react', 'wooden', 'poet', 'counter', 'insight', 'partnership', 'stain', 'sake', 'automatically', 'penalty', 'rail', 'salt', 'contest', 'violent', 'bin', 'aggressive', 'pale', 'opera', 'undergo', 'embrace', 'pile', 'divorce', 'march', 'acceptable', 'literally', 'permission', 'allege', 'grammar', 'regulate', 'cluster', 'compromise', 'historic', 'diversity', 'immigrant', 'gallery', 'dedicate', 'pretend', 'tackle', 'castle', 'golf', 'celebration', 'embarrass', 'personnel', 'boost', 'roughly', 'pig', 'extract', 'injure', 'fulfill', 'mixture', 'announcement', 'biological', 'praise', 'disagree', 'electric', 'excess', 'depress', 'compose', 'fancy', 'continuous', 'complexity', 'friendship', 'stability', 'accomplish', 'comprise', 'holder', 'inquiry', 'weakness', 'tail', 'noun', 'civilian', 'weigh', 'racial', 'tale', 'evolve', 'poetry', 'fortune', 'potentially', 'mere', 'gently', 'server', 'sanction', 'guitar', 'profession', 'pump', 'chamber', 'veteran', 'shine', 'championship', 'joy', 'stake', 'gear', 'remote', 'entertain', 'reliable', 'strengthen', 'orange', 'rat', 'cheek', 'jail', 'forever', 'imagination', 'bias', 'possession', 'chat', 'servant', 'dramatically', 'carbon', 'curious', 'structural', 'neglect', 'compute', 'rear', 'ski', 'pot', 'revise', 'grin', 'snap', 'stimulate', 'adjustment', 'moon', 'printer', 'boom', 'situate', 'scan', 'cheese', 'pride', 'shell', 'grandmother', 'resign', 'supplement', 'bunch', 'clothing', 'ceremony', 'barely', 'firmly', 'pipe', 'maker', 'hopefully', 'trigger', 'stomach', 'destruction', 'pregnant', 'craft', 'intense', 'shelf', 'logic', 'indication', 'subsequently', 'happiness', 'presumably', 'magic', 'interior', 'menu', 'mystery', 'pro', 'greet', 'humor', 'concrete', 'flag', 'chocolate', 'shelter', 'guideline', 'cow', 'ownership', 'bless', 'summarize', 'knife', 'trick', 'wise', 'motivation', 'silly', 'pray', 'attachment', 'strict', 'cooperation', 'organic', 'reckon', 'uncle', 'surprisingly', 'regardless', 'coin', 'attraction', 'athlete', 'darkness', 'bite', 'harbor', 'stir', 'filter', 'romantic', 'determination', 'tender', 'tongue', 'transportation', 'reasonably', 'vessel', 'piano', 'envelope', 'slope', 'golden', 'catalog', 'belt', 'attendance', 'storage', 'pregnancy', 'invent', 'controversial', 'horrible', 'potato', 'ocean', 'lover', 'uncertainty', 'fiction', 'hint', 'nowhere', 'liquid', 
'stranger', 'anxious', 'fool', 'leap', 'adventure', 'carpet', 'shade', 'portrait', 'hook', 'reflection', 'qualification', 'nerve', 'leather', 'exhaust', 'fragment', 'wander', 'distant', 'unite', 'bell', 'grain', 'monthly', 'altogether', 'differently', 'universe', 'weekly', 'empire', 'royal', 'fence', 'luxury', 'comedy', 'confusion', 'curtain', 'consume', 'stamp', 'flexible', 'innocent', 'tent', 'shore', 'voluntary', 'swear', 'genuine', 'panic', 'sheep', 'mayor', 'gentle', 'precise', 'raw', 'wherever', 'refugee', 'listener', 'weird', 'substitute', 'rice', 'aunt', 'excitement', 'fade', 'wipe', 'chase', 'slice', 'alongside', 'suspend', 'tournament', 'autumn', 'ugly', 'hello', 'fortunate', 'insure', 'lazy', 'ashamed', 'hunger', 'found', 'thirst'] | 29,086 | 29,086 | 0.614694 | 2,803 | 29,086 | 6.378166 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096335 | 29,086 | 1 | 29,086 | 29,086 | 0.680186 | 0 | 0 | 0 | 0 | 0 | 0.614467 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 8 |
438bd9f247d3277a77b50ccc7546a45fa10b9b19 | 1,205 | py | Python | opt/resource/payloads.py | cosee-concourse/eb_deployer-resource | 64d1d849e25ae602cf654d579913b2655cf53766 | [
"MIT"
] | null | null | null | opt/resource/payloads.py | cosee-concourse/eb_deployer-resource | 64d1d849e25ae602cf654d579913b2655cf53766 | [
"MIT"
] | null | null | null | opt/resource/payloads.py | cosee-concourse/eb_deployer-resource | 64d1d849e25ae602cf654d579913b2655cf53766 | [
"MIT"
] | null | null | null | check_payload = ('{"source":{'
'"access_key_id":"apiKey123",'
'"secret_access_key":"secretKey321"'
'},'
'"version":{"env":"dev"}}')
in_payload = ('{"source":{'
'"access_key_id":"apiKey123",'
'"secret_access_key":"secretKey321"'
'},'
'"version":{"env":"dev"}}')
out_deploy_payload = ('{"params":{'
'"env":"dev",'
'"deploy": true,'
'"artifact_file": "artifact/package.zip",'
'"config_file": "source/ci'
'"},'
'"source":{'
'"access_key_id":"apiKey123",'
'"secret_access_key":"secretKey321'
'"},'
'"version":{"env":"dev"}}')
out_remove_payload = ('{"params":{'
'"env":"dev",'
'"remove": true,'
'"artifact_file": "artifact/package.zip",'
'"config_file": "source/ci'
'"},'
'"source":{'
'"access_key_id":"apiKey123",'
'"secret_access_key":"secretKey321'
'"},'
'"version":{"env":"dev"}}')
| 36.515152 | 57 | 0.40083 | 86 | 1,205 | 5.313953 | 0.27907 | 0.157549 | 0.131291 | 0.148797 | 0.849015 | 0.849015 | 0.849015 | 0.849015 | 0.849015 | 0.849015 | 0 | 0.032653 | 0.390041 | 1,205 | 32 | 58 | 37.65625 | 0.589116 | 0 | 0 | 0.8125 | 0 | 0 | 0.502905 | 0.321992 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
43c183cba262f482064e537d6e19fb6b5063a6bc | 41,564 | py | Python | biosppy/signals/emg.py | megrao/BioSPPy | 52340610f850f382082136cd645496e22fbdbae5 | [
"BSD-3-Clause"
] | 491 | 2015-07-29T17:31:22.000Z | 2022-03-31T22:44:13.000Z | biosppy/signals/emg.py | megrao/BioSPPy | 52340610f850f382082136cd645496e22fbdbae5 | [
"BSD-3-Clause"
] | 78 | 2015-12-28T16:45:24.000Z | 2022-03-19T10:05:08.000Z | biosppy/signals/emg.py | megrao/BioSPPy | 52340610f850f382082136cd645496e22fbdbae5 | [
"BSD-3-Clause"
] | 242 | 2015-10-30T09:52:14.000Z | 2022-03-24T11:45:07.000Z | # -*- coding: utf-8 -*-
"""
biosppy.signals.emg
-------------------
This module provides methods to process Electromyographic (EMG) signals.
:copyright: (c) 2015-2018 by Instituto de Telecomunicacoes
:license: BSD 3-clause, see LICENSE for more details.
"""
# Imports
# compat
from __future__ import absolute_import, division, print_function
# 3rd party
import numpy as np
# local
from . import tools as st
from .. import plotting, utils
def emg(signal=None, sampling_rate=1000., path=None, show=True):
"""Process a raw EMG signal and extract relevant signal features using
default parameters.
Parameters
----------
signal : array
Raw EMG signal.
sampling_rate : int, float, optional
Sampling frequency (Hz).
path : str, optional
If provided, the plot will be saved to the specified file.
show : bool, optional
If True, show a summary plot.
Returns
-------
ts : array
Signal time axis reference (seconds).
filtered : array
Filtered EMG signal.
onsets : array
Indices of EMG pulse onsets.
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
# ensure numpy
signal = np.array(signal)
sampling_rate = float(sampling_rate)
# filter signal
filtered, _, _ = st.filter_signal(signal=signal,
ftype='butter',
band='highpass',
order=4,
frequency=100,
sampling_rate=sampling_rate)
# find onsets
onsets, = find_onsets(signal=filtered, sampling_rate=sampling_rate)
# get time vectors
length = len(signal)
T = (length - 1) / sampling_rate
ts = np.linspace(0, T, length, endpoint=True)
# plot
if show:
plotting.plot_emg(ts=ts,
sampling_rate=sampling_rate,
raw=signal,
filtered=filtered,
processed=None,
onsets=onsets,
path=path,
show=True)
# output
args = (ts, filtered, onsets)
names = ('ts', 'filtered', 'onsets')
return utils.ReturnTuple(args, names)
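# Example usage (a sketch; `raw_signal` is a hypothetical 1 kHz EMG recording,
# not defined in this module):
#
#   out = emg(signal=raw_signal, sampling_rate=1000., show=False)
#   ts, filtered, onsets = out['ts'], out['filtered'], out['onsets']
#   onset_times = ts[onsets]  # onset instants in seconds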
def find_onsets(signal=None, sampling_rate=1000., size=0.05, threshold=None):
"""Determine onsets of EMG pulses.
Skips corrupted signal parts.
Parameters
----------
signal : array
Input filtered EMG signal.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : float, optional
Detection window size (seconds).
threshold : float, optional
Detection threshold.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
# full-wave rectification
fwlo = np.abs(signal)
# smooth
size = int(sampling_rate * size)
mvgav, _ = st.smoother(signal=fwlo,
kernel='boxzen',
size=size,
mirror=True)
# threshold
if threshold is None:
aux = np.abs(mvgav)
threshold = 1.2 * np.mean(aux) + 2.0 * np.std(aux, ddof=1)
# find onsets
length = len(signal)
start = np.nonzero(mvgav > threshold)[0]
stop = np.nonzero(mvgav <= threshold)[0]
onsets = np.union1d(np.intersect1d(start - 1, stop),
np.intersect1d(start + 1, stop))
if np.any(onsets):
if onsets[-1] >= length:
onsets[-1] = length - 1
return utils.ReturnTuple((onsets,), ('onsets',))
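# How the crossing detection above works (illustrative note): `start` holds
# indices where the smoothed signal exceeds the threshold, `stop` indices where
# it does not. An index present both in `stop` and in `start - 1` sits just
# before a rising crossing; one present both in `stop` and in `start + 1` sits
# just after a falling crossing. For mvgav = [0, 5, 5, 0] and threshold 1:
#   start = [1, 2], stop = [0, 3]
#   np.intersect1d(start - 1, stop) -> [0]   (rising edge)
#   np.intersect1d(start + 1, stop) -> [3]   (falling edge)
#   np.union1d of both              -> [0, 3]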
def hodges_bui_onset_detector(signal=None, rest=None, sampling_rate=1000.,
size=None, threshold=None):
"""Determine onsets of EMG pulses.
Follows the approach by Hodges and Bui [HoBu96]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'mean': mean value, 'std_dev': standard deviation}.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : int
Detection window size (samples).
threshold : int, float
Detection threshold.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [HoBu96] Hodges PW, Bui BH, "A comparison of computer-based methods for
the determination of onset of muscle contraction using
electromyography", Electroencephalography and Clinical Neurophysiology
- Electromyography and Motor Control, vol. 101:6, pp. 511-519, 1996
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if size is None:
raise TypeError("Please specify the detection window size.")
if threshold is None:
raise TypeError("Please specify the detection threshold.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
mean_rest = statistics['mean']
std_dev_rest = statistics['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
mean_rest = rest['mean']
std_dev_rest = rest['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# full-wave rectification
fwlo = np.abs(signal_zero_mean)
# moving average
mvgav = np.convolve(fwlo, np.ones((size,))/size, mode='valid')
# calculate the test function
tf = (1 / std_dev_rest) * (mvgav - mean_rest)
# find onsets
length = len(signal)
start = np.nonzero(tf >= threshold)[0]
stop = np.nonzero(tf < threshold)[0]
onsets = np.union1d(np.intersect1d(start - 1, stop),
np.intersect1d(start + 1, stop))
# adjust indices because of moving average
onsets += int(size / 2)
if np.any(onsets):
if onsets[-1] >= length:
onsets[-1] = length - 1
return utils.ReturnTuple((onsets, tf), ('onsets', 'processed'))
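# Example usage (a sketch; `filtered` is an assumed pre-filtered EMG array, and
# the parameter values are illustrative, not recommendations). The same call
# pattern applies to the other rest-based detectors below.
#
#   # 1) raw samples from a rest period
#   out = hodges_bui_onset_detector(signal=filtered, rest=filtered[:2000],
#                                   size=50, threshold=3)
#   # 2) [start, end] indices of a rest segment within `signal`
#   out = hodges_bui_onset_detector(signal=filtered, rest=[0, 2000],
#                                   size=50, threshold=3)
#   # 3) precomputed rest statistics
#   out = hodges_bui_onset_detector(signal=filtered,
#                                   rest={'mean': 0.0, 'std_dev': 0.01},
#                                   size=50, threshold=3)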
def bonato_onset_detector(signal=None, rest=None, sampling_rate=1000.,
threshold=None, active_state_duration=None,
samples_above_fail=None, fail_size=None):
"""Determine onsets of EMG pulses.
Follows the approach by Bonato et al. [Bo98]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'var': variance} (this detector only uses the rest variance).
sampling_rate : int, float, optional
Sampling frequency (Hz).
threshold : int, float
Detection threshold.
active_state_duration: int
Minimum duration of the active state.
samples_above_fail : int
Number of samples above the threshold level in a group of successive
samples.
fail_size : int
Number of successive samples.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Bo98] Bonato P, D’Alessio T, Knaflitz M, "A statistical method for the
measurement of muscle activation intervals from surface myoelectric
signal during gait", IEEE Transactions on Biomedical Engineering,
vol. 45:3, pp. 287–299, 1998
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if threshold is None:
raise TypeError("Please specify the detection threshold.")
if active_state_duration is None:
raise TypeError("Please specify the mininum duration of the "
"active state.")
if samples_above_fail is None:
raise TypeError("Please specify the number of samples above the "
"threshold level in a group of successive samples.")
if fail_size is None:
raise TypeError("Please specify the number of successive samples.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
var_rest = statistics['var']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
var_rest = rest['var']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
tf_list = []
onset_time_list = []
offset_time_list = []
alarm_time = 0
state_duration = 0
j = 0
n = 0
onset = False
alarm = False
for k in range(1, len(signal_zero_mean), 2): # odd values only
# calculate the test function
tf = (1 / var_rest) * (signal_zero_mean[k-1]**2 + signal_zero_mean[k]**2)
tf_list.append(tf)
if onset is True:
if alarm is False:
if tf < threshold:
alarm_time = k // 2
alarm = True
else: # now we have to check the remaining rule to be met - duration of the inactive state
if tf < threshold:
state_duration += 1
if j > 0: # there were one (or more) samples above the threshold level, but now one is below it
# the test function may go above the threshold , but each time not longer than j samples
n += 1
if n == samples_above_fail:
n = 0
j = 0
if state_duration == active_state_duration:
offset_time_list.append(alarm_time)
onset = False
alarm = False
n = 0
j = 0
state_duration = 0
else: # sample rises above the threshold level
j += 1
if j > fail_size:
# the inactive state is above the threshold for longer than the predefined number of samples
alarm = False
n = 0
j = 0
state_duration = 0
else: # we only look for another onset if a previous offset was detected
if alarm is False: # if the alarm time has not yet been identified
if tf >= threshold: # alarm time
alarm_time = k // 2
alarm = True
else: # now we have to check the remaining rule to be met - duration of the active state
if tf >= threshold:
state_duration += 1
if j > 0: # there were one (or more) samples below the threshold level, but now one is above it.
# a total of n samples must be above it
n += 1
if n == samples_above_fail:
n = 0
j = 0
if state_duration == active_state_duration:
onset_time_list.append(alarm_time)
onset = True
alarm = False
n = 0
j = 0
state_duration = 0
else: # sample falls below the threshold level
j += 1
if j > fail_size:
# the active state has fallen below the threshold for longer than the predefined number of samples
alarm = False
n = 0
j = 0
state_duration = 0
onsets = np.union1d(onset_time_list,
offset_time_list)
# adjust indices because of odd numbers
onsets *= 2
return utils.ReturnTuple((onsets, tf_list), ('onsets', 'processed'))
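# Index bookkeeping above (illustrative note): the loop visits odd samples
# only, so alarm times are stored as k // 2 (pair counts); multiplying the
# collected times by 2 maps them back to approximate indices in the original
# signal. E.g. an alarm raised at k = 401 is stored as 200 and reported as
# onset index 400.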
def lidierth_onset_detector(signal=None, rest=None, sampling_rate=1000.,
size=None, threshold=None,
active_state_duration=None, fail_size=None):
"""Determine onsets of EMG pulses.
Follows the approach by Lidierth. [Li86]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'mean': mean value, 'std_dev': standard deviation}.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : int
Detection window size (samples).
threshold : int, float
Detection threshold.
active_state_duration: int
Minimum duration of the active state.
fail_size : int
Number of successive samples.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Li86] Lidierth M, "A computer based method for automated measurement
of the periods of muscular activity from an EMG and its application to
locomotor EMGs", ElectroencephClin Neurophysiol, vol. 64:4,
pp. 378–380, 1986
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if size is None:
raise TypeError("Please specify the detection window size.")
if threshold is None:
raise TypeError("Please specify the detection threshold.")
if active_state_duration is None:
raise TypeError("Please specify the mininum duration of the "
"active state.")
if fail_size is None:
raise TypeError("Please specify the number of successive samples.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
mean_rest = statistics['mean']
std_dev_rest = statistics['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
mean_rest = rest['mean']
std_dev_rest = rest['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# full-wave rectification
fwlo = np.abs(signal_zero_mean)
# moving average
mvgav = np.convolve(fwlo, np.ones((size,)) / size, mode='valid')
# calculate the test function
tf = (1 / std_dev_rest) * (mvgav - mean_rest)
onset_time_list = []
offset_time_list = []
alarm_time = 0
state_duration = 0
j = 0
onset = False
alarm = False
for k in range(0, len(tf)):
if onset is True:
# an onset was previously detected and we are looking for the offset time applying the same criteria
if alarm is False: # if the alarm time has not yet been identified
if tf[k] < threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the inactive state
if tf[k] < threshold:
state_duration += 1
if j > 0: # there were one (or more) samples above the threshold level, but now one is below it
# the test function may go above the threshold , but each time not longer than j samples
j = 0
if state_duration == active_state_duration:
offset_time_list.append(alarm_time)
onset = False
alarm = False
j = 0
state_duration = 0
else: # sample rises above the threshold level
j += 1
if j > fail_size:
# the inactive state is above the threshold for longer than the predefined number of samples
alarm = False
j = 0
state_duration = 0
else: # we only look for another onset if a previous offset was detected
if alarm is False: # if the alarm time has not yet been identified
if tf[k] >= threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the active state
if tf[k] >= threshold:
state_duration += 1
if j > 0: # there were one (or more) samples below the threshold level, but now one is above it
# the test function may repeatedly fall below the threshold, but each time not longer than j samples
j = 0
if state_duration == active_state_duration:
onset_time_list.append(alarm_time)
onset = True
alarm = False
j = 0
state_duration = 0
else: # sample falls below the threshold level
j += 1
if j > fail_size:
# the active state has fallen below the threshold for longer than the predefined number of samples
alarm = False
j = 0
state_duration = 0
onsets = np.union1d(onset_time_list,
offset_time_list)
# adjust indices because of moving average
onsets += int(size / 2)
return utils.ReturnTuple((onsets, tf), ('onsets', 'processed'))
def abbink_onset_detector(signal=None, rest=None, sampling_rate=1000.,
size=None, alarm_size=None, threshold=None,
transition_threshold=None):
"""Determine onsets of EMG pulses.
Follows the approach by Abbink et al.. [Abb98]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'mean': mean value, 'std_dev': standard deviation}.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : int
Detection window size (samples).
alarm_size : int
Number of amplitudes searched in the calculation of the transition
index.
threshold : int, float
Detection threshold.
transition_threshold: int, float
Threshold used in the calculation of the transition index.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Abb98] Abbink JH, van der Bilt A, van der Glas HW, "Detection of onset
and termination of muscle activity in surface electromyograms",
Journal of Oral Rehabilitation, vol. 25, pp. 365–369, 1998
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if size is None:
raise TypeError("Please specify the detection window size.")
if alarm_size is None:
raise TypeError("Please specify the number of amplitudes searched in "
"the calculation of the transition index.")
if threshold is None:
raise TypeError("Please specify the detection threshold.")
if transition_threshold is None:
raise TypeError("Please specify the second threshold.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
mean_rest = statistics['mean']
std_dev_rest = statistics['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
mean_rest = rest['mean']
std_dev_rest = rest['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# full-wave rectification
fwlo = np.abs(signal_zero_mean)
# moving average
mvgav = np.convolve(fwlo, np.ones((size,)) / size, mode='valid')
# calculate the test function
tf = (1 / std_dev_rest) * (mvgav - mean_rest)
# additional filter
filtered_tf, _, _ = st.filter_signal(signal=tf,
ftype='butter',
band='lowpass',
order=10,
frequency=30,
sampling_rate=sampling_rate)
# convert from numpy array to list to use list comprehensions
filtered_tf = filtered_tf.tolist()
onset_time_list = []
offset_time_list = []
alarm_time = 0
onset = False
alarm = False
for k in range(0, len(tf)):
if onset is True:
# an onset was previously detected and we are looking for the offset time, applying the same criteria
if alarm is False:
if filtered_tf[k] < threshold:
# the first index of the sliding window is used as an estimate for the onset time (simple post-processor)
alarm_time = k
alarm = True
else:
# once alarm_size samples after the alarm time are available, compute the transition indices
if alarm_time > alarm_size and k == (alarm_time + alarm_size + 1):
transition_indices = []
for j in range(alarm_size, alarm_time):
low_list = [filtered_tf[j-alarm_size+a] for a in range(1, alarm_size+1)]
low = sum(i < transition_threshold for i in low_list)
high_list = [filtered_tf[j+b] for b in range(1, alarm_size+1)]
high = sum(i > transition_threshold for i in high_list)
transition_indices.append(low + high)
offset_time_list = np.where(transition_indices == np.amin(transition_indices))[0].tolist()
onset = False
alarm = False
else: # we only look for another onset if a previous offset was detected
if alarm is False:
if filtered_tf[k] >= threshold:
# the first index of the sliding window is used as an estimate for the onset time (simple post-processor)
alarm_time = k
alarm = True
else:
# once alarm_size samples after the alarm time are available, compute the transition indices
if alarm_time > alarm_size and k == (alarm_time + alarm_size + 1):
transition_indices = []
for j in range(alarm_size, alarm_time):
low_list = [filtered_tf[j-alarm_size+a] for a in range(1, alarm_size+1)]
low = sum(i < transition_threshold for i in low_list)
high_list = [filtered_tf[j+b] for b in range(1, alarm_size+1)]
high = sum(i > transition_threshold for i in high_list)
transition_indices.append(low + high)
onset_time_list = np.where(transition_indices == np.amax(transition_indices))[0].tolist()
onset = True
alarm = False
onsets = np.union1d(onset_time_list,
offset_time_list)
# adjust indices because of moving average
onsets += int(size / 2)
return utils.ReturnTuple((onsets, filtered_tf), ('onsets', 'processed'))
def solnik_onset_detector(signal=None, rest=None, sampling_rate=1000.,
threshold=None, active_state_duration=None):
"""Determine onsets of EMG pulses.
Follows the approach by Solnik et al. [Sol10]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'mean': mean value, 'std_dev': standard deviation}.
sampling_rate : int, float, optional
Sampling frequency (Hz).
threshold : int, float
Scale factor for calculating the detection threshold.
active_state_duration: int
Minimum duration of the active state.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Sol10] Solnik S, Rider P, Steinweg K, DeVita P, Hortobágyi T,
"Teager-Kaiser energy operator signal conditioning improves EMG onset
detection", European Journal of Applied Physiology, vol 110:3,
pp. 489-498, 2010
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if threshold is None:
raise TypeError("Please specify the scale factor for calculating the "
"detection threshold.")
if active_state_duration is None:
raise TypeError("Please specify the mininum duration of the "
"active state.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
mean_rest = statistics['mean']
std_dev_rest = statistics['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
mean_rest = rest['mean']
std_dev_rest = rest['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# calculate threshold
threshold = mean_rest + threshold * std_dev_rest
tf_list = []
onset_time_list = []
offset_time_list = []
alarm_time = 0
state_duration = 0
onset = False
alarm = False
for k in range(1, len(signal_zero_mean)-1):
# calculate the test function
# Teager-Kaiser energy operator
tf = signal_zero_mean[k]**2 - signal_zero_mean[k+1] * signal_zero_mean[k-1]
# full-wave rectification
tf = np.abs(tf)
tf_list.append(tf)
if onset is True:
# an onset was previously detected and we are looking for the offset time, applying the same criteria
if alarm is False: # if the alarm time has not yet been identified
if tf < threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the inactive state
if tf < threshold:
state_duration += 1
if state_duration == active_state_duration:
offset_time_list.append(alarm_time)
onset = False
alarm = False
state_duration = 0
else: # we only look for another onset if a previous offset was detected
if alarm is False: # if the alarm time has not yet been identified
if tf >= threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the active state
if tf >= threshold:
state_duration += 1
if state_duration == active_state_duration:
onset_time_list.append(alarm_time)
onset = True
alarm = False
state_duration = 0
onsets = np.union1d(onset_time_list,
offset_time_list)
return utils.ReturnTuple((onsets, tf_list), ('onsets', 'processed'))
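# The per-sample test function above is the Teager-Kaiser energy operator,
# psi[k] = x[k]**2 - x[k+1] * x[k-1]. A vectorized equivalent (a sketch, not
# used by the detector itself) for the zero-mean signal would be:
#
#   x = signal_zero_mean
#   tkeo = np.abs(x[1:-1] ** 2 - x[2:] * x[:-2])
#
# which reproduces the rectified values accumulated in `tf_list`.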
def silva_onset_detector(signal=None, sampling_rate=1000.,
size=None, threshold_size=None, threshold=None):
"""Determine onsets of EMG pulses.
Follows the approach by Silva et al. [Sil12]_.
Parameters
----------
signal : array
Input filtered EMG signal.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : int
Detection window size (samples).
threshold_size : int
Window size for calculation of the adaptive threshold; must be bigger
than the detection window size.
threshold : int, float
Fixed threshold for the double criteria.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Sil12] Silva H, Scherer R, Sousa J, Londral A, "Towards improving the
usability of electromyographic interfaces", Journal of Oral
Rehabilitation, pp. 1–2, 2012
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if size is None:
raise TypeError("Please specify the detection window size.")
if threshold_size is None:
raise TypeError("Please specify the window size for calculation of "
"the adaptive threshold.")
if threshold_size <= size:
raise TypeError("The window size for calculation of the adaptive "
"threshold must be bigger than the detection "
"window size")
if threshold is None:
raise TypeError("Please specify the fixed threshold for the "
"double criteria.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# full-wave rectification
fwlo = np.abs(signal_zero_mean)
# moving average for calculating the test function
tf_mvgav = np.convolve(fwlo, np.ones((size,)) / size, mode='valid')
# moving average for calculating the adaptive threshold
threshold_mvgav = np.convolve(fwlo, np.ones((threshold_size,)) / threshold_size, mode='valid')
onset_time_list = []
offset_time_list = []
onset = False
for k in range(0, len(threshold_mvgav)):
if onset is True:
# an onset was previously detected and we are looking for the offset time, applying the same criteria
if tf_mvgav[k] < threshold_mvgav[k] and tf_mvgav[k] < threshold:
offset_time_list.append(k)
onset = False # the offset has been detected, and we can look for another activation
else: # we only look for another onset if a previous offset was detected
if tf_mvgav[k] >= threshold_mvgav[k] and tf_mvgav[k] >= threshold:
# the first index of the sliding window is used as an estimate for the onset time (simple post-processor)
onset_time_list.append(k)
onset = True
onsets = np.union1d(onset_time_list,
offset_time_list)
# adjust indices because of moving average
onsets += int(size / 2)
return utils.ReturnTuple((onsets, tf_mvgav), ('onsets', 'processed'))
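# Example usage (a sketch; `filtered` and the parameter values are assumed for
# illustration). The double criterion marks activity only where the fast
# moving average exceeds both the slower adaptive threshold and the fixed one:
#
#   out = silva_onset_detector(signal=filtered, size=50, threshold_size=250,
#                              threshold=0.05)
#   onsets = out['onsets']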
def londral_onset_detector(signal=None, rest=None, sampling_rate=1000.,
size=None, threshold=None,
active_state_duration=None):
"""Determine onsets of EMG pulses.
Follows the approach by Londral et al. [Lon13]_.
Parameters
----------
signal : array
Input filtered EMG signal.
rest : array, list, dict
One of the following 3 options:
* N-dimensional array with filtered samples corresponding to a
rest period;
* 2D array or list with the beginning and end indices of a segment of
the signal corresponding to a rest period;
* Dictionary with {'mean': mean value, 'std_dev': standard deviation}.
sampling_rate : int, float, optional
Sampling frequency (Hz).
size : int
Detection window size (samples).
threshold : int, float
Scale factor for calculating the detection threshold.
active_state_duration: int
Minimum duration of the active state.
Returns
-------
onsets : array
Indices of EMG pulse onsets.
processed : array
Processed EMG signal.
References
----------
.. [Lon13] Londral A, Silva H, Nunes N, Carvalho M, Azevedo L, "A wireless
user-computer interface to explore various sources of biosignals and
visual biofeedback for severe motor impairment",
Journal of Accessibility and Design for All, vol. 3:2, pp. 118–134, 2013
"""
# check inputs
if signal is None:
raise TypeError("Please specify an input signal.")
if rest is None:
raise TypeError("Please specidy rest parameters.")
if size is None:
raise TypeError("Please specify the detection window size.")
if threshold is None:
raise TypeError("Please specify the scale factor for calculating the "
"detection threshold.")
if active_state_duration is None:
raise TypeError("Please specify the mininum duration of the "
"active state.")
# gather statistics on rest signal
if isinstance(rest, np.ndarray) or isinstance(rest, list):
# if the input parameter is a numpy array or a list
if len(rest) >= 2:
# first ensure numpy
rest = np.array(rest)
if len(rest) == 2:
# the rest signal is a segment of the signal
rest_signal = signal[rest[0]:rest[1]]
else:
# the rest signal is provided as is
rest_signal = rest
rest_zero_mean = rest_signal - np.mean(rest_signal)
statistics = st.signal_stats(signal=rest_zero_mean)
mean_rest = statistics['mean']
std_dev_rest = statistics['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
elif isinstance(rest, dict):
# if the input is a dictionary
mean_rest = rest['mean']
std_dev_rest = rest['std_dev']
else:
raise TypeError("Please specify the rest analysis.")
# subtract baseline offset
signal_zero_mean = signal - np.mean(signal)
# calculate threshold
threshold = mean_rest + threshold * std_dev_rest
# helper function for calculating the test function for each window
def _londral_test_function(signal=None):
tf = (1 / size) * (sum(j ** 2 for j in signal) - (1 / size) * (sum(signal) ** 2))
return tf
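# Note (added for clarity, not part of the original computation): for a window
# w of length `size`, the expression above equals the biased sample variance,
# i.e. it matches np.var(w).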
# calculate the test function
_, tf = st.windower(
signal=signal_zero_mean,
size=size, step=1,
fcn=_londral_test_function,
kernel='rectangular',
)
onset_time_list = []
offset_time_list = []
alarm_time = 0
state_duration = 0
onset = False
alarm = False
for k in range(0, len(tf)):
if onset is True:
# an onset was previously detected and we are looking for the offset time, applying the same criteria
if alarm is False: # if the alarm time has not yet been identified
if tf[k] < threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the inactive state
if tf[k] < threshold:
state_duration += 1
if state_duration == active_state_duration:
offset_time_list.append(alarm_time)
onset = False
alarm = False
state_duration = 0
else: # we only look for another onset if a previous offset was detected
if alarm is False: # if the alarm time has not yet been identified
if tf[k] >= threshold: # alarm time
alarm_time = k
alarm = True
else: # now we have to check the remaining rule to be met - duration of the active state
if tf[k] >= threshold:
state_duration += 1
if state_duration == active_state_duration:
onset_time_list.append(alarm_time)
onset = True
alarm = False
state_duration = 0
onsets = np.union1d(onset_time_list,
offset_time_list)
# adjust indices because of the sliding window
onsets += int(size / 2)
return utils.ReturnTuple((onsets, tf), ('onsets', 'processed'))
| 36.459649 | 125 | 0.57456 | 4,968 | 41,564 | 4.705515 | 0.092391 | 0.028917 | 0.041922 | 0.049664 | 0.838602 | 0.81747 | 0.809599 | 0.801044 | 0.790007 | 0.779997 | 0 | 0.012923 | 0.352132 | 41,564 | 1,139 | 126 | 36.491659 | 0.855021 | 0.362044 | 0 | 0.810526 | 0 | 0 | 0.097353 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017544 | false | 0.003509 | 0.007018 | 0 | 0.042105 | 0.001754 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
78e9344d69c7a1f79cf5f8cf52dd78faa52e3a8e | 18,476 | py | Python | Hyperparameter_search.py | HickmannLautaro/BERT_classifier | 7c213afe3241d7d9dc653e2fd1974e3d2af99c8a | [
"Apache-2.0"
] | null | null | null | Hyperparameter_search.py | HickmannLautaro/BERT_classifier | 7c213afe3241d7d9dc653e2fd1974e3d2af99c8a | [
"Apache-2.0"
] | null | null | null | Hyperparameter_search.py | HickmannLautaro/BERT_classifier | 7c213afe3241d7d9dc653e2fd1974e3d2af99c8a | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# coding: utf-8
import os
os.environ["TF_CPP_MIN_LOG_LEVEL"] = '2' # Lower log level to get less clutter
from BERT_per_lvl import run_experiment # Import functions to run experiments
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp
import numpy as np
import sys
# This file should be refactored into cleaner functions, but there was no time left to re-validate the results after such a change
# Defining functions inside functions is not ideal, but the HParams logging broke otherwise
def hyp_search_lvl1_flatt():
"""
Run hyperparameter search on the amazon dataset for the flat approach on level 1
:return: HParam run logs
"""
# Set HParams up
HP_MAX_LENGTH = hp.HParam('max_length', hp.Discrete([64, 100, 256, 512]))
HP_BATCH_SIZE = hp.HParam('batch_size', hp.Discrete([45, 20, 40, 50]))
METRIC_ACCURACY = 'accuracy_score'
METRIC_f1 = 'f1_score'
# Simulate config file
arguments = {'model_name': 'bert-base-uncased',
'max_length': 100,
'epochs': 40, #
'batch_size': 40,
'repetitions': 1,
'data_path': 'amazon',
'lvl': 1,
'labels': None,
'test_labels': None,
'hierar': 'flatt',
'lable_type': '_',
'test_labels_type': '_'}
# Get config values
model_name = arguments['model_name']
lvl = arguments['lvl']
data_path = arguments['data_path']
hierar = arguments['hierar']
lable_type = arguments['lable_type']
test_labels_type = arguments['test_labels_type']
# Create custom summary for the HParam logs
with tf.summary.create_file_writer("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(
lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning').as_default():
hp.hparams_config(
hparams=[HP_MAX_LENGTH, HP_BATCH_SIZE],
metrics=[hp.Metric(METRIC_ACCURACY, display_name='accuracy_score'),
hp.Metric(METRIC_f1, display_name='f1_score')],
)
def run(run_dir, hparams, arguments):
"""
Run experiments twice on a set of hparams and log the metrics
:param run_dir: path of log file
:param hparams: dict with parameters to test in this run
:param arguments: config file for the experiment
:return: log of run is saved to path
"""
with tf.summary.create_file_writer(run_dir).as_default():
hp.hparams(hparams) # record the values used in this trial
arguments['max_length'] = hparams[HP_MAX_LENGTH] # arguments['max_length']
arguments['batch_size'] = hparams[HP_BATCH_SIZE] # arguments['epochs']
f1_score_1, accuracy_score_1 = run_experiment(arguments, hyp_search=True, )
f1_score_2, accuracy_score_2 = run_experiment(arguments, hyp_search=True, )
f1_score, accuracy_score = np.mean([f1_score_1, f1_score_2]), np.mean([accuracy_score_1, accuracy_score_2])
tf.summary.scalar(METRIC_ACCURACY, accuracy_score, step=1)
tf.summary.scalar(METRIC_f1, f1_score, step=1)
# Experiment counter
session_num = 0
for max_length in HP_MAX_LENGTH.domain.values[::-1]:
for batch_size in HP_BATCH_SIZE.domain.values[::-1]:
hparams = {
HP_MAX_LENGTH: max_length,
HP_BATCH_SIZE: batch_size,
}
run_name = "run-%d" % session_num
print('--- Starting trial: %s' % run_name)
print({h.name: hparams[h] for h in hparams})
try:
run("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning/' + run_name, hparams, arguments)
except tf.errors.ResourceExhaustedError: # on an out-of-memory error, abort this run and continue with the next hyperparameters
print("Out of memory")
session_num += 1
# Defining functions inside functions is not ideal, but the HParams logging broke otherwise
def hyp_search_lvl2_flatt():
"""
Run hyperparameter search on the amazon dataset for the flat approach on level 2
:return: HParam run logs
"""
# Set HParams up
HP_MAX_LENGTH = hp.HParam('max_length', hp.Discrete([100, 256, 512]))
HP_BATCH_SIZE = hp.HParam('batch_size', hp.Discrete([45, 50, 40, 60]))
METRIC_ACCURACY = 'accuracy_score'
METRIC_f1 = 'f1_score'
# Simulate config file
arguments = {'model_name': 'bert-base-uncased',
'max_length': 100,
'epochs': 40, #
'batch_size': 40,
'repetitions': 1,
'data_path': 'amazon',
'lvl': 2,
'labels': None,
'test_labels': None,
'hierar': 'flatt',
'lable_type': '_',
'test_labels_type': '_'}
# Get config values
model_name = arguments['model_name']
lvl = arguments['lvl']
data_path = arguments['data_path']
hierar = arguments['hierar']
lable_type = arguments['lable_type']
test_labels_type = arguments['test_labels_type']
# Create custom summary for the HParam logs
with tf.summary.create_file_writer("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning').as_default():
hp.hparams_config(
hparams=[HP_MAX_LENGTH, HP_BATCH_SIZE],
metrics=[hp.Metric(METRIC_ACCURACY, display_name='accuracy_score'),
hp.Metric(METRIC_f1, display_name='f1_score')],
)
def run(run_dir, hparams, arguments):
"""
Run experiments twice on a set of hparams and log the metrics
:param run_dir: path of log file
:param hparams: dict with parameters to test in this run
:param arguments: config file for the experiment
:return: log of run is saved to path
"""
with tf.summary.create_file_writer(run_dir).as_default():
hp.hparams(hparams) # record the values used in this trial
arguments['max_length'] = hparams[HP_MAX_LENGTH] # arguments['max_length']
arguments['batch_size'] = hparams[HP_BATCH_SIZE] # arguments['epochs']
f1_score_1, accuracy_score_1 = run_experiment(arguments, hyp_search=True, )
f1_score_2, accuracy_score_2 = run_experiment(arguments, hyp_search=True, )
f1_score, accuracy_score = np.mean([f1_score_1, f1_score_2]), np.mean([accuracy_score_1, accuracy_score_2])
tf.summary.scalar(METRIC_ACCURACY, accuracy_score, step=1)
tf.summary.scalar(METRIC_f1, f1_score, step=1)
# Experiment counter
session_num = 0
for max_length in HP_MAX_LENGTH.domain.values[::-1]:
for batch_size in HP_BATCH_SIZE.domain.values[::-1]:
hparams = {
HP_MAX_LENGTH: max_length,
HP_BATCH_SIZE: batch_size,
}
run_name = "run-%d" % session_num
print('--- Starting trial: %s' % run_name)
print({h.name: hparams[h] for h in hparams})
try:
run("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning/' + run_name, hparams, arguments)
except tf.errors.ResourceExhaustedError: # on an out-of-memory error, abort this run and continue with the next hyperparameters
print("Out of memory")
session_num += 1
# Defining functions inside functions is not ideal, but the HParams logging broke otherwise
def hyp_search_lvl2_target_target():
"""
Run hyperparameter search on the amazon dataset for the per-level approach trained and tested on target labels on level 2
:return: HParam run logs
"""
HP_MAX_LENGTH = hp.HParam('max_length', hp.Discrete([100, 256, 512]))
HP_BATCH_SIZE = hp.HParam('batch_size', hp.Discrete([45, 50, 40, 60]))
METRIC_ACCURACY = 'accuracy_score'
METRIC_f1 = 'f1_score'
# Simulate config file
arguments = {'model_name': 'bert-base-uncased',
'max_length': 100,
'epochs': 40, #
'batch_size': 40,
'repetitions': 1,
'data_path': 'amazon',
'lvl': 2,
'labels': [['Target', 'Cat1']],
'test_labels': [['Target', 'Cat1']],
'hierar': 'hierarchical',
'lable_type': 'Target',
'test_labels_type': 'Target'}
# Get config values
model_name = arguments['model_name']
lvl = arguments['lvl']
data_path = arguments['data_path']
hierar = arguments['hierar']
lable_type = arguments['lable_type']
test_labels_type = arguments['test_labels_type']
# Create custom summary for the HParam logs
with tf.summary.create_file_writer("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(
lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning').as_default():
hp.hparams_config(
hparams=[HP_MAX_LENGTH, HP_BATCH_SIZE],
metrics=[hp.Metric(METRIC_ACCURACY, display_name='accuracy_score'),
hp.Metric(METRIC_f1, display_name='f1_score')],
)
def run(run_dir, hparams, arguments):
"""
Run experiments twice on a set of hparams and log the metrics
:param run_dir: path of log file
:param hparams: dict with parameters to test in this run
:param arguments: config file for the experiment
:return: log of run is saved to path
"""
with tf.summary.create_file_writer(run_dir).as_default():
hp.hparams(hparams) # record the values used in this trial
arguments['max_length'] = hparams[HP_MAX_LENGTH] # arguments['max_length']
arguments['batch_size'] = hparams[HP_BATCH_SIZE] # arguments['epochs']
f1_score_1, accuracy_score_1 = run_experiment(arguments, hyp_search=True, )
f1_score_2, accuracy_score_2 = run_experiment(arguments, hyp_search=True, )
f1_score, accuracy_score = np.mean([f1_score_1, f1_score_2]), np.mean([accuracy_score_1, accuracy_score_2])
tf.summary.scalar(METRIC_ACCURACY, accuracy_score, step=1)
tf.summary.scalar(METRIC_f1, f1_score, step=1)
# Experiment counter
session_num = 0
for max_length in HP_MAX_LENGTH.domain.values[::-1]:
for batch_size in HP_BATCH_SIZE.domain.values[::-1]:
hparams = {
HP_MAX_LENGTH: max_length,
HP_BATCH_SIZE: batch_size,
}
run_name = "run-%d" % session_num
print('--- Starting trial: %s' % run_name)
print({h.name: hparams[h] for h in hparams})
try:
run("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(
lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning/' + run_name, hparams, arguments)
except tf.errors.ResourceExhaustedError: # on an out-of-memory error, abort this run and continue with the next hyperparameters
print("Out of memory")
session_num += 1
# Defining functions inside functions is not ideal, but the HParams logging broke otherwise
def hyp_search_lvl2_predicted_predicted(path_predicted):
"""
Run hyperparameter search on the amazon dataset for the per-level approach trained and tested on predicted labels on level 2
:return: HParam run logs
"""
HP_MAX_LENGTH = hp.HParam('max_length', hp.Discrete([64, 100, 256, 512]))
HP_BATCH_SIZE = hp.HParam('batch_size', hp.Discrete([10, 45, 20, 40, 50, 60]))
METRIC_ACCURACY = 'accuracy_score'
METRIC_f1 = 'f1_score'
# Simulate config file
arguments = {'model_name': 'bert-base-uncased',
'max_length': 100,
'epochs': 40, #
'batch_size': 40,
'repetitions': 1,
'data_path': 'amazon',
'lvl': 2,
'labels': [[path_predicted]],
'test_labels': [[path_predicted]],
'hierar': 'hierarchical',
'lable_type': 'Predicted',
'test_labels_type': 'Predicted'}
# Get config values
model_name = arguments['model_name']
lvl = arguments['lvl']
data_path = arguments['data_path']
hierar = arguments['hierar']
lable_type = arguments['lable_type']
test_labels_type = arguments['test_labels_type']
# Create custom summary for the HParam logs
with tf.summary.create_file_writer("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(
lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning').as_default():
hp.hparams_config(
hparams=[HP_MAX_LENGTH, HP_BATCH_SIZE],
metrics=[hp.Metric(METRIC_ACCURACY, display_name='accuracy_score'),
hp.Metric(METRIC_f1, display_name='f1_score')],
)
def run(run_dir, hparams, arguments):
"""
Run experiments twice on a set of hparams and log the metrics
:param run_dir: path of log file
:param hparams: dict with parameters to test in this run
:param arguments: config file for the experiment
:return: log of run is saved to path
"""
with tf.summary.create_file_writer(run_dir).as_default():
hp.hparams(hparams) # record the values used in this trial
arguments['max_length'] = hparams[HP_MAX_LENGTH] # arguments['max_length']
arguments['batch_size'] = hparams[HP_BATCH_SIZE] # arguments['epochs']
f1_score_1, accuracy_score_1 = run_experiment(arguments, hyp_search=True, )
f1_score_2, accuracy_score_2 = run_experiment(arguments, hyp_search=True, )
f1_score, accuracy_score = np.mean([f1_score_1, f1_score_2]), np.mean([accuracy_score_1, accuracy_score_2])
tf.summary.scalar(METRIC_ACCURACY, accuracy_score, step=1)
tf.summary.scalar(METRIC_f1, f1_score, step=1)
# Experiment counter
session_num = 0
for max_length in HP_MAX_LENGTH.domain.values[::-1]:
for batch_size in HP_BATCH_SIZE.domain.values[::-1]:
hparams = {
HP_MAX_LENGTH: max_length,
HP_BATCH_SIZE: batch_size,
}
run_name = "run-%d" % session_num
print('--- Starting trial: %s' % run_name)
print({h.name: hparams[h] for h in hparams})
try:
run("hyperparameters_search/" + model_name + "/" + data_path + "/lvl" + str(
lvl) + "/trained_" + hierar + "_" + lable_type + "/tested_" + test_labels_type + '/hparam_tuning/' + run_name, hparams, arguments)
except tf.errors.ResourceExhaustedError: # on an out-of-memory error, abort this run and continue with the next hyperparameters
print("Out of memory")
session_num += 1
def main():
"""
Runs hyperparameter search on the amazon dataset for the flatt and per-level approaches depending on the command line arguments.
Run Hyperparameter_search.py to do a grid search over the predefined hyperparameters. Hyperparameter search is only supported for amazon and per_lvl, not for DBpedia or per_label.
Give one or more options to search hyperparameters: Flat_lvl1, Flat_lvl2, tgt_pred, tgt_tgt, pred_pred.
For runs containing pred (predictions) give the rep_and_histo.npz path that should be used for the input predictions.
For example run "python Hyperparameter_search.py Flat_lvl2 tgt_tgt pred_pred saved_models/bert-base-uncased/amazon/lvl1/trained_flatt__/100T_60e_45b/Run3/tested__/rep_and_histo.npz"
for the hyp-search on amazon level2 flat (Flat_lvl2), target trained and predicted on level 2 (tgt_tgt), and trained and tested with the predicted label input of the flat level 1 (pred_pred saved_models/bert-base-uncased/amazon/lvl1/trained_flatt__/100T_60e_45b/Run3/tested__/rep_and_histo.npz)
"""
print("Tensorflow version: ", tf.__version__)
# rtx 3080 tf 2.4.0-rc4 bug
gpu_devices = tf.config.experimental.list_physical_devices('GPU')
tf.config.experimental.set_memory_growth(gpu_devices[0], True)
os.environ["TOKENIZERS_PARALLELISM"] = "false" # avoids Hugging Face process forking bug https://github.com/ThilinaRajapakse/simpletransformers/issues/515
list_args = sys.argv[1:] # Read command line arguments
if len(list_args) < 1: # No given parameters
print(
"Give one or more options to search hyperparameters:\n Flat_lvl1, Flat_lvl2, tgt_pred, tgt_tgt, pred_pred \n for runs containing pre give config and path to model")
sys.exit(2)
for i, conf in enumerate(list_args):
if conf == "Flat_lvl1":
hyp_search_lvl1_flatt()
print("hyp_search_lvl1_flatt done")
print("#" * 150)
print("#" * 150)
print("#" * 150)
print("#" * 150)
elif conf == "Flat_lvl2":
hyp_search_lvl2_flatt()
print("hyp_search_lvl2_flatt done")
print("#" * 150)
print("#" * 150)
print("#" * 150)
print("#" * 150)
continue
elif conf == "tgt_tgt":
hyp_search_lvl2_target_target()
print("hyp_search_lvl2_target_target done")
print("#" * 150)
print("#" * 150)
print("#" * 150)
print("#" * 150)
elif conf == "pred_pred":
print(list_args)
hyp_search_lvl2_predicted_predicted(list_args[i + 1])
print("hyp_search_lvl2_prediction_prediction done")
print("#" * 150)
print("#" * 150)
print("#" * 150)
print("#" * 150)
continue
else:
print("Wrong input options to search hyperparameters:\n Flat_lvl1, Flat_lvl2, tgt_pred, tgt_tgt, pred_pred")
return 1
print("Search done for", list_args)
if __name__ == "__main__":
main()
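# The HParams logs written by the functions above can be inspected with
# TensorBoard (standard invocation; the logdir matches the paths used in this
# script):
#
#   tensorboard --logdir hyperparameters_search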
| 44.306954 | 298 | 0.618478 | 2,324 | 18,476 | 4.655336 | 0.109294 | 0.036602 | 0.020335 | 0.019965 | 0.835382 | 0.825122 | 0.822904 | 0.822904 | 0.810056 | 0.810056 | 0 | 0.024428 | 0.273273 | 18,476 | 416 | 299 | 44.413462 | 0.781336 | 0.222721 | 0 | 0.778986 | 0 | 0.003623 | 0.16768 | 0.022462 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032609 | false | 0 | 0.021739 | 0 | 0.057971 | 0.134058 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
607d2b00cc19df71186018945d134c1b770d14be | 4,773 | py | Python | tests/transformers/test_structural_breaks.py | khrapovs/hcl-model | 879740e6072c2ff45864040db0b8364b55de1f44 | [
"MIT"
] | null | null | null | tests/transformers/test_structural_breaks.py | khrapovs/hcl-model | 879740e6072c2ff45864040db0b8364b55de1f44 | [
"MIT"
] | 5 | 2022-02-09T12:38:04.000Z | 2022-02-21T15:25:06.000Z | tests/transformers/test_structural_breaks.py | khrapovs/hcl-model | 879740e6072c2ff45864040db0b8364b55de1f44 | [
"MIT"
] | 1 | 2022-02-17T09:59:22.000Z | 2022-02-17T09:59:22.000Z | import random
import numpy as np
import pandas as pd
import pytest
from pandas._testing import assert_series_equal
from hcl_model.transformers.structural_breaks import TargetStructuralBreakCorrectionTransformer
@pytest.mark.parametrize("y_type", ["series", "ndarray"])
class TestStructuralBreakCorrectionTransformer:
def test_no_correction_series_with_structural_break(self, y_type: str) -> None:
series1 = np.ones(100)
series2 = np.ones(100) + 1
series = pd.Series(np.append(series1, series2))
X = series.values if y_type == "ndarray" else series
transformer = TargetStructuralBreakCorrectionTransformer(structural_break_correction=False)
series_corrected = transformer.transform(X=X)
assert len(series_corrected) == len(series)
if y_type == "ndarray":
assert np.array_equal(series.values, series_corrected)
assert np.array_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
else:
assert_series_equal(series, series_corrected)
assert_series_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
def test_correction_series_with_structural_break(self, y_type: str) -> None:
series1 = np.ones(100)
series2 = np.ones(100) + 1
series = pd.Series(np.append(series1, series2))
X = series.values if y_type == "ndarray" else series
transformer = TargetStructuralBreakCorrectionTransformer()
series_corrected = transformer.transform(X=X)
series_corrected_expected = pd.Series(np.ones(200) + 1)
assert len(series_corrected) == len(series)
if y_type == "ndarray":
            assert np.array_equal(series_corrected_expected.values, series_corrected)  # assert was missing
            assert np.array_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
else:
assert_series_equal(series_corrected_expected, series_corrected)
assert_series_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
def test_adjusted_variability_still_zero(self, y_type: str) -> None:
        np.random.seed(101)  # seed NumPy's RNG; random.seed does not affect np.random
series1 = pd.Series(np.ones(100))
series2 = pd.Series(np.random.normal(size=100, loc=100))
series = pd.concat([series1, series2])
X = series.values if y_type == "ndarray" else series
series_corrected = TargetStructuralBreakCorrectionTransformer().transform(X=X)
series_corrected_first_part = series_corrected[:100]
assert len(np.unique(series_corrected_first_part)) == len(np.unique(series1))
def test_variability_adjusted(self, y_type: str) -> None:
        np.random.seed(101)  # seed NumPy's RNG; random.seed does not affect np.random
series1 = pd.Series(np.random.normal(size=100, loc=1, scale=1))
series2 = pd.Series(np.random.normal(size=100, loc=100, scale=5))
series = pd.concat([series1, series2])
X = series.values if y_type == "ndarray" else series
series_corrected = TargetStructuralBreakCorrectionTransformer().transform(X=X)
series_corrected_first_part = series_corrected[:100]
assert series_corrected_first_part.std() > series1.std()
def test_last_part_equal_to_original(self, y_type: str) -> None:
        np.random.seed(101)  # seed NumPy's RNG; random.seed does not affect np.random
series1 = pd.Series(np.random.normal(size=100))
series2 = pd.Series(np.random.normal(size=100, loc=100))
series = pd.concat([series1, series2])
X = series.values if y_type == "ndarray" else series
transformer = TargetStructuralBreakCorrectionTransformer()
series_corrected = transformer.transform(X=X)
series_corrected_last_part = series_corrected[100:]
if y_type == "ndarray":
            assert np.array_equal(series_corrected_last_part, series2)  # assert was missing
            assert np.array_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
else:
assert_series_equal(series_corrected_last_part, series2)
assert_series_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
def test_correction_series_without_structural_break(self, y_type: str) -> None:
series = pd.Series(np.ones(200))
X = series.values if y_type == "ndarray" else series
transformer = TargetStructuralBreakCorrectionTransformer()
series_corrected = transformer.transform(X=X)
assert len(series_corrected) == len(series)
if y_type == "ndarray":
            assert np.array_equal(series.values, series_corrected)  # assert was missing
            assert np.array_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
else:
assert_series_equal(series, series_corrected)
assert_series_equal(transformer.inverse_transform(X=series_corrected), series_corrected)
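# A minimal sketch (not hcl-model's implementation) of the behaviour these tests pin
# down, assuming the correction rescales the pre-break segment to the mean/std of the
# final segment and leaves the final segment untouched:
#
#   import numpy as np
#
#   def correct_break(series: np.ndarray, break_idx: int) -> np.ndarray:
#       head, tail = series[:break_idx], series[break_idx:]
#       scale = tail.std() / head.std() if head.std() > 0 else 1.0
#       corrected_head = (head - head.mean()) * scale + tail.mean()
#       return np.append(corrected_head, tail)
#
#   corrected = correct_break(np.append(np.ones(100), np.ones(100) + 1), 100)
#   assert np.allclose(corrected, np.ones(200) + 1)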
| 45.894231 | 100 | 0.706474 | 562 | 4,773 | 5.743772 | 0.131673 | 0.204461 | 0.05948 | 0.043371 | 0.826208 | 0.815675 | 0.806382 | 0.796778 | 0.795849 | 0.780359 | 0 | 0.024326 | 0.199036 | 4,773 | 103 | 101 | 46.339806 | 0.820037 | 0 | 0 | 0.658537 | 0 | 0 | 0.018647 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 1 | 0.073171 | false | 0 | 0.073171 | 0 | 0.158537 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
60fb2b46750e39e1c9dd4d7e8d6d8657171fc3a3 | 179 | py | Python | print string of unicode.py | manand881/Python-Programs | eb970cb1b21d4aede0102c60425eb8a1d4ac605c | [
"MIT"
] | null | null | null | print string of unicode.py | manand881/Python-Programs | eb970cb1b21d4aede0102c60425eb8a1d4ac605c | [
"MIT"
] | null | null | null | print string of unicode.py | manand881/Python-Programs | eb970cb1b21d4aede0102c60425eb8a1d4ac605c | [
"MIT"
] | null | null | null | str = u'\u0050\u0079\u0074\u0068\u006f\u006e \u0045\u0078\u0065\u0072\u0063\u0069\u0073\u0065\u0073 \u002d \u0077\u0033\u0072\u0065\u0073\u006f\u0075\u0072\u0063\u0065'  # decodes to "Python Exercises - w3resource"; note: the name `str` shadows the built-in
print(str) | 89.5 | 168 | 0.793296 | 30 | 179 | 4.733333 | 0.666667 | 0.140845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.578035 | 0.03352 | 179 | 2 | 169 | 89.5 | 0.242775 | 0 | 0 | 0 | 0 | 0.5 | 0.883333 | 0.833333 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
715ad2ddd9353bbb294eaf7729d72df8ebcc9a2b | 24,118 | py | Python | keras/layers/recurrentpp_soft.py | volkancirik/keras | a32a4c2ecdb2b0e528fb45e0942d5262ffcd735b | [
"MIT"
] | 2 | 2018-06-08T13:17:06.000Z | 2020-02-13T09:34:43.000Z | keras/layers/recurrentpp_soft.py | volkancirik/keras | a32a4c2ecdb2b0e528fb45e0942d5262ffcd735b | [
"MIT"
] | null | null | null | keras/layers/recurrentpp_soft.py | volkancirik/keras | a32a4c2ecdb2b0e528fb45e0942d5262ffcd735b | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import absolute_import
import theano
import theano.tensor as T
import numpy as np
from keras import activations, initializations
from keras.utils.theano_utils import shared_scalar, shared_zeros, alloc_zeros_matrix, sharedX, shared_ones
from keras.layers.core import Layer, MaskedLayer
from six.moves import range
from keras.layers.recurrent import Recurrent
import math
class LSTMpp_soft(Recurrent):
'''
soft selection of gates
'''
def __init__(self, output_dim,
init='glorot_uniform', inner_init='orthogonal', forget_bias_init='one',
activation='tanh', inner_activation='hard_sigmoid',
weights=None, truncate_gradient=-1, return_sequences=False,
input_dim=None, input_length=None, **kwargs):
self.output_dim = output_dim
self.init = initializations.get(init)
self.inner_init = initializations.get(inner_init)
self.forget_bias_init = initializations.get(forget_bias_init)
self.activation = activations.get(activation)
self.inner_activation = activations.get(inner_activation)
self.truncate_gradient = truncate_gradient
self.return_sequences = return_sequences
self.initial_weights = weights
self.input_dim = input_dim
self.input_length = input_length
if self.input_dim:
kwargs['input_shape'] = (self.input_length, self.input_dim)
super(LSTMpp_soft, self).__init__(**kwargs)
def build(self):
input_dim = self.input_shape[2]
self.input = T.tensor3()
scale=0.05
self.W_g = self.init((input_dim, self.output_dim))
self.U_g = sharedX(np.random.uniform(low=-scale, high=scale, size=(self.output_dim, 9 , self.output_dim)))
self.b_g = shared_zeros((self.output_dim))
self.W_c = self.init((input_dim, self.output_dim))
self.U_c = self.inner_init((self.output_dim, self.output_dim))
self.b_c = shared_zeros((self.output_dim))
self.W_o = self.init((input_dim, self.output_dim))
self.U_o = self.inner_init((self.output_dim, self.output_dim))
self.b_o = shared_zeros((self.output_dim))
self.params = [
self.W_g, self.U_g, self.b_g,
self.W_c, self.U_c, self.b_c,
self.W_o, self.U_o, self.b_o,
]
if self.initial_weights is not None:
self.set_weights(self.initial_weights)
del self.initial_weights
def _step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
c_tilda = self.activation(xc_t + T.dot(h_mask_tm1, u_c))
ops = [c_mask_tm1,c_tilda,(c_mask_tm1 + c_tilda),T.maximum(c_mask_tm1, c_tilda),T.minimum(c_mask_tm1, c_tilda),c_mask_tm1 - c_tilda,c_mask_tm1 * c_tilda,0 * c_tilda,0 * c_tilda + 1]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
return h_t, c_t
def get_output(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories], updates = theano.scan(
self._step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
if self.return_sequences:
return outputs.dimshuffle((1, 0, 2))
return outputs[-1]
def _debug_step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, gates_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
c_tilda = self.activation(xc_t + T.dot(h_mask_tm1, u_c))
ops = [c_mask_tm1,c_tilda,(c_mask_tm1 + c_tilda),T.maximum(c_mask_tm1, c_tilda),T.minimum(c_mask_tm1, c_tilda),c_mask_tm1 - c_tilda,c_mask_tm1 * c_tilda,0 * c_tilda,0 * c_tilda + 1]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
gates_t = gate
return h_t, c_t, gates_t
def get_gates(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories, gates], updates = theano.scan(
self._debug_step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim, 9), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
return outputs, gates, memories
# return gates, memories,
def get_config(self):
config = {"name": self.__class__.__name__,
"output_dim": self.output_dim,
"init": self.init.__name__,
"inner_init": self.inner_init.__name__,
"forget_bias_init": self.forget_bias_init.__name__,
"activation": self.activation.__name__,
"inner_activation": self.inner_activation.__name__,
"truncate_gradient": self.truncate_gradient,
"return_sequences": self.return_sequences,
"input_dim": self.input_dim,
"input_length": self.input_length}
base_config = super(LSTMpp_soft, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
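# A NumPy illustration (not used by the layer) of the soft op-selection in _step above:
# each hidden unit forms a softmax over its candidate update ops and the new cell state
# is the convex combination of those candidates.
#
#   import numpy as np
#
#   def soft_op_mix(act, candidates):
#       # act, candidates: (batch, units, n_ops); returns (batch, units)
#       e = np.exp(act - act.max(axis=-1, keepdims=True))
#       gate = e / e.sum(axis=-1, keepdims=True)   # softmax over the op axis
#       return (gate * candidates).sum(axis=-1)
#
#   act = np.random.randn(2, 4, 9)     # nine candidate ops, as in LSTMpp_soft
#   cands = np.random.randn(2, 4, 9)
#   c_t = soft_op_mix(act, cands)      # equivalent to the gate * yshuff contraction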
class LSTMmul_soft(Recurrent):
'''
soft selection of gates
'''
def __init__(self, output_dim,
init='glorot_uniform', inner_init='orthogonal', forget_bias_init='one',
activation='tanh', inner_activation='hard_sigmoid',
weights=None, truncate_gradient=-1, return_sequences=False,
input_dim=None, input_length=None, **kwargs):
self.output_dim = output_dim
self.init = initializations.get(init)
self.inner_init = initializations.get(inner_init)
self.forget_bias_init = initializations.get(forget_bias_init)
self.activation = activations.get(activation)
self.inner_activation = activations.get(inner_activation)
self.truncate_gradient = truncate_gradient
self.return_sequences = return_sequences
self.initial_weights = weights
self.input_dim = input_dim
self.input_length = input_length
if self.input_dim:
kwargs['input_shape'] = (self.input_length, self.input_dim)
super(LSTMmul_soft, self).__init__(**kwargs)
def build(self):
input_dim = self.input_shape[2]
self.input = T.tensor3()
scale=0.05
self.W_g = self.init((input_dim, self.output_dim))
self.U_g = sharedX(np.random.uniform(low=-scale, high=scale, size=(self.output_dim, 3, self.output_dim)))
self.b_g = shared_zeros((self.output_dim))
self.W_c = self.init((input_dim, self.output_dim))
self.U_c = self.inner_init((self.output_dim, self.output_dim))
self.b_c = shared_zeros((self.output_dim))
self.W_o = self.init((input_dim, self.output_dim))
self.U_o = self.inner_init((self.output_dim, self.output_dim))
self.b_o = shared_zeros((self.output_dim))
self.params = [
self.W_g, self.U_g, self.b_g,
self.W_c, self.U_c, self.b_c,
self.W_o, self.U_o, self.b_o,
]
if self.initial_weights is not None:
self.set_weights(self.initial_weights)
del self.initial_weights
def _step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
ops = [c_mask_tm1,self.activation(xc_t + T.dot(h_mask_tm1, u_c)),c_mask_tm1 * self.activation(xc_t + T.dot(h_mask_tm1, u_c))]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
return h_t, c_t
def get_output(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories], updates = theano.scan(
self._step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
if self.return_sequences:
return outputs.dimshuffle((1, 0, 2))
return outputs[-1]
def _debug_step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, gates_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
ops = [c_mask_tm1,self.activation(xc_t + T.dot(h_mask_tm1, u_c)),c_mask_tm1 * self.activation(xc_t + T.dot(h_mask_tm1, u_c))]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
gates_t = gate
return h_t, c_t, gates_t
def get_gates(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories, gates], updates = theano.scan(
self._debug_step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim, 3), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
return outputs, gates, memories
# return gates, memories
def get_config(self):
config = {"name": self.__class__.__name__,
"output_dim": self.output_dim,
"init": self.init.__name__,
"inner_init": self.inner_init.__name__,
"forget_bias_init": self.forget_bias_init.__name__,
"activation": self.activation.__name__,
"inner_activation": self.inner_activation.__name__,
"truncate_gradient": self.truncate_gradient,
"return_sequences": self.return_sequences,
"input_dim": self.input_dim,
"input_length": self.input_length}
base_config = super(LSTMmul_soft, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
class LSTMkernel_soft(Recurrent):
'''
soft selection of gates
'''
def __init__(self, output_dim,
init='glorot_uniform', inner_init='orthogonal', forget_bias_init='one',
activation='tanh', inner_activation='hard_sigmoid',
weights=None, truncate_gradient=-1, return_sequences=False,
input_dim=None, input_length=None, **kwargs):
self.output_dim = output_dim
self.init = initializations.get(init)
self.inner_init = initializations.get(inner_init)
self.forget_bias_init = initializations.get(forget_bias_init)
self.activation = activations.get(activation)
self.inner_activation = activations.get(inner_activation)
self.truncate_gradient = truncate_gradient
self.return_sequences = return_sequences
self.initial_weights = weights
self.input_dim = input_dim
self.input_length = input_length
if self.input_dim:
kwargs['input_shape'] = (self.input_length, self.input_dim)
super(LSTMkernel_soft, self).__init__(**kwargs)
def build(self):
input_dim = self.input_shape[2]
self.input = T.tensor3()
self.W_g = self.init((input_dim, self.output_dim))
# self.U_g = sharedX(np.random.uniform(low=-scale, high=scale, size=(self.output_dim, 6 , self.output_dim)))
self.U_g = self.inner_init((self.output_dim, 6, self.output_dim))
self.b_g = shared_zeros((self.output_dim))
self.W_c = self.init((input_dim, self.output_dim))
self.U_c = self.inner_init((self.output_dim, self.output_dim))
self.b_c = shared_zeros((self.output_dim))
self.W_o = self.init((input_dim, self.output_dim))
self.U_o = self.inner_init((self.output_dim, self.output_dim))
self.b_o = shared_zeros((self.output_dim))
self.EPS = 1e-10
scalar_init = 1
scale=0.01
# self.k_parameters = shared_ones((11,))
self.k_parameters = sharedX(np.random.uniform(low=scalar_init-scale, high=scalar_init+scale, size=(11, )))
# self.sigma_se = shared_scalar(scalar_init)
# self.sigma_per = shared_scalar(scalar_init)
# self.sigma_b_lin = shared_scalar(scalar_init)
# self.sigma_v_lin = shared_scalar(scalar_init)
# self.sigma_rq = shared_scalar(scalar_init)
# self.l_se = shared_scalar(scalar_init)
# self.l_per = shared_scalar(scalar_init)
# self.l_lin = shared_scalar(scalar_init)
# self.l_rq = shared_scalar(scalar_init)
# self.alpha_rq = shared_scalar(scalar_init)
# self.p_per = shared_scalar(scalar_init)
self.params = [
self.k_parameters,
# self.sigma_se, self.sigma_per, self.sigma_b_lin, self.sigma_v_lin,self.sigma_rq,
# self.l_se, self.l_per, self.l_lin, self.l_rq,
# self.alpha_rq, self.p_per,
self.W_g, self.U_g, self.b_g,
self.W_c, self.U_c, self.b_c,
self.W_o, self.U_o, self.b_o,
]
if self.initial_weights is not None:
self.set_weights(self.initial_weights)
del self.initial_weights
def _step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
c_tilda = self.activation(xc_t + T.dot(h_mask_tm1, u_c))
sigma_se = self.k_parameters[0]
sigma_per = self.k_parameters[1]
sigma_b_lin = self.k_parameters[2]
sigma_v_lin = self.k_parameters[3]
sigma_rq = self.k_parameters[4]
l_se = self.k_parameters[5]
l_per = self.k_parameters[6]
l_lin = self.k_parameters[7]
l_rq = self.k_parameters[8]
alpha_rq = self.k_parameters[9]
p_per = self.k_parameters[10]
k_se = T.pow(sigma_se,2) * T.exp( -T.pow(c_mask_tm1 - c_tilda,2) / (2* T.pow(l_se,2) + self.EPS))
k_per = T.pow(sigma_per,2) * T.exp( -2*T.pow(T.sin( math.pi*(c_mask_tm1 - c_tilda)/ (p_per + self.EPS) ),2) / ( T.pow(l_per,2) + self.EPS ))
k_lin = T.pow(sigma_b_lin,2) + T.pow(sigma_v_lin,2) * (c_mask_tm1 - l_lin) * (c_tilda - l_lin )
k_rq = T.pow(sigma_rq,2) * T.pow( 1 + T.pow( (c_mask_tm1 - c_tilda),2) / ( 2 * alpha_rq * T.pow(l_rq,2) + self.EPS), -alpha_rq)
ops = [c_mask_tm1,c_tilda,k_se, k_per, k_lin,k_rq]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
return h_t, c_t
def get_output(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories], updates = theano.scan(
self._step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
if self.return_sequences:
return outputs.dimshuffle((1, 0, 2))
return outputs[-1]
def _debug_step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, gates_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
        c_tilda = self.activation(xc_t + T.dot(h_mask_tm1, u_c))
        # mirror _step: this variant has six kernel-based candidate ops; the nine-op
        # list copied from LSTMpp_soft would not match U_g's gate axis of size 6
        (sigma_se, sigma_per, sigma_b_lin, sigma_v_lin, sigma_rq,
         l_se, l_per, l_lin, l_rq, alpha_rq, p_per) = [self.k_parameters[j] for j in range(11)]
        k_se = T.pow(sigma_se, 2) * T.exp(-T.pow(c_mask_tm1 - c_tilda, 2) / (2 * T.pow(l_se, 2) + self.EPS))
        k_per = T.pow(sigma_per, 2) * T.exp(-2 * T.pow(T.sin(math.pi * (c_mask_tm1 - c_tilda) / (p_per + self.EPS)), 2) / (T.pow(l_per, 2) + self.EPS))
        k_lin = T.pow(sigma_b_lin, 2) + T.pow(sigma_v_lin, 2) * (c_mask_tm1 - l_lin) * (c_tilda - l_lin)
        k_rq = T.pow(sigma_rq, 2) * T.pow(1 + T.pow(c_mask_tm1 - c_tilda, 2) / (2 * alpha_rq * T.pow(l_rq, 2) + self.EPS), -alpha_rq)
        ops = [c_mask_tm1, c_tilda, k_se, k_per, k_lin, k_rq]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
gates_t = gate
return h_t, c_t, gates_t
def get_gates(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories, gates], updates = theano.scan(
self._debug_step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
                T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim, 6), 1)  # six candidate ops for this kernel variant (was 9, copied from LSTMpp_soft)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
return outputs, gates, memories
# return gates, memories
def get_config(self):
config = {"name": self.__class__.__name__,
"output_dim": self.output_dim,
"init": self.init.__name__,
"inner_init": self.inner_init.__name__,
"forget_bias_init": self.forget_bias_init.__name__,
"activation": self.activation.__name__,
"inner_activation": self.inner_activation.__name__,
"truncate_gradient": self.truncate_gradient,
"return_sequences": self.return_sequences,
"input_dim": self.input_dim,
"input_length": self.input_length}
base_config = super(LSTMkernel_soft, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
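# The six candidates in LSTMkernel_soft pair the previous cell state and its proposal
# with four standard GP covariance forms. A NumPy sketch with all hyper-parameters set
# to 1 (eps guards division only), for element-wise inputs a, b:
#
#   import numpy as np
#
#   def kernels(a, b, eps=1e-10):
#       k_se = np.exp(-(a - b) ** 2 / (2.0 + eps))                          # squared exponential
#       k_per = np.exp(-2.0 * np.sin(np.pi * (a - b)) ** 2 / (1.0 + eps))   # periodic
#       k_lin = 1.0 + (a - 1.0) * (b - 1.0)                                 # linear
#       k_rq = (1.0 + (a - b) ** 2 / (2.0 + eps)) ** -1.0                   # rational quadratic
#       return k_se, k_per, k_lin, k_rq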
class LSTMbase_soft(Recurrent):
'''
soft selection of gates
'''
def __init__(self, output_dim,
init='glorot_uniform', inner_init='orthogonal', forget_bias_init='one',
activation='tanh', inner_activation='hard_sigmoid',
weights=None, truncate_gradient=-1, return_sequences=False,
input_dim=None, input_length=None, **kwargs):
self.output_dim = output_dim
self.init = initializations.get(init)
self.inner_init = initializations.get(inner_init)
self.forget_bias_init = initializations.get(forget_bias_init)
self.activation = activations.get(activation)
self.inner_activation = activations.get(inner_activation)
self.truncate_gradient = truncate_gradient
self.return_sequences = return_sequences
self.initial_weights = weights
self.input_dim = input_dim
self.input_length = input_length
if self.input_dim:
kwargs['input_shape'] = (self.input_length, self.input_dim)
super(LSTMbase_soft, self).__init__(**kwargs)
def build(self):
input_dim = self.input_shape[2]
self.input = T.tensor3()
scale=0.05
self.W_g = self.init((input_dim, self.output_dim))
self.U_g = sharedX(np.random.uniform(low=-scale, high=scale, size=(self.output_dim, 2, self.output_dim)))
self.b_g = shared_zeros((self.output_dim))
self.W_c = self.init((input_dim, self.output_dim))
self.U_c = self.inner_init((self.output_dim, self.output_dim))
self.b_c = shared_zeros((self.output_dim))
self.W_o = self.init((input_dim, self.output_dim))
self.U_o = self.inner_init((self.output_dim, self.output_dim))
self.b_o = shared_zeros((self.output_dim))
self.params = [
self.W_g, self.U_g, self.b_g,
self.W_c, self.U_c, self.b_c,
self.W_o, self.U_o, self.b_o,
]
if self.initial_weights is not None:
self.set_weights(self.initial_weights)
del self.initial_weights
def _step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
ops = [c_mask_tm1,self.activation(xc_t + T.dot(h_mask_tm1, u_c))]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
return h_t, c_t
def get_output(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories], updates = theano.scan(
self._step,
sequences=[xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
if self.return_sequences:
return outputs.dimshuffle((1, 0, 2))
return outputs[-1]
def _debug_step(self,xg_t, xo_t, xc_t, mask_tm1,h_tm1, c_tm1, gates_tm1, u_g, u_o, u_c):
h_mask_tm1 = mask_tm1 * h_tm1
c_mask_tm1 = mask_tm1 * c_tm1
act = T.tensordot( xg_t + h_mask_tm1, u_g , [[1],[2]])
gate = T.nnet.softmax(act.reshape((-1, act.shape[-1]))).reshape(act.shape)
ops = [c_mask_tm1,self.activation(xc_t + T.dot(h_mask_tm1, u_c))]
yshuff = T.as_tensor_variable( ops, name='yshuff').dimshuffle(1,2,0)
c_t = (gate.reshape((-1,gate.shape[-1])) * yshuff.reshape((-1,yshuff.shape[-1]))).sum(axis = 1).reshape(gate.shape[:2])
o_t = self.inner_activation(xo_t + T.dot(h_mask_tm1, u_o))
h_t = o_t * self.activation(c_t)
gates_t = gate
return h_t, c_t, gates_t
def get_gates(self, train=False):
X = self.get_input(train)
padded_mask = self.get_padded_shuffled_mask(train, X, pad=1)
X = X.dimshuffle((1, 0, 2))
xg = T.dot(X, self.W_g) + self.b_g
xc = T.dot(X, self.W_c) + self.b_c
xo = T.dot(X, self.W_o) + self.b_o
[outputs, memories, gates], updates = theano.scan(
self._debug_step,
sequences = [xg, xo, xc, padded_mask],
outputs_info=[
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim), 1),
T.unbroadcast(alloc_zeros_matrix(X.shape[1], self.output_dim, 2), 1)
],
non_sequences=[self.U_g, self.U_o, self.U_c],
truncate_gradient=self.truncate_gradient)
return outputs, gates, memories
def get_config(self):
config = {"name": self.__class__.__name__,
"output_dim": self.output_dim,
"init": self.init.__name__,
"inner_init": self.inner_init.__name__,
"forget_bias_init": self.forget_bias_init.__name__,
"activation": self.activation.__name__,
"inner_activation": self.inner_activation.__name__,
"truncate_gradient": self.truncate_gradient,
"return_sequences": self.return_sequences,
"input_dim": self.input_dim,
"input_length": self.input_length}
base_config = super(LSTMbase_soft, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
| 37.684375 | 183 | 0.706402 | 4,164 | 24,118 | 3.768732 | 0.039145 | 0.042376 | 0.067928 | 0.048748 | 0.927101 | 0.924616 | 0.905372 | 0.901039 | 0.899 | 0.896451 | 0 | 0.019381 | 0.142135 | 24,118 | 639 | 184 | 37.743349 | 0.739101 | 0.040012 | 0 | 0.892495 | 0 | 0 | 0.032905 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056795 | false | 0 | 0.020284 | 0 | 0.133874 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4606b79eb7c0f9255d807b9ba13ebf625b121a58 | 79 | py | Python | utils/__init__.py | chorseng/UMD | 680681fea76abcea02ff5f351727bcbb468c372a | [
"MIT"
] | 48 | 2019-05-12T08:42:55.000Z | 2022-03-15T07:54:40.000Z | utils/__init__.py | chorseng/UMD | 680681fea76abcea02ff5f351727bcbb468c372a | [
"MIT"
] | 6 | 2019-09-14T14:46:57.000Z | 2021-07-10T02:22:34.000Z | utils/__init__.py | chorseng/UMD | 680681fea76abcea02ff5f351727bcbb468c372a | [
"MIT"
] | 11 | 2019-09-12T03:46:42.000Z | 2021-10-03T17:43:39.000Z | from .utils import to_str, pad_text
from .get_embed_init import get_embed_init
| 26.333333 | 42 | 0.848101 | 15 | 79 | 4.066667 | 0.666667 | 0.262295 | 0.393443 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113924 | 79 | 2 | 43 | 39.5 | 0.871429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1ce70bb358a2cfd0ac6017167365b132c6bc153a | 97,367 | py | Python | bcl_caffe/layers/bcl_layers.py | LEOCUIZHIHAO/segcarpoint | 42d78cde1f28b0c705f7755356610cf3039c3caf | [
"MIT"
] | null | null | null | bcl_caffe/layers/bcl_layers.py | LEOCUIZHIHAO/segcarpoint | 42d78cde1f28b0c705f7755356610cf3039c3caf | [
"MIT"
] | null | null | null | bcl_caffe/layers/bcl_layers.py | LEOCUIZHIHAO/segcarpoint | 42d78cde1f28b0c705f7755356610cf3039c3caf | [
"MIT"
] | null | null | null |
from pathlib import Path
import pickle
import shutil
import time, timeit
import numpy as np
import torch
import torchplus
from google.protobuf import text_format
import second.data.kitti_common as kitti
from second.builder import target_assigner_builder, voxel_builder
from second.pytorch.core import box_torch_ops
from second.data.preprocess import merge_second_batch, merge_second_batch_multigpu
from second.protos import pipeline_pb2
from second.pytorch.builder import box_coder_builder, input_reader_builder
from second.pytorch.models.voxel_encoder import get_paddings_indicator_np #for pillar
from second.utils.log_tool import SimpleModelLog
import caffe
from enum import Enum
import numpy_indexed as npi
from numba import jit
from numba import njit, prange
from second.core import box_np_ops
def build_network(model_cfg, measure_time=False):
voxel_generator = voxel_builder.build(model_cfg.voxel_generator)
bv_range = voxel_generator.point_cloud_range[[0, 1, 3, 4]]
box_coder = box_coder_builder.build(model_cfg.box_coder)
target_assigner_cfg = model_cfg.target_assigner
target_assigner = target_assigner_builder.build(target_assigner_cfg,
bv_range, box_coder)
return voxel_generator, target_assigner
def _worker_init_fn(worker_id):
time_seed = np.array(time.time(), dtype=np.int32)
np.random.seed(time_seed + worker_id)
print(f"WORKER {worker_id} seed:", np.random.get_state()[1][0])
def load_config(model_dir, config_path):
model_dir = str(Path(model_dir).resolve())
model_dir = Path(model_dir)
config_file_bkp = "pipeline.config"
if isinstance(config_path, str):
# directly provide a config object. this usually used
# when you want to train with several different parameters in
# one script.
config = pipeline_pb2.TrainEvalPipelineConfig()
with open(config_path, "r") as f:
proto_str = f.read()
text_format.Merge(proto_str, config)
else:
config = config_path
proto_str = text_format.MessageToString(config, indent=2)
with (model_dir / config_file_bkp).open("w") as f:
f.write(proto_str)
input_cfg = config.train_input_reader
eval_input_cfg = config.eval_input_reader
model_cfg = config.model.second
train_cfg = config.train_config
return (input_cfg, eval_input_cfg, model_cfg, train_cfg)
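# Sketch of how the two helpers above combine (paths are placeholders):
#
#   input_cfg, eval_input_cfg, model_cfg, train_cfg = load_config("./exp", "./pipeline.config")
#   voxel_generator, target_assigner = build_network(model_cfg)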
class LossNormType(Enum):
NormByNumPositives = "norm_by_num_positives"
NormByNumExamples = "norm_by_num_examples"
NormByNumPosNeg = "norm_by_num_pos_neg"
class DataFeature(caffe.Layer):
def setup(self, bottom, top):
params = {}
params.update(eval(self.param_str))
bcl_keep_voxels_eval = params['bcl_keep_voxels_eval']
seg_keep_points_eval = params['seg_keep_points_eval']
num_points_per_voxel = params['num_points_per_voxel']
is_segmentation = params['segmentation']
try:
batch_size = params["eval_batch_size"]
except Exception as e:
batch_size = 1
# BCL
if is_segmentation:
top[0].reshape(*(batch_size, seg_keep_points_eval, 4)) # for pillar shape should (B,C=9,V,N=100), For second (B,C=1,V,N=5)
else:
# top[0].reshape(*(bcl_keep_voxels_eval, num_points_per_voxel, 4)) #pillar
top[0].reshape(*(batch_size, bcl_keep_voxels_eval, 4)) #pillar
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
pass
def backward(self, top, propagate_down, bottom):
pass
class VoxelSegNetInput(caffe.Layer):
def setup(self, bottom, top):
params = {}
params.update(eval(self.param_str))
max_voxels = params['max_voxels']
points_per_voxel = params['points_per_voxel']
seg_keep_points_eval = params['seg_keep_points_eval']
top[0].reshape(*(1, seg_keep_points_eval, 4)) # seg points
top[1].reshape(*(1, max_voxels, 3)) # Coords
top[2].reshape(*(1, seg_keep_points_eval, 3)) # p2voxel_idx
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
pass
def backward(self, top, propagate_down, bottom):
pass
class LatticeFeature(caffe.Layer):
def setup(self, bottom, top):
params = {}
params.update(eval(self.param_str))
bcl_keep_voxels_eval = params['bcl_keep_voxels_eval']
seg_keep_points_eval = params['seg_keep_points_eval']
is_segmentation = params['segmentation']
# BCL
if is_segmentation:
            top[0].reshape(*(seg_keep_points_eval, 4))  # (V, C=4)
else:
top[0].reshape(*(bcl_keep_voxels_eval,4)) # for pillar
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
pass
def backward(self, top, propagate_down, bottom):
pass
#for point-wise segmentation
class InputKittiData(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
# for point segmentation detection
for example in self.dataloader:
seg_points = example['seg_points']
seg_labels =example['seg_labels']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*seg_points.shape)
top[1].reshape(*seg_labels.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
seg_points = example['seg_points']
seg_labels = example['seg_labels']
# """shuffle car seg points""" #move to preprocess
# indices = np.arange(seg_labels.shape[1])
# np.random.shuffle(indices)
# seg_points = seg_points[:,indices]
# seg_labels = seg_labels[:,indices]
# for point object segmentation
top[0].reshape(*seg_points.shape)
top[1].reshape(*seg_labels.shape)
top[0].data[...] = seg_points
top[1].data[...] = seg_labels
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
try: segmentation = args["segmentation"]
except: segmentation = True
try: bcl_keep_voxels = args["bcl_keep_voxels"]
except: bcl_keep_voxels = 6000
try: seg_keep_points = args["seg_keep_points"]
except: seg_keep_points = 8000
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
segmentation=segmentation,
bcl_keep_voxels=bcl_keep_voxels,
seg_keep_points=seg_keep_points,
multi_gpu=False,
generate_anchors_cachae=args['anchors_cachae']) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
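# Usage sketch: caffe instantiates this layer from a prototxt python_param whose
# param_str is the repr of a dict (paths below are placeholders):
#
#   layer {
#     name: "data" type: "Python" top: "seg_points" top: "seg_labels"
#     python_param {
#       module: "bcl_caffe.layers.bcl_layers"
#       layer: "InputKittiData"
#       param_str: "{'model_dir': './exp', 'config_path': './pipeline.config', 'subset': 'train', 'anchors_cachae': False}"
#     }
#   }
#
# Each forward() then pops one (seg_points, seg_labels) batch from the SECOND dataloader.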
#for voxel-wise object detection
class InputKittiDataV2(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
# for point segmentation detection
for example in self.dataloader:
voxels = example['voxels']
coors = example['coordinates']
labels = example['labels']
reg_targets = example['reg_targets']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*voxels.shape)
top[1].reshape(*coors.shape)
top[2].reshape(*labels.shape)
top[3].reshape(*reg_targets.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
voxels = example['voxels']
coors = example['coordinates']
labels = example['labels']
reg_targets = example['reg_targets']
# for point object segmentation
# top[0].reshape(*voxels.shape)
# top[1].reshape(*coors.shape)
# top[2].reshape(*labels.shape)
# top[3].reshape(*reg_targets.shape)
top[0].data[...] = voxels
top[1].data[...] = coors
top[2].data[...] = labels
top[3].data[...] = reg_targets
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
try: segmentation = args["segmentation"]
except: segmentation = False
try: bcl_keep_voxels = args["bcl_keep_voxels"]
except: bcl_keep_voxels = 6000
try: seg_keep_points = args["seg_keep_points"]
except: seg_keep_points = 8000
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
segmentation=segmentation,
bcl_keep_voxels=bcl_keep_voxels,
seg_keep_points=seg_keep_points,
multi_gpu=False,
generate_anchors_cachae=args['anchors_cachae']) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
#for point-wise object detection & segmentation
class InputKittiDataV3(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.generate_anchors_cachae = params['anchors_cachae'] #True FOR Pillar, False For BCL
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg, self.model_cfg)
# for point segmentation detection
for example in self.dataloader:
points = example['points']
coors = example['coordinates']
labels = example['labels']
reg_targets = example['reg_targets']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*points.shape)
top[1].reshape(*coors.shape)
top[2].reshape(*labels.shape)
top[3].reshape(*reg_targets.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
points = example['points']
coors = example['coordinates']
labels = example['labels']
reg_targets = example['reg_targets']
# for point object segmentation
top[0].reshape(*points.shape)
top[1].reshape(*coors.shape)
top[2].reshape(*labels.shape)
top[3].reshape(*reg_targets.shape)
top[0].data[...] = points
top[1].data[...] = coors
top[2].data[...] = labels
top[3].data[...] = reg_targets
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
multi_gpu=False,
#generate_anchors_cachae=self.generate_anchors_cachae
) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
#for point-wise object detection
class InputKittiDataV4(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params['anchors_cachae']=False #False For BCL, Anchor Free
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
for example in self.dataloader:
points = example['points']
labels = example['labels']
reg_targets = example['reg_targets']
break
self.data_iter = iter(self.dataloader)
top[0].reshape(*points.shape)
top[1].reshape(*labels.shape)
top[2].reshape(*reg_targets.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
points = example['points']
labels = example['labels']
reg_targets = example['reg_targets']
top[0].reshape(*points.shape)
top[1].reshape(*labels.shape)
top[2].reshape(*reg_targets.shape)
top[0].data[...] = points
top[1].data[...] = labels
top[2].data[...] = reg_targets
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
    def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
        # read the optional args as the other InputKittiData variants do; without these
        # lookups, `segmentation` etc. below would raise a NameError (defaults assumed)
        try: segmentation = args["segmentation"]
        except: segmentation = False
        try: bcl_keep_voxels = args["bcl_keep_voxels"]
        except: bcl_keep_voxels = 6000
        try: seg_keep_points = args["seg_keep_points"]
        except: seg_keep_points = 8000
        dataset = input_reader_builder.build(
            input_cfg,
            model_cfg,
            training=True,
            voxel_generator=self.voxel_generator,
            target_assigner=self.target_assigner,
            segmentation=segmentation,
            bcl_keep_voxels=bcl_keep_voxels,
            seg_keep_points=seg_keep_points,
            multi_gpu=False,
            generate_anchors_cachae=args['anchors_cachae']) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
#for seg_feature map
class InputKittiDataV5(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
# for point segmentation detection
for example in self.dataloader:
seg_points = example['seg_points']
seg_labels =example['seg_labels']
labels = example['labels']
reg_targets =example['reg_targets']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*seg_points.shape)
top[1].reshape(*seg_labels.shape)
top[2].reshape(*labels.shape)
top[3].reshape(*reg_targets.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
seg_points = example['seg_points']
seg_labels = example['seg_labels']
labels = example['labels']
reg_targets =example['reg_targets']
# """shuffle car seg points""" #moved to preprocess
# for point object segmentation
top[0].data[...] = seg_points
top[1].data[...] = seg_labels
top[2].data[...] = labels
top[3].data[...] = reg_targets
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
try: segmentation = args["segmentation"]
except: segmentation = True
try: bcl_keep_voxels = args["bcl_keep_voxels"]
except: bcl_keep_voxels = 6000
try: seg_keep_points = args["seg_keep_points"]
except: seg_keep_points = 8000
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
segmentation=segmentation,
bcl_keep_voxels=bcl_keep_voxels,
seg_keep_points=seg_keep_points,
multi_gpu=False,
generate_anchors_cachae=args['anchors_cachae']) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
class InputKittiDataV6(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
# for point segmentation detection
for example in self.dataloader:
seg_points = example['seg_points']
seg_labels =example['seg_labels']
gt_box = example['gt_boxes']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*seg_points.shape)
top[1].reshape(*seg_labels.shape)
top[2].reshape(*gt_box.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
seg_points = example['seg_points']
seg_labels = example['seg_labels']
gt_box = example['gt_boxes']
# """shuffle car seg points""" #moved to preprocess
# for point object segmentation
top[0].data[...] = seg_points
top[1].data[...] = seg_labels
top[2].reshape(*gt_box.shape)
top[2].data[...] = gt_box
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
try: segmentation = args["segmentation"]
except: segmentation = True
try: bcl_keep_voxels = args["bcl_keep_voxels"]
except: bcl_keep_voxels = 6000
try: seg_keep_points = args["seg_keep_points"]
except: seg_keep_points = 8000
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
segmentation=segmentation,
bcl_keep_voxels=bcl_keep_voxels,
seg_keep_points=seg_keep_points,
multi_gpu=False,
generate_anchors_cachae=args['anchors_cachae']) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
class InputKittiDataV7(caffe.Layer):
def setup(self, bottom, top):
params = dict(batch_size=1)
params.update(eval(self.param_str))
model_dir = params['model_dir']
config_path = params['config_path']
self.phase = params['subset']
self.input_cfg, self.eval_input_cfg, self.model_cfg, train_cfg = load_config(model_dir, config_path)
self.voxel_generator, self.target_assigner = build_network(self.model_cfg)
self.dataloader = self.load_dataloader(self.input_cfg, self.eval_input_cfg,
self.model_cfg, args=params)
# for point segmentation detection
for example in self.dataloader:
seg_points = example['seg_points']
seg_labels = example['seg_labels']
coords = example['coords']
p2voxel_idx = example['p2voxel_idx']
cls_labels = example['cls_labels']
reg_targets = example['reg_targets']
break
self.data_iter = iter(self.dataloader)
# for point object segmentation
top[0].reshape(*seg_points.shape)
top[1].reshape(*seg_labels.shape)
top[2].reshape(*coords.shape)
top[3].reshape(*p2voxel_idx.shape)
top[4].reshape(*cls_labels.shape)
top[5].reshape(*reg_targets.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
try:
example = next(self.data_iter)
except Exception as e:
print("\n[info] start a new epoch for {} data\n".format(self.phase))
self.data_iter = iter(self.dataloader)
example = next(self.data_iter)
seg_points = example['seg_points']
seg_labels = example['seg_labels']
coords = example['coords']
p2voxel_idx = example['p2voxel_idx']
cls_labels = example['cls_labels']
reg_targets = example['reg_targets']
# """shuffle car seg points""" #moved to preprocess
# for point object segmentation
top[0].data[...] = seg_points
top[1].data[...] = seg_labels
top[2].data[...] = coords
top[3].data[...] = p2voxel_idx
top[4].data[...] = cls_labels
top[5].data[...] = reg_targets
#print("[debug] train img idx : ", example["metadata"])
def backward(self, top, propagate_down, bottom):
pass
def load_dataloader(self, input_cfg, eval_input_cfg, model_cfg, args):
try: segmentation = args["segmentation"]
except: segmentation = True
try: bcl_keep_voxels = args["bcl_keep_voxels"]
except: bcl_keep_voxels = 6000
try: seg_keep_points = args["seg_keep_points"]
except: seg_keep_points = 8000
try: points_per_voxel = args["points_per_voxel"]
except: points_per_voxel = 200
dataset = input_reader_builder.build(
input_cfg,
model_cfg,
training=True,
voxel_generator=self.voxel_generator,
target_assigner=self.target_assigner,
segmentation=segmentation,
bcl_keep_voxels=bcl_keep_voxels,
seg_keep_points=seg_keep_points,
multi_gpu=False,
generate_anchors_cachae=args['anchors_cachae'],
points_per_voxel=points_per_voxel) #True FOR Pillar, False For BCL
dataloader = torch.utils.data.DataLoader(
dataset,
batch_size=input_cfg.batch_size,
shuffle=True,
num_workers=input_cfg.preprocess.num_workers,
pin_memory=False,
collate_fn=merge_second_batch,
worker_init_fn=_worker_init_fn,
drop_last=not False)
return dataloader
class Scatter(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
output_shape = param['output_shape']
self.ny = output_shape[0]
self.nx = output_shape[1]
self.nchannels = output_shape[2]
self.batch_size = 1
voxel_features = bottom[0].data
voxel_features = np.squeeze(voxel_features) #(1, 64, 1, Voxel) -> (64,Voxel)
coords = bottom[1].data # reverse_index is True, output coordinates will be zyx format
batch_canvas, _ = self.ScatterNet(voxel_features, coords, self.nchannels, self.nx, self.ny)
top[0].reshape(*batch_canvas.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
        voxel_features = bottom[0].data  # (1, 64, 1, V), matching the setup comment above
        voxel_features = np.squeeze(voxel_features)  # (1, 64, 1, V) -> (64, V)
coords = bottom[1].data
batch_canvas, self.indices = self.ScatterNet(voxel_features, coords, self.nchannels, self.nx, self.ny)
top[0].data[...] = batch_canvas
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(self.batch_size, self.nchannels, self.nx * self.ny)[:,:,self.indices]
bottom[0].diff[...] = np.expand_dims(diff, axis=2)
def ScatterNet(self, voxel_features, coords, nchannels, feature_map_x, feature_map_y):
canvas = np.zeros(shape=(nchannels, feature_map_x * feature_map_y)) #(nchannels,-1)
# Only include non-empty pillars
indices = coords[:, 2] * feature_map_x + coords[:, 3]
indices = indices.astype(int)
canvas[:, indices] = voxel_features
canvas = canvas.reshape(self.batch_size, nchannels, feature_map_y, feature_map_x)
return canvas, indices
def Voxel3DStack2D(self, voxel_features, coors):
# coors = np.delete(coors, obj=1, axis=1) #delete z column
coors = coors[:,2:]
voxel_group = npi.group_by(coors) #features mean
coors_idx, voxel_features = voxel_group.mean(voxel_features) #features max
return voxel_features, coors_idx, voxel_group
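# A NumPy sketch of the scatter step (independent of caffe): features of the non-empty
# pillars are written into a dense BEV canvas at their linearized (y, x) positions.
#
#   import numpy as np
#
#   C, H, W = 64, 496, 432                            # channels, canvas height/width (example sizes)
#   feats = np.random.randn(C, 100)                   # 100 non-empty pillars
#   coords = np.random.randint(0, [H, W], (100, 2))   # (y, x) per pillar
#   canvas = np.zeros((C, H * W), dtype=feats.dtype)
#   canvas[:, coords[:, 0] * W + coords[:, 1]] = feats
#   canvas = canvas.reshape(1, C, H, W)               # (B, C, H, W) BEV feature map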
class Point2FeatMap(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
# (1,4,100,100,80)
self.feat_map_size = param['feat_map_size']
self.point_cloud_range = np.array(param['point_cloud_range'])
try: self.use_depth = param['use_depth']
except: self.use_depth = False
try: self.use_score = param['use_score']
except: self.use_score = False
try: self.use_points = param['use_points']
except: self.use_points = False
self.thresh = param['thresh']
self.num_feat = self.feat_map_size[1]
self.num_points = self.feat_map_size[2]
self.feat_h = self.feat_map_size[3]
self.feat_w = self.feat_map_size[4]
self.feat_map_size = np.array(self.feat_map_size)
top[0].reshape(1, self.num_feat*self.num_points, self.feat_h, self.feat_w)
# top[0].reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
# if self.num_feat != 4 and self.num_feat != 5:
# print("[Error] Feature number other than 4 and 5 is not yet implemented")
# raise NotImplementedError
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
points = bottom[0].data[...].squeeze()
point_xy = points[:,:2]
score = bottom[1].data[...].squeeze()
if not self.use_depth:
points = points[:,:3]
if self.use_score:
points = np.concatenate((points, score.reshape(-1,1)), axis = -1)
if len(bottom) > 2:
extra_feat = bottom[2].data[...].squeeze().transpose()
self.extra_feat_shape = extra_feat.shape
points = np.concatenate((points, extra_feat), axis = -1)
if not self.use_points:
points = points[:,3:]
self.p2feat_idx = np.zeros((points.shape[0], 3), dtype=np.int_)
points = points[score>self.thresh,:]
point_xy = point_xy[score>self.thresh,:]
p2feat_idx = self.p2feat_idx[score>self.thresh,:]
# Calculate grid size of feature map
# voxel size of [w, h]
voxel_size = (self.point_cloud_range[3:5]-self.point_cloud_range[:2])/np.array([self.feat_w, self.feat_h])
        # create a feature map of corresponding shape
feat_map = np.zeros((1, self.num_feat, self.num_points, self.feat_h, self.feat_w), dtype=np.float32)
points_in_feat_map = np.zeros((self.feat_h, self.feat_w), dtype=np.int_)
#point to voxel indices (num, h, w)
offset = np.array(self.point_cloud_range[:2])
# Indices (w, h)
indices = np.floor((point_xy-offset)/voxel_size).astype(np.int_)
        # NOTE: points outside the range are not actually filtered here; out-of-range indices would index the map incorrectly
feat_map, p2feat_idx=self.to_feat_map(points, feat_map, indices, points_in_feat_map,
p2feat_idx, self.num_points)
self.p2feat_idx[score>self.thresh,:] = p2feat_idx
feat_map = feat_map.reshape(1, -1, self.feat_h, self.feat_w)
# feat_map = feat_map.reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
top[0].data[...] = feat_map
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(1,self.num_feat,self.num_points,self.feat_h,
self.feat_w)
backward = np.zeros((1,1,1,self.p2feat_idx.shape[0]))
mask = (self.p2feat_idx > 0).any(-1)
indices = self.p2feat_idx[mask]
diff = diff[:,:,indices[:,0],indices[:,1],indices[:,2]].squeeze().transpose()
if len(bottom) > 2:
backward_extra = np.zeros((1,self.extra_feat_shape[1],1,self.extra_feat_shape[0]))
# OPTIMIZE: get rid of two expand_dims
extra_feat_backward = diff[:,-self.extra_feat_shape[1]:].transpose()
extra_feat_backward = np.expand_dims(extra_feat_backward,0)
extra_feat_backward = np.expand_dims(extra_feat_backward,2)
backward_extra[..., mask] = extra_feat_backward
bottom[2].diff[...] = backward_extra
if self.use_score:
backward[..., mask] = diff[:,(-self.extra_feat_shape[1]-1)]
bottom[1].diff[...] = backward
else:
if self.use_score:
backward[..., mask] = diff[:,-1]
bottom[1].diff[...] = backward
@staticmethod
    @njit  # nopython JIT; note prange degrades to range unless njit(parallel=True) is used
def to_feat_map(points, feat_map, indices, points_in_feat_map, p2feat_idx, num_p_feat = 10):
# Indices is (width, height)
for idx in prange(len(indices)):
feat_index = indices[idx]
num = points_in_feat_map[feat_index[1],feat_index[0]]
if num < num_p_feat:
feat_map[:,:,num,feat_index[1],feat_index[0]] = points[idx]
points_in_feat_map[feat_index[1],feat_index[0]] += 1
p2feat_idx[idx,0] = num
p2feat_idx[idx,1] = feat_index[1]
p2feat_idx[idx,2] = feat_index[0]
return feat_map, p2feat_idx
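
# Hedged illustration (helper name is hypothetical): how forward() bins points
# into feature-map cells before to_feat_map scatters them, assuming a 2x2 grid
# over [0, 2) x [0, 2) with unit voxels.
def _demo_point_binning():
    import numpy as np
    point_xy = np.array([[0.5, 0.5], [1.5, 0.5], [0.2, 1.7]])
    voxel_size = np.array([1.0, 1.0])
    offset = np.array([0.0, 0.0])
    indices = np.floor((point_xy - offset) / voxel_size).astype(np.int_)
    # column 0 is the w index, column 1 the h index, as consumed by to_feat_map
    assert indices.tolist() == [[0, 0], [1, 0], [0, 1]]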
#return (B,C,N,H,W)
class Point2FeatMapV3(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
# (1,4,100,100,80)
self.feat_map_size = param['feat_map_size']
self.point_cloud_range = np.array(param['point_cloud_range'])
        self.use_depth = param.get('use_depth', False)
        self.use_score = param.get('use_score', False)
        self.use_points = param.get('use_points', False)
self.thresh = param['thresh']
self.num_feat = self.feat_map_size[1]
self.num_points = self.feat_map_size[2]
self.feat_h = self.feat_map_size[3]
self.feat_w = self.feat_map_size[4]
self.feat_map_size = np.array(self.feat_map_size)
# top[0].reshape(1, self.num_feat*self.num_points, self.feat_h, self.feat_w)
top[0].reshape(1, self.num_feat, self.num_points, self.feat_h* self.feat_w) #leo added to (1,c,n,h*w)
# if self.num_feat != 4 and self.num_feat != 5:
# print("[Error] Feature number other than 4 and 5 is not yet implemented")
# raise NotImplementedError
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
points = bottom[0].data[...].squeeze()
point_xy = points[:,:2]
#score = bottom[1].data[...].squeeze()
if not self.use_depth:
points = points[:,:3]
        if self.use_score:
            # NOTE: this variant reads no score bottom (see the commented-out line above),
            # so enabling use_score would raise a NameError here.
            points = np.concatenate((points, score.reshape(-1, 1)), axis=-1)
if len(bottom) > 1:
extra_feat = bottom[1].data[...].squeeze().transpose()
self.extra_feat_shape = extra_feat.shape
points = np.concatenate((points, extra_feat), axis = -1)
if not self.use_points:
points = points[:,3:]
self.p2feat_idx = np.zeros((points.shape[0], 3), dtype=np.int_)
#points = points[score>self.thresh,:]
#point_xy = point_xy[score>self.thresh,:]
# p2feat_idx = self.p2feat_idx#[score>self.thresh,:]
# Calculate grid size of feature map
# voxel size of [w, h]
voxel_size = (self.point_cloud_range[3:5]-self.point_cloud_range[:2])/np.array([self.feat_w, self.feat_h])
        # create a feature map of corresponding shape
feat_map = np.zeros((1, self.num_feat, self.num_points, self.feat_h, self.feat_w), dtype=np.float32)
points_in_feat_map = np.zeros((self.feat_h, self.feat_w), dtype=np.int_)
#point to voxel indices (num, h, w)
offset = np.array(self.point_cloud_range[:2])
# Indices (w, h)
indices = np.floor((point_xy-offset)/voxel_size).astype(np.int_)
        # remove points and indices that are outside the point-cloud range
feat_map, p2feat_idx=self.to_feat_map(points, feat_map, indices, points_in_feat_map,
self.p2feat_idx, self.num_points)
# self.p2feat_idx[score>self.thresh,:] = p2feat_idx
self.p2feat_idx = p2feat_idx
# feat_map = feat_map.reshape(1, -1, self.feat_h, self.feat_w)
feat_map = feat_map.reshape(1, self.num_feat, self.num_points, self.feat_h* self.feat_w) #leo added to (1,c,n,h*w)
top[0].data[...] = feat_map
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(1,self.num_feat,self.num_points,self.feat_h,
self.feat_w)
#backward = np.zeros((1,1,1,self.p2feat_idx.shape[0]))
mask = (self.p2feat_idx > 0).any(-1)
indices = self.p2feat_idx[mask]
diff = diff[:,:,indices[:,0],indices[:,1],indices[:,2]].squeeze().transpose()
if len(bottom) > 1:
# backward_extra = np.zeros((1,self.extra_feat_shape[1],1,self.extra_feat_shape[0])) #old
# OPTIMIZE: get rid of two expand_dims
extra_feat_backward = diff[:,-self.extra_feat_shape[1]:].transpose()
extra_feat_backward = np.expand_dims(extra_feat_backward,0)
extra_feat_backward = np.expand_dims(extra_feat_backward,2)
# backward_extra[..., mask] = extra_feat_backward #old
# bottom[1].diff[...] = backward_extra #old
#####################Test new backward##############################
bottom[1].diff[...] = 0
bottom[1].diff[..., mask] = extra_feat_backward
#####################Test new backward##############################
            if self.use_score:
                pass  # score gradient not implemented in this variant
        else:
            if self.use_score:
                pass  # score gradient not implemented in this variant
@staticmethod
    @njit  # nopython JIT; note prange degrades to range unless njit(parallel=True) is used
def to_feat_map(points, feat_map, indices, points_in_feat_map, p2feat_idx, num_p_feat = 10):
# Indices is (width, height)
for idx in prange(len(indices)):
feat_index = indices[idx]
num = points_in_feat_map[feat_index[1],feat_index[0]]
if num < num_p_feat:
feat_map[:,:,num,feat_index[1],feat_index[0]] = points[idx]
points_in_feat_map[feat_index[1],feat_index[0]] += 1
p2feat_idx[idx,0] = num
p2feat_idx[idx,1] = feat_index[1]
p2feat_idx[idx,2] = feat_index[0]
return feat_map, p2feat_idx
class Point2FeatMapV2(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
# (1,4,100,100,80)
self.feat_map_size = param['feat_map_size']
self.point_cloud_range = np.array(param['point_cloud_range'])
        self.use_depth = param.get('use_depth', False)
        self.use_score = param.get('use_score', False)
        self.use_points = param.get('use_points', False)
self.thresh = param['thresh']
self.num_feat = self.feat_map_size[1]
self.num_points = self.feat_map_size[2]
self.feat_h = self.feat_map_size[3]
self.feat_w = self.feat_map_size[4]
self.feat_map_size = np.array(self.feat_map_size)
top[0].reshape(1, self.num_feat*self.num_points, self.feat_h, self.feat_w)
# top[0].reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
# if self.num_feat != 4 and self.num_feat != 5:
# print("[Error] Feature number other than 4 and 5 is not yet implemented")
# raise NotImplementedError
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
points = bottom[0].data[...].squeeze()
point_xy = points[:,:2]
#score = bottom[1].data[...].squeeze()
if not self.use_depth:
points = points[:,:3]
        if self.use_score:
            # NOTE: this variant reads no score bottom (see the commented-out line above),
            # so enabling use_score would raise a NameError here.
            points = np.concatenate((points, score.reshape(-1, 1)), axis=-1)
if len(bottom) > 1:
extra_feat = bottom[1].data[...].squeeze().transpose()
self.extra_feat_shape = extra_feat.shape
points = np.concatenate((points, extra_feat), axis = -1)
if not self.use_points:
points = points[:,3:]
self.p2feat_idx = np.zeros((points.shape[0], 3), dtype=np.int_)
#points = points[score>self.thresh,:]
#point_xy = point_xy[score>self.thresh,:]
# p2feat_idx = self.p2feat_idx#[score>self.thresh,:]
# Calculate grid size of feature map
# voxel size of [w, h]
voxel_size = (self.point_cloud_range[3:5]-self.point_cloud_range[:2])/np.array([self.feat_w, self.feat_h])
        # create a feature map of corresponding shape
feat_map = np.zeros((1, self.num_feat, self.num_points, self.feat_h, self.feat_w), dtype=np.float32)
points_in_feat_map = np.zeros((self.feat_h, self.feat_w), dtype=np.int_)
#point to voxel indices (num, h, w)
offset = np.array(self.point_cloud_range[:2])
# Indices (w, h)
indices = np.floor((point_xy-offset)/voxel_size).astype(np.int_)
        # remove points and indices that are outside the point-cloud range
feat_map, p2feat_idx=self.to_feat_map(points, feat_map, indices, points_in_feat_map,
self.p2feat_idx, self.num_points)
# self.p2feat_idx[score>self.thresh,:] = p2feat_idx
self.p2feat_idx = p2feat_idx
feat_map = feat_map.reshape(1, -1, self.feat_h, self.feat_w)
# feat_map = feat_map.reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
top[0].data[...] = feat_map
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(1,self.num_feat,self.num_points,self.feat_h,
self.feat_w)
#backward = np.zeros((1,1,1,self.p2feat_idx.shape[0]))
mask = (self.p2feat_idx > 0).any(-1)
indices = self.p2feat_idx[mask]
diff = diff[:,:,indices[:,0],indices[:,1],indices[:,2]].squeeze().transpose()
if len(bottom) > 1:
# backward_extra = np.zeros((1,self.extra_feat_shape[1],1,self.extra_feat_shape[0])) #old
# OPTIMIZE: get rid of two expand_dims
extra_feat_backward = diff[:,-self.extra_feat_shape[1]:].transpose()
extra_feat_backward = np.expand_dims(extra_feat_backward,0)
extra_feat_backward = np.expand_dims(extra_feat_backward,2)
# backward_extra[..., mask] = extra_feat_backward #old
# bottom[1].diff[...] = backward_extra #old
#####################Test new backward##############################
bottom[1].diff[...] = 0
bottom[1].diff[..., mask] = extra_feat_backward
#####################Test new backward##############################
            if self.use_score:
                pass  # score gradient not implemented in this variant
        else:
            if self.use_score:
                pass  # score gradient not implemented in this variant
@staticmethod
    @njit  # nopython JIT; note prange degrades to range unless njit(parallel=True) is used
def to_feat_map(points, feat_map, indices, points_in_feat_map, p2feat_idx, num_p_feat = 10):
# Indices is (width, height)
for idx in prange(len(indices)):
feat_index = indices[idx]
num = points_in_feat_map[feat_index[1],feat_index[0]]
if num < num_p_feat:
feat_map[:,:,num,feat_index[1],feat_index[0]] = points[idx]
points_in_feat_map[feat_index[1],feat_index[0]] += 1
p2feat_idx[idx,0] = num
p2feat_idx[idx,1] = feat_index[1]
p2feat_idx[idx,2] = feat_index[0]
return feat_map, p2feat_idx
class Point2FeatMapV4(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
# (1,4,100,100,80)
self.feat_map_size = param['feat_map_size']
self.point_cloud_range = np.array(param['point_cloud_range'])
        self.use_depth = param.get('use_depth', False)
        self.use_score = param.get('use_score', False)
        self.use_points = param.get('use_points', False)
self.thresh = param['thresh']
self.num_feat = self.feat_map_size[1]
self.num_points = self.feat_map_size[2]
self.feat_h = self.feat_map_size[3]
self.feat_w = self.feat_map_size[4]
self.feat_map_size = np.array(self.feat_map_size)
self.batch_size = bottom[1].data.shape[0]
top[0].reshape(self.batch_size, self.num_feat*self.num_points, self.feat_h, self.feat_w)
# top[0].reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
# if self.num_feat != 4 and self.num_feat != 5:
# print("[Error] Feature number other than 4 and 5 is not yet implemented")
# raise NotImplementedError
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
points = bottom[0].data[...]
point_xy = points[:,:,:2]
#score = bottom[1].data[...].squeeze()
if not self.use_depth:
points = points[:,:,:3]
        if self.use_score:
            # NOTE: this variant reads no score bottom (see the commented-out line above),
            # so enabling use_score would raise a NameError here.
            points = np.concatenate((points, score.reshape(-1, 1)), axis=-1)
if len(bottom) > 1:
extra_feat = bottom[1].data[...].squeeze(2).transpose(0,2,1)
self.extra_feat_shape = extra_feat.shape
points = np.concatenate((points, extra_feat), axis = -1)
if not self.use_points:
points = points[:,:,3:]
self.p2feat_idx = np.zeros((self.batch_size,points.shape[1], 3), dtype=np.int_)
#points = points[score>self.thresh,:]
#point_xy = point_xy[score>self.thresh,:]
# p2feat_idx = self.p2feat_idx#[score>self.thresh,:]
# Calculate grid size of feature map
# voxel size of [w, h]
voxel_size = (self.point_cloud_range[3:5]-self.point_cloud_range[:2])/np.array([self.feat_w, self.feat_h])
        # create a feature map of corresponding shape
feat_map = np.zeros((self.batch_size, self.num_feat, self.num_points, self.feat_h, self.feat_w), dtype=np.float32)
points_in_feat_map = np.zeros((self.batch_size, self.feat_h, self.feat_w), dtype=np.int_)
#point to voxel indices (num, h, w)
offset = np.array(self.point_cloud_range[:2])
# Indices (w, h)
indices = np.floor((point_xy-offset)/voxel_size).astype(np.int_)
        # remove points and indices that are outside the point-cloud range
feat_map, p2feat_idx=self.to_feat_map(points, feat_map, indices, points_in_feat_map,
self.p2feat_idx, self.num_points)
# self.p2feat_idx[score>self.thresh,:] = p2feat_idx
self.p2feat_idx = p2feat_idx
feat_map = feat_map.reshape(self.batch_size, -1, self.feat_h, self.feat_w)
# feat_map = feat_map.reshape(1, self.num_feat, self.num_points, self.feat_h*self.feat_w) #leo added to (1,c,n,h*w)
top[0].data[...] = feat_map
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(self.batch_size,self.num_feat,self.num_points,self.feat_h,
self.feat_w)
bottom[1].diff[...] = 0
for batch in range(self.batch_size):
#backward = np.zeros((1,1,1,self.p2feat_idx.shape[0]))
mask = (self.p2feat_idx[batch,...] > 0).any(-1)
indices = self.p2feat_idx[batch, mask]
diff_ = diff[batch,:,indices[:,0],indices[:,1],indices[:,2]].squeeze().transpose()
if len(bottom) > 1:
# backward_extra = np.zeros((1,self.extra_feat_shape[1],1,self.extra_feat_shape[0])) #old
# OPTIMIZE: get rid of two expand_dims
extra_feat_backward = diff_[:,-self.extra_feat_shape[1]:].transpose()
# extra_feat_backward = np.expand_dims(extra_feat_backward,0)
# print("extra_feat_shape", extra_feat_backward.shape)
extra_feat_backward = np.expand_dims(extra_feat_backward,-1)
# backward_extra[..., mask] = extra_feat_backward #old
# bottom[1].diff[...] = backward_extra #old
#####################Test new backward##############################
bottom[1].diff[batch,:,:,mask] = extra_feat_backward
#####################Test new backward##############################
                if self.use_score:
                    continue  # score gradient not implemented in this variant
            else:
                if self.use_score:
                    continue  # score gradient not implemented in this variant
    @staticmethod
    @njit  # nopython JIT; note prange degrades to range unless njit(parallel=True) is used
def to_feat_map(points, feat_map, indices, points_in_feat_map, p2feat_idx, num_p_feat = 10):
# Indices is (width, height)
for batch in prange(indices.shape[0]):
for idx in prange(indices.shape[1]):
feat_index = indices[batch,idx]
num = points_in_feat_map[batch,feat_index[1],feat_index[0]]
if num < num_p_feat:
feat_map[batch,:,num,feat_index[1],feat_index[0]] = points[batch,idx]
points_in_feat_map[batch,feat_index[1],feat_index[0]] += 1
p2feat_idx[batch,idx,0] = num
p2feat_idx[batch,idx,1] = feat_index[1]
p2feat_idx[batch,idx,2] = feat_index[0]
return feat_map, p2feat_idx
class Point2Voxel3D(caffe.Layer):
def setup(self, bottom, top):
param = eval(self.param_str)
self.extra_feat_shape = bottom[0].data.shape
self.p2voxel_idx_shape = bottom[1].data.shape
self.max_voxels = param['max_voxels']
self.points_per_voxel = param['points_per_voxel']
top[0].reshape(1, self.points_per_voxel*self.extra_feat_shape[1], 1, self.max_voxels)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
extra_feat = bottom[0].data[...]
p2voxel_idx = bottom[1].data[...].astype(np.int_)
voxels = np.zeros((1, self.extra_feat_shape[1], self.points_per_voxel, self.max_voxels))
num = p2voxel_idx[:,:,0].squeeze()
voxel_idx = p2voxel_idx[:,:,1].squeeze()
point_idx = p2voxel_idx[:,:,2].squeeze()
voxels[:,:,num,voxel_idx] = extra_feat[...,point_idx].squeeze()
voxels = np.expand_dims(voxels.reshape(1,-1,self.max_voxels), 2)
top[0].reshape(1, self.points_per_voxel*self.extra_feat_shape[1], 1, self.max_voxels)
top[0].data[...] = voxels
def backward(self, top, propagate_down, bottom):
diff = top[0].diff.reshape(1, self.extra_feat_shape[1], self.points_per_voxel, self.max_voxels)
p2voxel_idx = bottom[1].data[...].astype(np.int_)
num = p2voxel_idx[:,:,0].squeeze()
voxel_idx = p2voxel_idx[:,:,1].squeeze()
point_idx = p2voxel_idx[:,:,2].squeeze()
diff = diff[:, :, num, voxel_idx]
backward = np.zeros(bottom[0].data.shape)
backward[..., point_idx] = np.expand_dims(diff, 2)
bottom[0].diff[...] = backward
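
# Hedged sketch of Point2Voxel3D's gather (illustrative helper): p2voxel_idx
# rows are (slot-in-voxel, voxel index, point index), and features are
# scattered into (1, C, slots, voxels) with fancy indexing.
def _demo_voxel_gather():
    import numpy as np
    feats = np.arange(6.0).reshape(1, 2, 3)         # (1, C=2, N=3) point features
    p2voxel_idx = np.array([[0, 0, 1], [0, 1, 2]])  # (num, voxel, point) per row
    voxels = np.zeros((1, 2, 1, 2))                 # (1, C, slots=1, voxels=2)
    num, vox, pt = p2voxel_idx[:, 0], p2voxel_idx[:, 1], p2voxel_idx[:, 2]
    voxels[:, :, num, vox] = feats[..., pt]
    assert voxels[0, :, 0, 0].tolist() == [1.0, 4.0]  # both channels of point 1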
class SegWeight(caffe.Layer):
def setup(self, bottom, top):
labels = bottom[0].data
seg_weights = self.prepare_loss_weights(labels)
top[0].reshape(*seg_weights.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
labels = bottom[0].data
seg_weights = self.prepare_loss_weights(labels)
top[0].data[...] = seg_weights
def prepare_loss_weights(self,
labels,
pos_cls_weight=1.0,
neg_cls_weight=1.0,
dtype="float32"):
positives = labels > 0
negatives = labels == 0
negative_cls_weights = negatives.astype(dtype) * neg_cls_weight
        positive_cls_weights = positives.astype(dtype) * pos_cls_weight
        seg_weights = negative_cls_weights + positive_cls_weights
reg_weights = positives.astype(dtype)
pos_normalizer = np.sum(positives, 1, keepdims=True).astype(dtype)
seg_weights /= np.clip(pos_normalizer, a_min=1.0, a_max=None) #(1, 107136)
return seg_weights
def backward(self, top, propagate_down, bottom):
pass
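
# Hedged numeric check (illustrative helper): with pos_cls_weight ==
# neg_cls_weight == 1, SegWeight gives every entry 1/num_positives per row.
def _demo_seg_weight_normalization():
    import numpy as np
    labels = np.array([[1, 0, 2, 0]])
    positives = labels > 0
    seg_weights = np.ones_like(labels, dtype="float32")
    pos_normalizer = np.clip(positives.sum(1, keepdims=True).astype("float32"), 1.0, None)
    seg_weights /= pos_normalizer
    assert np.allclose(seg_weights, 0.5)  # two positives in the row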
class PrepareLossWeight(caffe.Layer):
def setup(self, bottom, top):
labels = bottom[0].data
cls_weights, reg_weights, cared = self.prepare_loss_weights(labels)
top[0].reshape(*cared.shape)
top[1].reshape(*reg_weights.shape) #reg_outside_weights
top[2].reshape(*cls_weights.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
labels = bottom[0].data
cls_weights, reg_weights, cared = self.prepare_loss_weights(labels)
top[0].data[...] = cared
top[1].data[...] = reg_weights #reg_outside_weights
top[2].data[...] = cls_weights
def prepare_loss_weights(self,
labels,
pos_cls_weight=1.0, # TODO: pass params here
neg_cls_weight=1.0,
loss_norm_type=LossNormType.NormByNumPositives,
dtype="float32"):
"""get cls_weights and reg_weights from labels.
"""
cared = labels >= 0
# print("label ", np.unique(labels, return_counts=True))
# cared: [N, num_anchors]
positives = labels > 0
negatives = labels == 0
negative_cls_weights = negatives.astype(dtype) * neg_cls_weight
        positive_cls_weights = positives.astype(dtype) * pos_cls_weight  # (1, 107136)
        cls_weights = negative_cls_weights + positive_cls_weights
reg_weights = positives.astype(dtype)
if loss_norm_type == LossNormType.NormByNumExamples:
num_examples = cared.astype(dtype).sum(1, keepdims=True)
num_examples = np.clip(num_examples, a_min=1.0, a_max=None)
cls_weights /= num_examples
bbox_normalizer = np.sum(positives, 1, keepdims=True).astype(dtype)
reg_weights /= np.clip(bbox_normalizer, a_min=1.0, a_max=None)
elif loss_norm_type == LossNormType.NormByNumPositives: # for focal loss
pos_normalizer = np.sum(positives, 1, keepdims=True).astype(dtype)
reg_weights /= np.clip(pos_normalizer, a_min=1.0, a_max=None) #(1, 107136)
cls_weights /= np.clip(pos_normalizer, a_min=1.0, a_max=None) #(1, 107136)
elif loss_norm_type == LossNormType.NormByNumPosNeg:
            pos_neg = np.stack([positives, negatives], axis=-1).astype(dtype)  # fixed: 'a_min' is not a stack kwarg
normalizer = np.sum(pos_neg, 1, keepdims=True) # [N, 1, 2]
cls_normalizer = np.sum((pos_neg * normalizer),-1) # [N, M]
cls_normalizer = np.clip(cls_normalizer, a_min=1.0, a_max=None)
# cls_normalizer will be pos_or_neg_weight/num_pos_or_neg
normalizer = np.clip(normalizer, a_min=1.0, a_max=None)
reg_weights /= normalizer[:, 0:1, 0]
cls_weights /= cls_normalizer
else:
            raise ValueError(
                f"unknown loss norm type. available: {list(LossNormType)}")
return cls_weights, reg_weights, cared
def backward(self, top, propagate_down, bottom):
pass
#For Point-Wise model
class PrepareLossWeightV2(caffe.Layer):
def setup(self, bottom, top):
labels = bottom[0].data
cls_weights, reg_weights = self.prepare_loss_weights(labels)
top[0].reshape(*reg_weights.shape) #reg_outside_weights
top[1].reshape(*cls_weights.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
labels = bottom[0].data
cls_weights, reg_weights = self.prepare_loss_weights(labels)
top[0].data[...] = reg_weights #reg_outside_weights
top[1].data[...] = cls_weights
def prepare_loss_weights(self,
labels,
pos_cls_weight=1.0,
neg_cls_weight=1.0,
loss_norm_type=LossNormType.NormByNumPositives,
dtype="float32"):
# print("label ", np.unique(labels, return_counts=True))
positives = labels > 0
negatives = labels == 0
negative_cls_weights = negatives.astype(dtype) * neg_cls_weight
        positive_cls_weights = positives.astype(dtype) * pos_cls_weight  # (1, 107136)
        cls_weights = negative_cls_weights + positive_cls_weights
reg_weights = positives.astype(dtype)
if loss_norm_type == LossNormType.NormByNumExamples:
            # 'cared' is not defined in this variant; derive it from the labels
            num_examples = (labels >= 0).astype(dtype).sum(1, keepdims=True)
num_examples = np.clip(num_examples, a_min=1.0, a_max=None)
cls_weights /= num_examples
bbox_normalizer = np.sum(positives, 1, keepdims=True).astype(dtype)
reg_weights /= np.clip(bbox_normalizer, a_min=1.0, a_max=None)
elif loss_norm_type == LossNormType.NormByNumPositives: # for focal loss
pos_normalizer = np.sum(positives, 1, keepdims=True).astype(dtype)
reg_weights /= np.clip(pos_normalizer, a_min=1.0, a_max=None) #(1, 107136)
cls_weights /= np.clip(pos_normalizer, a_min=1.0, a_max=None) #(1, 107136)
elif loss_norm_type == LossNormType.NormByNumPosNeg:
            pos_neg = np.stack([positives, negatives], axis=-1).astype(dtype)  # fixed: 'a_min' is not a stack kwarg
normalizer = np.sum(pos_neg, 1, keepdims=True) # [N, 1, 2]
cls_normalizer = np.sum((pos_neg * normalizer),-1) # [N, M]
cls_normalizer = np.clip(cls_normalizer, a_min=1.0, a_max=None)
# cls_normalizer will be pos_or_neg_weight/num_pos_or_neg
normalizer = np.clip(normalizer, a_min=1.0, a_max=None)
reg_weights /= normalizer[:, 0:1, 0]
cls_weights /= cls_normalizer
else:
            raise ValueError(
                f"unknown loss norm type. available: {list(LossNormType)}")
return cls_weights, reg_weights
def backward(self, top, propagate_down, bottom):
pass
class LabelEncode(caffe.Layer):
def setup(self, bottom, top):
labels = bottom[0].data
cared = bottom[1].data
cls_targets = labels * cared # (1, 107136)
cls_targets = cls_targets.astype(int)
self.num_class = 1
one_hot_targets = np.eye(self.num_class+1)[cls_targets] #One_hot label -- make sure one hot class is <num_class+1>
one_hot_targets = one_hot_targets[..., 1:]
top[0].reshape(*one_hot_targets.shape) #reshape to caffe pattern
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
labels = bottom[0].data # (1, 107136)
cared = bottom[1].data
cls_targets = labels * cared
cls_targets = cls_targets.astype(int)
one_hot_targets = np.eye(self.num_class+1)[cls_targets] #One_hot label -- make sure one hot class is <num_class+1>
one_hot_targets = one_hot_targets[..., 1:]
top[0].data[...] = one_hot_targets
def backward(self, top, propagate_down, bottom):
pass
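
# Hedged sketch of LabelEncode's encoding (illustrative helper): np.eye one-hots
# the integer targets, then the background column 0 is dropped.
def _demo_one_hot_targets():
    import numpy as np
    cls_targets = np.array([[0, 1, 1, 0]])
    one_hot = np.eye(2)[cls_targets][..., 1:]  # num_class == 1, so eye(2)
    assert one_hot.squeeze(-1).tolist() == [[0.0, 1.0, 1.0, 0.0]]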
#For Point-Wise model
class LabelEncodeV2(caffe.Layer):
def setup(self, bottom, top):
labels = bottom[0].data
labels = labels.astype(int)
labels = np.expand_dims(labels,-1)
top[0].reshape(*labels.shape) #reshape to caffe pattern
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
labels = bottom[0].data # (1, 107136)
labels = labels.astype(int)
labels = np.expand_dims(labels,-1)
top[0].data[...] = labels
def backward(self, top, propagate_down, bottom):
pass
class WeightFocalLoss(caffe.Layer):
def setup(self, bottom, top):
params = eval(self.param_str)
self.gamma = int(params['focusing_parameter'])
self.alpha = params['alpha']
self.batch_size = bottom[0].data.shape[0]
def reshape(self, bottom, top):
# check input dimensions match
# if bottom[0].num != bottom[1].num:
        #     raise Exception("Inferred scores and labels must have the same dimension.")
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.cls_weights = bottom[2].data
self.cls_weights = np.expand_dims(self.cls_weights,-1)
        log1p = np.log1p(np.exp(-np.abs(self._p)))  # stable log(1 + exp(-|logit|)) term
self._p_t = 1 / (1 + np.exp(-self._p)) # Compute sigmoid activations
self.first = (1-self.label) * (1-self.alpha) + self.label * self.alpha
self.second = (1-self.label) * ((self._p_t) ** self.gamma) + self.label * ((1 - self._p_t) ** self.gamma)
self.sigmoid_cross_entropy = (1-self.label) * (log1p + np.clip(self._p, a_min=0, a_max=None)) + \
self.label * (log1p - np.clip(self._p, a_min=None, a_max=0))
logprobs = ((1-self.label) * self.first * self.second * self.sigmoid_cross_entropy) + \
(self.label * self.first * self.second * self.sigmoid_cross_entropy)
top[0].data[...] = np.sum(logprobs*self.cls_weights) / self.batch_size
def backward(self, top, propagate_down, bottom):
dev_log1p = np.sign(self._p) * (1 / (np.exp(np.abs(self._p))+1)) # might fix divided by 0 x/|x| bug
self.dev_sigmoid_cross_entropy = (1-self.label) * (dev_log1p - np.where(self._p<=0, 0, 1)) + \
self.label * (dev_log1p + np.where(self._p>=0, 0, 1))
delta = (1-self.label) * (self.first * self.second * (self.gamma * (1-self._p_t) * self.sigmoid_cross_entropy - self.dev_sigmoid_cross_entropy)) + \
self.label * (-self.first * self.second * (self.gamma * self._p_t * self.sigmoid_cross_entropy + self.dev_sigmoid_cross_entropy))
bottom[0].diff[...] = delta * self.cls_weights / self.batch_size
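
# Hedged numeric check (illustrative helper): the log1p/clip formulation used by
# WeightFocalLoss equals the textbook sigmoid cross-entropy
# -[y*log(p) + (1-y)*log(1-p)] while staying finite for large |logit|.
def _demo_stable_sigmoid_ce():
    import numpy as np
    x = np.array([-30.0, -1.0, 0.0, 2.0, 30.0])
    y = np.array([0.0, 1.0, 1.0, 0.0, 1.0])
    log1p = np.log1p(np.exp(-np.abs(x)))
    stable = (1 - y) * (log1p + np.clip(x, 0, None)) + y * (log1p - np.clip(x, None, 0))
    p = 1.0 / (1.0 + np.exp(-x))
    naive = -(y * np.log(p) + (1 - y) * np.log(1 - p))
    assert np.allclose(stable, naive)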
class WeightedSmoothL1Loss(caffe.Layer):
def setup(self, bottom, top):
self.sigma = 3
self.encode_rad_error_by_sin = True
self.batch_size = bottom[0].data.shape[0]
def reshape(self, bottom, top):
# check input dimensions match
# if bottom[0].num != bottom[1].num:
        #     raise Exception("Inferred scores and labels must have the same dimension.")
top[0].reshape(1)
def forward(self, bottom, top):
box_preds = bottom[0].data
reg_targets = bottom[1].data
self.reg_weights = bottom[2].data
self.reg_weights = np.expand_dims(self.reg_weights,-1)
self.diff = box_preds - reg_targets
#use sin_difference rad to sin
if self.encode_rad_error_by_sin:
diff_rot = self.diff[...,-1:].copy() #copy rotation without add sin
self.sin_diff = np.sin(diff_rot)
self.cos_diff = np.cos(diff_rot)
self.diff[...,-1] = np.sin(self.diff[...,-1]) #use sin_difference
self.abs_diff = np.abs(self.diff)
#change from less than to less or equal
self.cond = self.abs_diff <= (1/(self.sigma**2))
loss = np.where(self.cond, 0.5 * self.sigma**2 * self.abs_diff**2,
self.abs_diff - 0.5/self.sigma**2)
reg_loss = loss * self.reg_weights
top[0].data[...] = np.sum(reg_loss) / self.batch_size # * 2
def backward(self, top, propagate_down, bottom):
if self.encode_rad_error_by_sin:
delta = np.where(self.cond[...,:-1], (self.sigma**2) * self.diff[...,:-1], np.sign(self.diff[...,:-1]))
delta_rotation = np.where(self.cond[...,-1:], (self.sigma**2) * self.sin_diff * self.cos_diff,
                                      np.sign(self.sin_diff) * self.cos_diff)  # note: np.sign(0) == 0, which zeroes the gradient there
delta = np.concatenate([delta, delta_rotation], axis=-1)
else:
delta = np.where(self.cond, (self.sigma**2) * self.diff, np.sign(self.diff))
bottom[0].diff[...] = delta * self.reg_weights / self.batch_size# * 2
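
# Hedged sketch of the piecewise loss above (illustrative helper): with sigma=3,
# 0.5*sigma^2*d^2 applies for |d| <= 1/sigma^2, else |d| - 0.5/sigma^2.
def _demo_smooth_l1():
    import numpy as np
    sigma = 3.0
    d = np.array([0.05, 1.0])
    cond = np.abs(d) <= 1.0 / sigma ** 2
    loss = np.where(cond, 0.5 * sigma ** 2 * d ** 2, np.abs(d) - 0.5 / sigma ** 2)
    assert np.allclose(loss, [0.5 * 9 * 0.0025, 1.0 - 0.5 / 9])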
class FocalLoss(caffe.Layer):
def setup(self, bottom, top):
params = eval(self.param_str)
self.gamma = int(params['focusing_parameter'])
self.alpha = params['alpha']
def reshape(self, bottom, top):
# check input dimensions match
# if bottom[0].num != bottom[1].num:
        #     raise Exception("Inferred scores and labels must have the same dimension.")
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
        log1p = np.log1p(np.exp(-np.abs(self._p)))  # stable log(1 + exp(-|logit|)) term
self._p_t = 1 / (1 + np.exp(-self._p)) # Compute sigmoid activations
self.first = (1-self.label) * (1-self.alpha) + self.label * self.alpha
self.second = (1-self.label) * (self._p_t ** self.gamma) + self.label * ((1 - self._p_t) ** self.gamma)
self.sigmoid_cross_entropy = (1-self.label) * (log1p + np.clip(self._p, a_min=0, a_max=None)) + \
self.label * (log1p - np.clip(self._p, a_min=None, a_max=0))
logprobs = ((1-self.label) * self.first * self.second * self.sigmoid_cross_entropy) + \
(self.label * self.first * self.second * self.sigmoid_cross_entropy)
top[0].data[...] = np.mean(logprobs)
def backward(self, top, propagate_down, bottom):
dev_log1p = np.sign(self._p) * (1 / (np.exp(np.abs(self._p))+1)) # might fix divided by 0 x/|x| bug
self.dev_sigmoid_cross_entropy = (1-self.label) * (dev_log1p - np.where(self._p<=0, 0, 1)) + \
self.label * (dev_log1p + np.where(self._p>=0, 0, 1))
delta = (1-self.label) * (self.first * self.second * (self.gamma * (1-self._p_t) * self.sigmoid_cross_entropy - self.dev_sigmoid_cross_entropy)) + \
self.label * (-self.first * self.second * (self.gamma * self._p_t * self.sigmoid_cross_entropy + self.dev_sigmoid_cross_entropy))
bottom[0].diff[...] = delta
class DiceLoss(caffe.Layer):
def setup(self, bottom, top):
params = eval(self.param_str)
self.belta = params['belta'] #0.5
self.alpha = params['alpha'] #0.5
self.eps = 1e-5
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.tp = self._p * self.label
self.fn = (1- self._p ) * self.label
self.fp = self._p * (1 - self.label)
self.union = self.tp + self.alpha * self.fn + self.belta * self.fp
logprobs = (np.sum(self.tp) + self.eps) / (np.sum(self.union) + self.eps)
top[0].data[...] = 1 - logprobs
def backward(self, top, propagate_down, bottom):
delta = self.alpha * np.square(self.label) / (np.square(self.union) + self.eps)
bottom[0].diff[...] = delta
#for v-net paper
class DiceLossV2(caffe.Layer):
def setup(self, bottom, top):
self.eps = 1e-5
self.smooth = 1
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.inter = np.sum(self._p * self.label)
self.union = np.sum(self._p + self.label)
logprobs = (2 * self.inter + self.smooth) / (self.union + self.smooth)
top[0].data[...] = logprobs
def backward(self, top, propagate_down, bottom):
delta = (self.label * (self.union) - 2 * self._p * (self.inter)) / (np.square(self.union) + self.eps)
bottom[0].diff[...] = 2 * delta
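
# Hedged illustration of DiceLossV2's coefficient (illustrative helper): a
# perfect prediction makes (2*inter + smooth)/(union + smooth) equal 1.0, which
# is why the forward pass emits the coefficient itself rather than 1 - dice.
def _demo_dice_coefficient():
    import numpy as np
    p = np.array([1.0, 0.0, 1.0])
    g = p.copy()
    smooth = 1
    dice = (2 * np.sum(p * g) + smooth) / (np.sum(p + g) + smooth)
    assert np.isclose(dice, 1.0)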
class DiceLossV3(caffe.Layer):
def setup(self, bottom, top):
# params = eval(self.param_str)
# self.belta = params['belta'] #0.5
# self.alpha = params['alpha'] #0.5
self.eps = 1e-5
self.smooth = 1
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.tp = self._p * self.label
self.union = self._p + self.label
logprobs = (2 * np.sum(self.tp) + self.smooth) / (np.sum(self.union) + self.smooth)
top[0].data[...] = logprobs
def backward(self, top, propagate_down, bottom):
delta = 2 * np.square(self.label) / (np.square(self.union) + self.eps)
bottom[0].diff[...] = delta
class IoUSegLoss(caffe.Layer):
def setup(self, bottom, top):
# params = eval(self.param_str)
# self.belta = params['belta'] #0.5
# self.alpha = params['alpha'] #0.5
self.eps = 1e-5
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.inter = self._p * self.label
self.union = self._p + self.label - self.inter
self.iou = self.inter/self.union
logprobs = (np.sum(self.inter) + self.eps) / (np.sum(self.union) + self.eps)
top[0].data[...] = 1 - logprobs
def backward(self, top, propagate_down, bottom):
delta = np.where(self.label>0, -1/(self.union + self.eps), self.inter/(np.square(self.union)+ self.eps))
bottom[0].diff[...] = delta
class DiceFocalLoss(caffe.Layer):
def setup(self, bottom, top):
params = eval(self.param_str)
self.gamma = int(params['focusing_parameter']) #2
self.alpha = params['alpha'] #0.25
self.dice_belta = params['dice_belta'] #0.5
self.dice_alpha = params['dice_alpha'] #0.5
        self.lamda = params['lamda']  # trade-off between focal and dice loss, e.g. 0.1, 0.5, 1
        self.eps = 1e-5  # added: forward/backward below divide by (union + eps)
def reshape(self, bottom, top):
# check input dimensions match
# if bottom[0].num != bottom[1].num:
        #     raise Exception("Inferred scores and labels must have the same dimension.")
top[0].reshape(1)
def forward(self, bottom, top):
self._p = bottom[0].data
self.label = bottom[1].data
self.c = len(np.unique(self.label)) #no background
####################################Focal loss##########################
self._p_t = 1 / (1 + np.exp(-self._p)) # Compute sigmoid activations
self.first = (1-self.label) * (1-self.alpha) + self.label * self.alpha
self.second = (1-self.label) * ((self._p_t) ** self.gamma) + self.label * ((1 - self._p_t) ** self.gamma)
log1p = np.log1p(np.exp(-np.abs(self._p)))
self.sigmoid_cross_entropy = (1-self.label) * (log1p + np.clip(self._p, a_min=0, a_max=None)) + \
self.label * (log1p - np.clip(self._p, a_min=None, a_max=0))
focal = ((1-self.label) * self.first * self.second * self.sigmoid_cross_entropy) + \
(self.label * self.first * self.second * self.sigmoid_cross_entropy)
focal = np.mean(focal)
########################################Dice############################
self.tp = np.sum(self._p * self.label)
self.fn = np.sum((1- self._p ) * self.label)
self.fp = np.sum(self._p * (1 - self.label))
        # fixed: use the dice-specific weights from setup; 'self.belta' was never
        # defined and 'self.alpha' is the focal-loss alpha
        self.union = self.tp + self.dice_alpha * self.fn + self.dice_belta * self.fp
        dice = self.tp / (self.union + self.eps)
top[0].data[...] = self.c - dice - self.lamda * focal #average fl
def backward(self, top, propagate_down, bottom):
dev_log1p = np.sign(self._p) * (1 / (np.exp(np.abs(self._p))+1)) # might fix divided by 0 x/|x| bug
self.dev_sigmoid_cross_entropy = (1-self.label) * (dev_log1p - np.where(self._p<=0, 0, 1)) + \
self.label * (dev_log1p + np.where(self._p>=0, 0, 1))
focal_delta = (1-self.label) * (self.first * self.second * (self.gamma * (1-self._p_t) * self.sigmoid_cross_entropy - self.dev_sigmoid_cross_entropy)) + \
self.label * (-self.first * self.second * (self.gamma * self._p_t * self.sigmoid_cross_entropy + self.dev_sigmoid_cross_entropy))
########################################Dice############################
dev_tp = np.sum(self.label)
dev_fn = np.sum(-self.label)
dev_fp = np.sum(1-self.label)
        dice_delta = (self.tp * (dev_tp + self.dice_alpha * dev_fn + self.dice_belta * dev_fp) - dev_tp * self.union) / ((self.union)**2 + self.eps)
delta = -(dice_delta + self.lamda * focal_delta)
bottom[0].diff[...] = delta
class IoULoss(caffe.Layer):
def setup(self, bottom, top):
# params = eval(self.param_str)
self.eps = 1e-5
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
pred = bottom[0].data
gt_box = bottom[1].data
self.points_label = bottom[2].data
self.reg_weights = bottom[3].data
self.reg_weights = np.expand_dims(self.reg_weights,-1)
points = bottom[4].data[...,:3]
pred = pred * self.points_label #if label==0 do not count iou
self.pred_up = pred[..., 5:6]
self.pred_down = pred[..., 2:3]
self.pred_fwd = pred[..., 3:4]
self.pred_bwd = pred[..., 0:1]
self.pred_right = pred[..., 4:5]
self.pred_left = pred[..., 1:2]
self.gt_up = gt_box[..., 5:6]
self.gt_down = gt_box[..., 2:3]
self.gt_fwd = gt_box[..., 3:4]
self.gt_bwd = gt_box[..., 0:1]
self.gt_right = gt_box[..., 4:5]
self.gt_left = gt_box[..., 1:2]
pred_min_points = points - pred[..., :3]
pred_max_points = points + pred[..., 3:-1]
gt_min_points = points - gt_box[..., :3]
gt_max_points = points + gt_box[..., 3:-1]
pred_area = np.abs((self.pred_up + self.pred_down) * (self.pred_fwd + self.pred_bwd) * (self.pred_right + self.pred_left))
# pred_area = np.prod(pred_max_points - pred_min_points, axis = -1)
gt_area = (self.gt_up + self.gt_down) * (self.gt_fwd + self.gt_bwd) * (self.gt_right + self.gt_left)
# self.inter_h = np.minimum(self.pred_up, self.gt_up) + np.minimum(self.pred_down, self.gt_down)
# self.inter_w = np.minimum(self.pred_fwd, self.gt_fwd) + np.minimum(self.pred_bwd, self.gt_bwd)
# self.inter_l = np.minimum(self.pred_right, self.gt_right) + np.minimum(self.pred_left, self.gt_left)
h_pred_max = np.maximum(pred_max_points[..., 2:], pred_min_points[..., 2:])
h_pred_min = np.minimum(pred_max_points[..., 2:], pred_min_points[..., 2:])
w_pred_max = np.maximum(pred_max_points[..., 0:1], pred_min_points[..., 0:1])
w_pred_min = np.minimum(pred_max_points[..., 0:1], pred_min_points[..., 0:1])
l_pred_max = np.maximum(pred_max_points[..., 1:2], pred_min_points[..., 1:2])
l_pred_min = np.minimum(pred_max_points[..., 1:2], pred_min_points[..., 1:2])
self.inter_h = np.minimum(h_pred_max, gt_max_points[..., 2:]) - np.maximum(h_pred_min, gt_min_points[..., 2:])
self.inter_w = np.minimum(w_pred_max, gt_max_points[..., 0:1]) - np.maximum(w_pred_min, gt_min_points[..., 0:1])
self.inter_l = np.minimum(l_pred_max, gt_max_points[..., 1:2]) - np.maximum(l_pred_min, gt_min_points[..., 1:2])
self.inter_h = np.clip(self.inter_h, a_min=0, a_max=None)
self.inter_w = np.clip(self.inter_w, a_min=0, a_max=None)
self.inter_l = np.clip(self.inter_l, a_min=0, a_max=None)
# self.inter_h = np.minimum(pred_max_points[..., 2:], gt_max_points[..., 2:]) - np.maximum(pred_min_points[..., 2:], gt_min_points[..., 2:])
# self.inter_w = np.minimum(pred_max_points[..., 0:1], gt_max_points[..., 0:1]) - np.maximum(pred_min_points[..., 0:1], gt_min_points[..., 0:1])
# self.inter_l = np.minimum(pred_max_points[..., 1:2], gt_max_points[..., 1:2]) - np.maximum(pred_min_points[..., 1:2], gt_min_points[..., 1:2])
# self.inter = np.clip(self.inter_h, a_min=0, a_max=None) * np.clip(self.inter_w, a_min=0, a_max=None) * np.clip(self.inter_l, a_min=0, a_max=None)
self.inter = self.inter_h * self.inter_w * self.inter_l
self.union = pred_area + gt_area - self.inter
iou = (self.inter + self.eps) / (self.union + self.eps) #* self.points_label #if label==0 do not count iou
# print("iou", np.unique(iou<=0, return_counts=True))
# print("iou less than 0", iou[iou<=0])
# print("self.inter <= 0", self.inter[iou<=0])
# print("self.union less than 0", self.union[iou<=0])
# print("pred_area less than 0", pred_area[iou<=0])
# print("gt_area less than 0", gt_area[iou<=0])
logprobs = -np.log(iou)
top[0].data[...] = np.sum(logprobs * self.reg_weights)
def backward(self, top, propagate_down, bottom):
dev_h = (self.pred_left * self.pred_fwd) + (self.pred_left * self.pred_bwd) + (self.pred_right * self.pred_fwd) + (self.pred_right * self.pred_bwd)
dev_w = (self.pred_left * self.pred_up) + (self.pred_left * self.pred_down) + (self.pred_right * self.pred_up) + (self.pred_right * self.pred_down)
dev_l = (self.pred_up * self.pred_fwd) + (self.pred_up * self.pred_bwd) + (self.pred_down * self.pred_fwd) + (self.pred_down * self.pred_bwd)
dev_iou_h = self.inter_w * self.inter_l
dev_iou_w = self.inter_h * self.inter_l
dev_iou_l = self.inter_w * self.inter_h
# dev_iou_up = np.where(self.pred_up < self.gt_up, dev_iou_h, 0)
# dev_iou_down = np.where(self.pred_down < self.gt_down, dev_iou_h, 0)
# dev_iou_fwd = np.where(self.pred_fwd < self.gt_fwd, dev_iou_w, 0)
# dev_iou_bwd = np.where(self.pred_bwd < self.gt_bwd, dev_iou_w, 0)
# dev_iou_right = np.where(self.pred_right < self.gt_right, dev_iou_l, 0)
# dev_iou_left = np.where(self.pred_left < self.gt_left, dev_iou_l, 0)
cond_h = (self.pred_up < self.gt_up) + (self.pred_down < self.gt_down) # or condition
cond_w = (self.pred_fwd < self.gt_fwd) + (self.pred_bwd < self.gt_bwd)
cond_l = (self.pred_right < self.gt_right) + (self.pred_left < self.gt_left)
dev_iou_h = np.where(cond_h, dev_iou_h, 0)
dev_iou_w = np.where(cond_w, dev_iou_w, 0)
dev_iou_l = np.where(cond_l, dev_iou_l, 0)
second_term = (self.union + self.inter+ self.eps) / (self.union * self.inter + self.eps)
first_term = 1/(self.union + self.eps)
# delta_up = first_term * dev_h - second_term * dev_iou_up
# delta_down = first_term * dev_h - second_term * dev_iou_down
# delta_fwd = first_term * dev_w - second_term * dev_iou_fwd
# delta_bwd = first_term * dev_w - second_term * dev_iou_bwd
# delta_right = first_term * dev_l - second_term * dev_iou_right
# delta_left = first_term * dev_l - second_term * dev_iou_left
delta_h = first_term * dev_h - second_term * dev_iou_h
delta_w = first_term * dev_w - second_term * dev_iou_w
delta_l = first_term * dev_l - second_term * dev_iou_l
# delta = delta_up + delta_down + delta_fwd + delta_bwd + delta_right + delta_left
delta = 2*delta_h + 2*delta_w + 2*delta_l
bottom[0].diff[...] = delta * self.reg_weights
# print("IoULoss backward", np.mean(delta * self.reg_weights))
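
# Hedged 1-D analogue of the axis-aligned IoU above (illustrative helper): the
# overlap of [0, 2] and [1, 3] is 1 and their union is 3, so IoU = 1/3.
def _demo_interval_iou():
    import numpy as np
    inter = max(0.0, min(2.0, 3.0) - max(0.0, 1.0))
    union = (2.0 - 0.0) + (3.0 - 1.0) - inter
    assert np.isclose(inter / union, 1.0 / 3.0)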
class IoULossV2(caffe.Layer):
def setup(self, bottom, top):
self.eps = 1e-5
self.sigma = 3
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
pred = bottom[0].data
gt_box = bottom[1].data
self.points_label = bottom[2].data
self.reg_weights = bottom[3].data
self.reg_weights = np.expand_dims(self.reg_weights,-1)
# points = bottom[4].data[...,:3]
pred = pred * self.points_label #if label==0 do not count iou
# pred = np.where(pred<0, 0, pred) #ReLU
self.pred_up = pred[..., 5:6]
self.pred_down = pred[..., 2:3]
self.pred_fwd = pred[..., 3:4]
self.pred_bwd = pred[..., 0:1]
self.pred_right = pred[..., 4:5]
self.pred_left = pred[..., 1:2]
self.pred_rot = pred[..., 6:]
self.gt_up = gt_box[..., 5:6]
self.gt_down = gt_box[..., 2:3]
self.gt_fwd = gt_box[..., 3:4]
self.gt_bwd = gt_box[..., 0:1]
self.gt_right = gt_box[..., 4:5]
self.gt_left = gt_box[..., 1:2]
        self.gt_rot = gt_box[..., 6:]  # fixed: was sliced from pred, making the rotation loss always zero
self.diff = self.pred_rot - self.gt_rot
self.abs_diff = np.abs(self.diff)
self.cond = self.abs_diff <= (1/(self.sigma**2))
rot_loss = np.where(self.cond, 0.5 * self.sigma**2 * self.abs_diff**2,
self.abs_diff - 0.5/self.sigma**2)
pred_area = (self.pred_up + self.pred_down) * (self.pred_fwd + self.pred_bwd) * (self.pred_right + self.pred_left)
gt_area = (self.gt_up + self.gt_down) * (self.gt_fwd + self.gt_bwd) * (self.gt_right + self.gt_left)
self.inter_h = np.minimum(self.pred_up, self.gt_up) + np.minimum(self.pred_down, self.gt_down)
self.inter_w = np.minimum(self.pred_fwd, self.gt_fwd) + np.minimum(self.pred_bwd, self.gt_bwd)
self.inter_l = np.minimum(self.pred_right, self.gt_right) + np.minimum(self.pred_left, self.gt_left)
self.inter = self.inter_h * self.inter_w * self.inter_l
self.union = pred_area + gt_area - self.inter
iou = (self.inter + self.eps) / (self.union + self.eps) #* self.points_label #if label==0 do not count iou
logprobs = -np.log(iou) + rot_loss
top[0].data[...] = np.sum(logprobs * self.reg_weights)
def backward(self, top, propagate_down, bottom):
dev_h = (self.pred_left * self.pred_fwd) + (self.pred_left * self.pred_bwd) + (self.pred_right * self.pred_fwd) + (self.pred_right * self.pred_bwd)
dev_w = (self.pred_left * self.pred_up) + (self.pred_left * self.pred_down) + (self.pred_right * self.pred_up) + (self.pred_right * self.pred_down)
dev_l = (self.pred_up * self.pred_fwd) + (self.pred_up * self.pred_bwd) + (self.pred_down * self.pred_fwd) + (self.pred_down * self.pred_bwd)
cond_h = (self.pred_up < self.gt_up) + (self.pred_down < self.gt_down) # or condition
cond_w = (self.pred_fwd < self.gt_fwd) + (self.pred_bwd < self.gt_bwd)
cond_l = (self.pred_right < self.gt_right) + (self.pred_left < self.gt_left)
dev_iou_h = np.where(cond_h, self.inter_w * self.inter_l, 0)
dev_iou_w = np.where(cond_w, self.inter_h * self.inter_l, 0)
dev_iou_l = np.where(cond_l, self.inter_w * self.inter_h, 0)
second_term = (self.union + self.inter) / (self.union * self.inter + self.eps)
first_term = 1/(self.union + self.eps)
delta_h = first_term * dev_h - second_term * dev_iou_h
delta_w = first_term * dev_w - second_term * dev_iou_w
delta_l = first_term * dev_l - second_term * dev_iou_l
        rot_delta = np.where(self.cond, (self.sigma**2) * self.diff, np.sign(self.diff))
        delta = np.concatenate((delta_w, delta_l, delta_h), axis=-1)
        delta = np.repeat(delta, 2, axis=-1)  # each of (w, l, h) drives two opposite box faces
        delta = np.concatenate((delta, rot_delta), axis=-1)  # fixed: 'rotate' was undefined
bottom[0].diff[...] = delta * self.reg_weights
class IoULossV3(caffe.Layer):
def setup(self, bottom, top):
self.eps = 1e-5
self.smooth = 1
def reshape(self, bottom, top):
top[0].reshape(1)
def forward(self, bottom, top):
pred = bottom[0].data
gt_box = bottom[1].data
self.points_label = bottom[2].data
self.reg_weights = bottom[3].data
self.reg_weights = np.expand_dims(self.reg_weights,-1)
points = bottom[4].data[...,:3]
pred = pred * self.points_label #if label==0 do not count iou
# print("label", np.unique(self.points_label, return_index=True))
# pred = np.where(pred<=0, 0, pred) #ReLU
# print("pred", np.unique(self.points_label>0, return_index=True))
self.pred_up = pred[..., 5:6]
self.pred_down = pred[..., 2:3]
self.pred_fwd = pred[..., 3:4]
self.pred_bwd = pred[..., 0:1]
self.pred_right = pred[..., 4:5]
self.pred_left = pred[..., 1:2]
self.gt_up = gt_box[..., 5:6]
self.gt_down = gt_box[..., 2:3]
self.gt_fwd = gt_box[..., 3:4]
self.gt_bwd = gt_box[..., 0:1]
self.gt_right = gt_box[..., 4:5]
self.gt_left = gt_box[..., 1:2]
pred_area = (self.pred_fwd + self.pred_bwd) * (self.pred_right + self.pred_left)
# print("pred_area", pred_area[pred_area>4])
gt_area = (self.gt_fwd + self.gt_bwd) * (self.gt_right + self.gt_left)
# print("gt_area", gt_area[gt_area>0.8])
# self.inter_h = np.minimum(self.pred_up, self.gt_up) + np.minimum(self.pred_down, self.gt_down)
self.inter_w = np.minimum(self.pred_fwd, self.gt_fwd) + np.minimum(self.pred_bwd, self.gt_bwd)
self.inter_l = np.minimum(self.pred_right, self.gt_right) + np.minimum(self.pred_left, self.gt_left)
self.inter = self.inter_w * self.inter_l
# print("self.inter > 0.4", self.inter[self.inter>0.4])
self.union = pred_area + gt_area - self.inter
iou = (self.inter + self.eps) / (self.union + self.eps) #* self.points_label #if label==0 do not count iou
logprobs = -np.log(iou)
top[0].data[...] = np.sum(logprobs * self.reg_weights)
def backward(self, top, propagate_down, bottom):
# dev_h = (self.pred_left * self.pred_fwd) + (self.pred_left * self.pred_bwd) + (self.pred_right * self.pred_fwd) + (self.pred_right * self.pred_bwd)
dev_w = self.pred_left + self.pred_right
dev_l = self.pred_fwd + self.pred_bwd
# dev_iou_h = self.inter_w * self.inter_l
# dev_iou_w = self.inter_l
# dev_iou_l = self.inter_w
# cond_h = (self.pred_up < self.gt_up) + (self.pred_down < self.gt_down) # or condition
cond_w = (self.pred_fwd < self.gt_fwd) + (self.pred_bwd < self.gt_bwd)
cond_l = (self.pred_right < self.gt_right) + (self.pred_left < self.gt_left)
# dev_iou_h = np.where(cond_h, dev_iou_h, 0)
dev_iou_w = np.where(cond_w, self.inter_l, 0)
dev_iou_l = np.where(cond_l, self.inter_w, 0)
second_term = (self.union + self.inter) / (self.union * self.inter + self.eps)
first_term = 1/(self.union + self.eps)
        delta = np.zeros(bottom[0].data.shape)  # fixed: was hard-coded (1, 9000, 1), too narrow for the channel slices below
# delta_h = first_term * dev_h - second_term * dev_iou_h
delta_w = first_term * dev_w - second_term * dev_iou_w # df, db
delta_l = first_term * dev_l - second_term * dev_iou_l # dr, dl
delta[..., 0:1] = delta_w #b
delta[..., 1:2] = delta_l #l
delta[..., 3:4] = delta_w #f
delta[..., 4:5] = delta_l #r
# delta = np.concatenate((),axis=-1)
# delta = delta_w + delta_l
bottom[0].diff[...] = delta * self.reg_weights
class CaLu(caffe.Layer):
def setup(self, bottom, top):
input_tensor = bottom[0].data
top[0].reshape(*input_tensor.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
self.input_tensor = bottom[0].data
# make positives
self.t_mask = self.input_tensor < 0
self.tensor = np.where(self.t_mask, 0, self.input_tensor)
#activate
self.tensor = 1 - 1/(1+self.tensor)
top[0].data[...] = self.tensor
def backward(self, top, propagate_down, bottom):
diff = np.where(self.t_mask, 0, 1/np.square((1+self.input_tensor)))
bottom[0].diff[...] = diff
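
# Hedged sketch of the CaLu activation (illustrative helper): x is clamped at 0
# and mapped to 1 - 1/(1 + x) = x/(1 + x), so outputs lie in [0, 1).
def _demo_calu():
    import numpy as np
    x = np.array([-2.0, 0.0, 1.0, 3.0])
    y = 1 - 1 / (1 + np.where(x < 0, 0, x))
    assert np.allclose(y, [0.0, 0.0, 0.5, 0.75])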
class CaLuV2(caffe.Layer):
def setup(self, bottom, top):
input_tensor = bottom[0].data
top[0].reshape(*input_tensor.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
self.input_tensor = bottom[0].data
#activate
self.tensor = 1 - 1/(1+self.input_tensor)
top[0].data[...] = self.tensor
def backward(self, top, propagate_down, bottom):
diff = 1/np.square((1+self.input_tensor))
bottom[0].diff[...] = diff
class BCLReshape(caffe.Layer):
def setup(self, bottom, top):
top_prev = bottom[0].data
top_prev, top_lattice = self.reshape_func(top_prev)
top[0].reshape(*top_prev.shape)
top[1].reshape(*top_lattice.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
top_prev = bottom[0].data
top_prev, top_lattice = self.reshape_func(top_prev)
top[0].reshape(*top_prev.shape) #top_prev
top[0].data[...] = top_prev
top[1].reshape(*top_lattice.shape) #top_lattice
top[1].data[...] = top_lattice
def backward(self, top, propagate_down, bottom):
pass
def reshape_func(self, top_prev):
top_prev = top_prev.transpose(0,2,1) #(1,N,C) -> (1,C,N)
        top_prev = np.expand_dims(top_prev, 2)  # (1,C,N) -> (1,C,1,N)
top_lattice = top_prev[:, :3, ...]
return top_prev, top_lattice
class BCLReshapeV2(caffe.Layer):
def setup(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape)
top[1].reshape(*top_lattice.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape) #top_prev
top[0].data[...] = top_prev
top[1].reshape(*top_lattice.shape) #top_lattice
top[1].data[...] = top_lattice
def backward(self, top, propagate_down, bottom):
pass
def reshape_func(self, top_prev, coords):
top_prev = top_prev.transpose(1,2,0) #(N,1,4) -> (1,4,N)
        top_prev = np.expand_dims(top_prev, 2)  # (1,4,N) -> (1,4,1,N)
        coords = coords[:, 1:][:, ::-1].transpose()  # drop the batch column, reverse (b,z,y,x) order, then (V, C) -> (C, V)
coords = np.expand_dims(coords,0) #(C,V)-> (1,C,V)
coords = np.expand_dims(coords,2) #(1,C,V)-> (1,C,1,V)
return top_prev, coords
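
# Hedged illustration of the coordinate shuffle in reshape_func above
# (illustrative helper): drop the batch column, reverse the remaining (z, y, x)
# order, then transpose (V, C) -> (C, V).
def _demo_coord_reversal():
    import numpy as np
    coords = np.array([[0, 1, 2, 3], [0, 4, 5, 6]])  # rows are (batch, z, y, x)
    out = coords[:, 1:][:, ::-1].transpose()
    assert out.tolist() == [[3, 6], [2, 5], [1, 4]]  # (C, V) with x first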
class BCLReshapeV4(caffe.Layer):
def setup(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape)
top[1].reshape(*top_lattice.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape) #top_prev
top[0].data[...] = top_prev
top[1].reshape(*top_lattice.shape) #top_lattice
top[1].data[...] = top_lattice
def backward(self, top, propagate_down, bottom):
pass
def reshape_func(self, top_prev, coords):
top_prev = top_prev.transpose(2,1,0) #(V,100,C) -> (C,100,V)
top_prev = np.expand_dims(top_prev,0) #(C,100,V)-> (1,C,100,V)
        coords = coords[:, 2:][:, ::-1].transpose()  # drop batch and z (pillars need no z), reverse to (y, x) order, then (V, C) -> (C, V)
coords = np.expand_dims(coords,0) #(C,V)-> (1,C,V)
coords = np.expand_dims(coords,2) #(1,C,V)-> (1,C,1,V)
coords = np.repeat(coords, top_prev.shape[-2], 2) #repeat 100
return top_prev, coords
class BCLReshapeV5(caffe.Layer):
def setup(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape)
top[1].reshape(*top_lattice.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
top_prev = bottom[0].data
coords = bottom[1].data
top_prev, top_lattice = self.reshape_func(top_prev, coords)
top[0].reshape(*top_prev.shape) #top_prev
top[0].data[...] = top_prev
top[1].reshape(*top_lattice.shape) #top_lattice
top[1].data[...] = top_lattice
def backward(self, top, propagate_down, bottom):
pass
def reshape_func(self, top_prev, coords):
top_prev = top_prev.transpose(2,1,0) #(V,N,C) -> (C,N,V)
top_prev = np.expand_dims(top_prev,0) #(C,N,V)-> (1,C,N,V)
        coords = coords[:, 2:][:, ::-1].transpose()  # drop batch and z (pillars need no z), reverse to (y, x) order, then (V, C) -> (C, V)
coords = np.expand_dims(coords,0) #(C,V)-> (1,C,V)
coords = np.expand_dims(coords,2) #(1,C,V)-> (1,C,1,V)
return top_prev, coords
class GlobalPooling(caffe.Layer):
def setup(self, bottom, top):
pass
def reshape(self, bottom, top):
n, c, p, h, w = bottom[0].data.shape
top[0].reshape(*(n, c, h, w))
def forward(self, bottom, top):
n, c, p, h, w = bottom[0].data.shape
self.max_loc = bottom[0].data.argmax(axis=2)
top[0].data[...] = bottom[0].data.max(axis=2)
def backward(self, top, propagate_down, bottom):
n, c, h, w = top[0].diff.shape
nn, cc, hh, ww = np.ix_(np.arange(n), np.arange(c), np.arange(h),np.arange(w))
bottom[0].diff[...] = 0
bottom[0].diff[nn, cc, self.max_loc, hh, ww] = top[0].diff
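
# Hedged sketch of GlobalPooling's backward (illustrative helper): gradients are
# routed to the argmax position of the pooled axis via np.ix_ advanced indexing.
def _demo_argmax_scatter():
    import numpy as np
    x = np.array([[[1.0, 3.0], [2.0, 0.5]]])  # (n=1, c=2, p=2), pooled over p
    loc = x.argmax(axis=-1)
    grad_out = np.ones((1, 2))
    grad_in = np.zeros_like(x)
    nn, cc = np.ix_(np.arange(1), np.arange(2))
    grad_in[nn, cc, loc] = grad_out
    assert grad_in.tolist() == [[[0.0, 1.0], [1.0, 0.0]]]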
class LogLayer(caffe.Layer):
def setup(self, bottom, top):
in1 = bottom[0].data
print("debug print", in1)
print("debug print", in1.shape)
top[0].reshape(*in1.shape)
def reshape(self, bottom, top):
pass
def forward(self, bottom, top):
in1 = bottom[0].data
print("forward debug print", in1)
print("forward debug print", in1.shape)
top[0].reshape(*in1.shape)
top[0].data[...] = in1
pass
def backward(self, top, propagate_down, bottom):
pass
class ProbRenorm(caffe.Layer):
def setup(self, bottom, top):
pass
def reshape(self, bottom, top):
top[0].reshape(*bottom[0].data.shape)
def forward(self, bottom, top):
clipped = bottom[0].data * bottom[1].data
self.sc = 1.0 / (np.sum(clipped, axis=1, keepdims=True) + 1e-10)
top[0].data[...] = clipped * self.sc
def backward(self, top, propagate_down, bottom):
bottom[0].diff[...] = top[0].diff * bottom[1].data * self.sc
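
# Hedged numeric check of ProbRenorm (illustrative helper): masked probabilities
# are rescaled so each sample's surviving classes sum to 1 again.
def _demo_prob_renorm():
    import numpy as np
    p = np.array([[0.2, 0.3, 0.5]])
    mask = np.array([[1.0, 0.0, 1.0]])
    clipped = p * mask
    sc = 1.0 / (np.sum(clipped, axis=1, keepdims=True) + 1e-10)
    assert np.allclose(clipped * sc, [[2.0 / 7.0, 0.0, 5.0 / 7.0]])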
class PickAndScale(caffe.Layer):
def setup(self, bottom, top):
self.nch_out = len(self.param_str.split('_'))
self.dims = []
for f in self.param_str.split('_'):
if f.find('*') >= 0:
self.dims.append((int(f[:f.find('*')]), float(f[f.find('*') + 1:])))
elif f.find('/') >= 0:
self.dims.append((int(f[:f.find('/')]), 1.0 / float(f[f.find('/') + 1:])))
else:
self.dims.append((int(f), 1.0))
def reshape(self, bottom, top):
top[0].reshape(bottom[0].data.shape[0], self.nch_out, bottom[0].data.shape[2], bottom[0].data.shape[3])
def forward(self, bottom, top):
for i, (j, s) in enumerate(self.dims):
top[0].data[:, i, :, :] = bottom[0].data[:, j, :, :] * s
def backward(self, top, propagate_down, bottom):
pass # TODO NOT_YET_IMPLEMENTED
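
# Hedged example of the param_str grammar PickAndScale parses (illustrative
# helper, using split instead of find for the same single-operator tokens):
# underscore-separated channel picks, each optionally scaled ('*') or divided ('/').
def _demo_pick_and_scale_spec():
    dims = []
    for f in "0_1*2.5_3/10".split('_'):
        if '*' in f:
            j, s = f.split('*')
            dims.append((int(j), float(s)))
        elif '/' in f:
            j, s = f.split('/')
            dims.append((int(j), 1.0 / float(s)))
        else:
            dims.append((int(f), 1.0))
    assert dims == [(0, 1.0), (1, 2.5), (3, 0.1)]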
| 42.855194 | 163 | 0.604322 | 13,499 | 97,367 | 4.13801 | 0.039262 | 0.019764 | 0.029324 | 0.013534 | 0.865017 | 0.845074 | 0.832006 | 0.814103 | 0.793104 | 0.780716 | 0 | 0.021322 | 0.258691 | 97,367 | 2,271 | 164 | 42.874064 | 0.752573 | 0.131369 | 0 | 0.773184 | 0 | 0 | 0.025703 | 0.000251 | 0 | 0 | 0 | 0.000881 | 0 | 1 | 0.112817 | false | 0.034849 | 0.012995 | 0 | 0.165978 | 0.007088 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1cfa44dfa730f39e8555ecf64e10730bc502e06b | 121 | py | Python | pyawair/version.py | andriykorchak/pyawair | 5f0bbcfe79712fca467b116ef1dce77317a692b9 | [
"Apache-2.0"
] | 16 | 2018-07-16T00:15:59.000Z | 2020-09-06T02:24:40.000Z | pyawair/version.py | andriykorchak/pyawair | 5f0bbcfe79712fca467b116ef1dce77317a692b9 | [
"Apache-2.0"
] | 32 | 2018-07-28T17:07:56.000Z | 2021-03-22T16:38:02.000Z | pyawair/version.py | andriykorchak/pyawair | 5f0bbcfe79712fca467b116ef1dce77317a692b9 | [
"Apache-2.0"
] | 3 | 2018-07-29T15:58:05.000Z | 2021-03-18T19:07:54.000Z | #!/usr/bin/env python3
# coding=utf-8
# author: @netmanchris
# -*- coding: utf-8 -*-
def version():
return '0.0.12' | 15.125 | 23 | 0.603306 | 18 | 121 | 4.055556 | 0.777778 | 0.246575 | 0.273973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07 | 0.173554 | 121 | 8 | 24 | 15.125 | 0.66 | 0.636364 | 0 | 0 | 0 | 0 | 0.146341 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
1c03e7c18aa52fa6cfc51f3d8c5a795ab6844dac | 1,507 | py | Python | tests/core/test_persistable_store.py | StackVista/sts-agent | f8358ea46820ffb9eb0b4b30c7d7457cc2cc987a | [
"BSD-3-Clause"
] | 4 | 2017-03-18T12:16:40.000Z | 2020-11-12T06:59:29.000Z | tests/core/test_persistable_store.py | StackVista/sts-agent | f8358ea46820ffb9eb0b4b30c7d7457cc2cc987a | [
"BSD-3-Clause"
] | 18 | 2016-09-22T08:01:02.000Z | 2020-07-15T08:30:17.000Z | tests/core/test_persistable_store.py | StackVista/sts-agent | f8358ea46820ffb9eb0b4b30c7d7457cc2cc987a | [
"BSD-3-Clause"
] | 8 | 2016-11-23T06:55:51.000Z | 2021-07-05T05:12:34.000Z | from utils.persistable_store import PersistableStore
from unittest import TestCase
import uuid
class TestPersistableStore(TestCase):
def test_create_store(self):
check_name = str(uuid.uuid4())
test_object = {'test': 42.0}
store = PersistableStore(check_name, "instanceid")
store.load_status()
self.assertEqual(store['test_field'], None)
store['test_field'] = test_object
store.commit_status()
store.load_status()
self.assertEqual(store['test_field'], test_object)
def test_load_existing_store(self):
check_name = str(uuid.uuid4())
test_object = {'test': 42.0}
store = PersistableStore(check_name, "instanceid")
store.load_status()
self.assertEqual(store['test_field'], None)
store['test_field'] = test_object
store.commit_status()
store = PersistableStore(check_name, "instanceid")
store.load_status()
self.assertEqual(store['test_field'], test_object)
def test_clear_store(self):
check_name = str(uuid.uuid4())
test_object = {'test': 42.0}
store = PersistableStore(check_name, "instanceid")
store.load_status()
self.assertEqual(store['test_field'], None)
store['test_field'] = test_object
store.commit_status()
store.clear_status()
store = PersistableStore(check_name, "instanceid")
store.load_status()
self.assertEqual(store['test_field'], None)
| 33.488889 | 58 | 0.652289 | 169 | 1,507 | 5.56213 | 0.183432 | 0.08617 | 0.134043 | 0.121277 | 0.839362 | 0.839362 | 0.839362 | 0.839362 | 0.839362 | 0.839362 | 0 | 0.010363 | 0.231586 | 1,507 | 44 | 59 | 34.25 | 0.801382 | 0 | 0 | 0.783784 | 0 | 0 | 0.100863 | 0 | 0 | 0 | 0 | 0 | 0.162162 | 1 | 0.081081 | false | 0 | 0.081081 | 0 | 0.189189 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1c07749c76995a8a5797aa28a838f9c66455f42d | 41,566 | py | Python | pyboto3/detective.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 91 | 2016-12-31T11:38:37.000Z | 2021-09-16T19:33:23.000Z | pyboto3/detective.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 7 | 2017-01-02T18:54:23.000Z | 2020-08-11T13:54:02.000Z | pyboto3/detective.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 26 | 2016-12-31T13:11:00.000Z | 2022-03-03T21:01:12.000Z | '''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def accept_invitation(GraphArn=None):
"""
Accepts an invitation for the member account to contribute data to a behavior graph. This operation can only be called by an invited member account.
The request provides the ARN of behavior graph.
The member account status in the graph must be INVITED .
See also: AWS API Documentation
Exceptions
:example: response = client.accept_invitation(
GraphArn='string'
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph that the member account is accepting the invitation for.\nThe member account status in the behavior graph must be INVITED .\n
"""
pass
def can_paginate(operation_name=None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
"""
pass
def create_graph():
"""
Creates a new behavior graph for the calling account, and sets that account as the master account. This operation is called by the account that is enabling Detective.
Before you try to enable Detective, make sure that your account has been enrolled in Amazon GuardDuty for at least 48 hours. If you do not meet this requirement, you cannot enable Detective. If you do meet the GuardDuty prerequisite, then when you make the request to enable Detective, it checks whether your data volume is within the Detective quota. If it exceeds the quota, then you cannot enable Detective.
The operation also enables Detective for the calling account in the currently selected Region. It returns the ARN of the new behavior graph.
An account can only be the master account for one behavior graph within a Region. If the same account calls CreateGraph with the same master account, it always returns the same behavior graph ARN. It does not create a new behavior graph.
See also: AWS API Documentation
Exceptions
:example: response = client.create_graph()
:rtype: dict
ReturnsResponse Syntax{
'GraphArn': 'string'
}
Response Structure
(dict) --
GraphArn (string) --The ARN of the new behavior graph.
Exceptions
Detective.Client.exceptions.ConflictException
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ServiceQuotaExceededException
:return: {
'GraphArn': 'string'
}
"""
pass
def create_members(GraphArn=None, Message=None, Accounts=None):
"""
Sends a request to invite the specified AWS accounts to be member accounts in the behavior graph. This operation can only be called by the master account for a behavior graph.
The request provides the behavior graph ARN and the list of accounts to invite.
The response separates the requested accounts into two lists:
See also: AWS API Documentation
Exceptions
:example: response = client.create_members(
GraphArn='string',
Message='string',
Accounts=[
{
'AccountId': 'string',
'EmailAddress': 'string'
},
]
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph to invite the member accounts to contribute their data to.\n
:type Message: string
:param Message: Customized message text to include in the invitation email message to the invited member accounts.
:type Accounts: list
:param Accounts: [REQUIRED]\nThe list of AWS accounts to invite to become member accounts in the behavior graph. For each invited account, the account list contains the account identifier and the AWS account root user email address.\n\n(dict) --An AWS account that is the master of or a member of a behavior graph.\n\nAccountId (string) -- [REQUIRED]The account identifier of the AWS account.\n\nEmailAddress (string) -- [REQUIRED]The AWS account root user email address for the AWS account.\n\n\n\n\n
:rtype: dict
ReturnsResponse Syntax
{
'Members': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
Response Structure
(dict) --
Members (list) --
The set of member account invitation requests that Detective was able to process. This includes accounts that are being verified, that failed verification, and that passed verification and are being sent an invitation.
(dict) --
Details about a member account that was invited to contribute to a behavior graph.
AccountId (string) --
The AWS account identifier for the member account.
EmailAddress (string) --
The AWS account root user email address for the member account.
GraphArn (string) --
The ARN of the behavior graph that the member account was invited to.
MasterId (string) --
The AWS account identifier of the master account for the behavior graph.
Status (string) --
The current membership status of the member account. The status can have one of the following values:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
Member accounts that declined an invitation or that were removed from the behavior graph are not included.
DisabledReason (string) --
For member accounts with a status of ACCEPTED_BUT_DISABLED , the reason that the member account is not enabled.
The reason can have one of the following values:
VOLUME_TOO_HIGH - Indicates that adding the member account would cause the data volume for the behavior graph to be too high.
VOLUME_UNKNOWN - Indicates that Detective is unable to verify the data volume for the member account. This is usually because the member account is not enrolled in Amazon GuardDuty.
InvitedTime (datetime) --
The date and time that Detective sent the invitation to the member account. The value is in milliseconds since the epoch.
UpdatedTime (datetime) --
The date and time that the member account was last updated. The value is in milliseconds since the epoch.
PercentOfGraphUtilization (float) --
The member account data volume as a percentage of the maximum allowed data volume. 0 indicates 0 percent, and 100 indicates 100 percent.
Note that this is not the percentage of the behavior graph data volume.
For example, the data volume for the behavior graph is 80 GB per day. The maximum data volume is 160 GB per day. If the data volume for the member account is 40 GB per day, then PercentOfGraphUtilization is 25. It represents 25% of the maximum allowed data volume.
PercentOfGraphUtilizationUpdatedTime (datetime) --
The date and time when the graph utilization percentage was last updated.
UnprocessedAccounts (list) --
The list of accounts for which Detective was unable to process the invitation request. For each account, the list provides the reason why the request could not be processed. The list includes accounts that are already member accounts in the behavior graph.
(dict) --
A member account that was included in a request but for which the request could not be processed.
AccountId (string) --
The AWS account identifier of the member account that was not processed.
Reason (string) --
The reason that the member account request could not be processed.
Exceptions
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ResourceNotFoundException
Detective.Client.exceptions.ValidationException
Detective.Client.exceptions.ServiceQuotaExceededException
:return: {
'Members': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
:returns:
GraphArn (string) -- [REQUIRED]
The ARN of the behavior graph to invite the member accounts to contribute their data to.
Message (string) -- Customized message text to include in the invitation email message to the invited member accounts.
Accounts (list) -- [REQUIRED]
The list of AWS accounts to invite to become member accounts in the behavior graph. For each invited account, the account list contains the account identifier and the AWS account root user email address.
(dict) --An AWS account that is the master of or a member of a behavior graph.
AccountId (string) -- [REQUIRED]The account identifier of the AWS account.
EmailAddress (string) -- [REQUIRED]The AWS account root user email address for the AWS account.
"""
pass
def delete_graph(GraphArn=None):
"""
Disables the specified behavior graph and queues it to be deleted. This operation removes the graph from each member account\'s list of behavior graphs.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_graph(
GraphArn='string'
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph to disable.\n
"""
pass
def delete_members(GraphArn=None, AccountIds=None):
"""
Deletes one or more member accounts from the master account behavior graph. This operation can only be called by a Detective master account. That account cannot use DeleteMembers to delete their own account from the behavior graph. To disable a behavior graph, the master account uses the DeleteGraph API method.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_members(
GraphArn='string',
AccountIds=[
'string',
]
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph to delete members from.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nThe list of AWS account identifiers for the member accounts to delete from the behavior graph.\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'AccountIds': [
'string',
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
Response Structure
(dict) --
AccountIds (list) --
The list of AWS account identifiers for the member accounts that Detective successfully deleted from the behavior graph.
(string) --
UnprocessedAccounts (list) --
The list of member accounts that Detective was not able to delete from the behavior graph. For each member account, provides the reason that the deletion could not be processed.
(dict) --
A member account that was included in a request but for which the request could not be processed.
AccountId (string) --
The AWS account identifier of the member account that was not processed.
Reason (string) --
The reason that the member account request could not be processed.
Exceptions
Detective.Client.exceptions.ConflictException
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ResourceNotFoundException
Detective.Client.exceptions.ValidationException
:return: {
'AccountIds': [
'string',
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
:returns:
(string) --
"""
pass
def disassociate_membership(GraphArn=None):
"""
Removes the member account from the specified behavior graph. This operation can only be called by a member account that has the ENABLED status.
See also: AWS API Documentation
Exceptions
:example: response = client.disassociate_membership(
GraphArn='string'
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph to remove the member account from.\nThe member account\'s member status in the behavior graph must be ENABLED .\n
"""
pass
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to\nClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid\nfor. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By\ndefault, the http method is whatever is used in the method\'s model.
"""
pass
def get_members(GraphArn=None, AccountIds=None):
"""
Returns the membership details for specified member accounts for a behavior graph.
See also: AWS API Documentation
Exceptions
:example: response = client.get_members(
GraphArn='string',
AccountIds=[
'string',
]
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph for which to request the member details.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nThe list of AWS account identifiers for the member account for which to return member details.\nYou cannot use GetMembers to retrieve information about member accounts that were removed from the behavior graph.\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'MemberDetails': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
Response Structure
(dict) --
MemberDetails (list) --
The member account details that Detective is returning in response to the request.
(dict) --
Details about a member account that was invited to contribute to a behavior graph.
AccountId (string) --
The AWS account identifier for the member account.
EmailAddress (string) --
The AWS account root user email address for the member account.
GraphArn (string) --
The ARN of the behavior graph that the member account was invited to.
MasterId (string) --
The AWS account identifier of the master account for the behavior graph.
Status (string) --
The current membership status of the member account. The status can have one of the following values:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
Member accounts that declined an invitation or that were removed from the behavior graph are not included.
DisabledReason (string) --
For member accounts with a status of ACCEPTED_BUT_DISABLED , the reason that the member account is not enabled.
The reason can have one of the following values:
VOLUME_TOO_HIGH - Indicates that adding the member account would cause the data volume for the behavior graph to be too high.
VOLUME_UNKNOWN - Indicates that Detective is unable to verify the data volume for the member account. This is usually because the member account is not enrolled in Amazon GuardDuty.
InvitedTime (datetime) --
The date and time that Detective sent the invitation to the member account. The value is in milliseconds since the epoch.
UpdatedTime (datetime) --
The date and time that the member account was last updated. The value is in milliseconds since the epoch.
PercentOfGraphUtilization (float) --
The member account data volume as a percentage of the maximum allowed data volume. 0 indicates 0 percent, and 100 indicates 100 percent.
Note that this is not the percentage of the behavior graph data volume.
For example, the data volume for the behavior graph is 80 GB per day. The maximum data volume is 160 GB per day. If the data volume for the member account is 40 GB per day, then PercentOfGraphUtilization is 25. It represents 25% of the maximum allowed data volume.
PercentOfGraphUtilizationUpdatedTime (datetime) --
The date and time when the graph utilization percentage was last updated.
UnprocessedAccounts (list) --
The requested member accounts for which Detective was unable to return member details.
For each account, provides the reason why the request could not be processed.
(dict) --
A member account that was included in a request but for which the request could not be processed.
AccountId (string) --
The AWS account identifier of the member account that was not processed.
Reason (string) --
The reason that the member account request could not be processed.
Exceptions
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ResourceNotFoundException
Detective.Client.exceptions.ValidationException
:return: {
'MemberDetails': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Reason': 'string'
},
]
}
:returns:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
"""
pass
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
ReturnsA paginator object.
"""
pass
def get_waiter(waiter_name=None):
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters\nsection of the service docs for a list of available waiters.
:rtype: botocore.waiter.Waiter
"""
pass
def list_graphs(NextToken=None, MaxResults=None):
"""
Returns the list of behavior graphs that the calling account is a master of. This operation can only be called by a master account.
Because an account can currently only be the master of one behavior graph within a Region, the results always contain a single graph.
See also: AWS API Documentation
Exceptions
:example: response = client.list_graphs(
NextToken='string',
MaxResults=123
)
:type NextToken: string
:param NextToken: For requests to get the next page of results, the pagination token that was returned with the previous set of results. The initial request does not include a pagination token.
:type MaxResults: integer
:param MaxResults: The maximum number of graphs to return at a time. The total must be less than the overall limit on the number of results to return, which is currently 200.
:rtype: dict
ReturnsResponse Syntax
{
'GraphList': [
{
'Arn': 'string',
'CreatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
GraphList (list) --
A list of behavior graphs that the account is a master for.
(dict) --
A behavior graph in Detective.
Arn (string) --
The ARN of the behavior graph.
CreatedTime (datetime) --
The date and time that the behavior graph was created. The value is in milliseconds since the epoch.
NextToken (string) --
If there are more behavior graphs remaining in the results, then this is the pagination token to use to request the next page of behavior graphs.
Exceptions
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ValidationException
:return: {
'GraphList': [
{
'Arn': 'string',
'CreatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ValidationException
"""
pass
def list_invitations(NextToken=None, MaxResults=None):
"""
Retrieves the list of open and accepted behavior graph invitations for the member account. This operation can only be called by a member account.
Open invitations are invitations that the member account has not responded to.
The results do not include behavior graphs for which the member account declined the invitation. The results also do not include behavior graphs that the member account resigned from or was removed from.
See also: AWS API Documentation
Exceptions
:example: response = client.list_invitations(
NextToken='string',
MaxResults=123
)
:type NextToken: string
:param NextToken: For requests to retrieve the next page of results, the pagination token that was returned with the previous page of results. The initial request does not include a pagination token.
:type MaxResults: integer
:param MaxResults: The maximum number of behavior graph invitations to return in the response. The total must be less than the overall limit on the number of results to return, which is currently 200.
:rtype: dict
ReturnsResponse Syntax
{
'Invitations': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
Invitations (list) --
The list of behavior graphs for which the member account has open or accepted invitations.
(dict) --
Details about a member account that was invited to contribute to a behavior graph.
AccountId (string) --
The AWS account identifier for the member account.
EmailAddress (string) --
The AWS account root user email address for the member account.
GraphArn (string) --
The ARN of the behavior graph that the member account was invited to.
MasterId (string) --
The AWS account identifier of the master account for the behavior graph.
Status (string) --
The current membership status of the member account. The status can have one of the following values:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
Member accounts that declined an invitation or that were removed from the behavior graph are not included.
DisabledReason (string) --
For member accounts with a status of ACCEPTED_BUT_DISABLED , the reason that the member account is not enabled.
The reason can have one of the following values:
VOLUME_TOO_HIGH - Indicates that adding the member account would cause the data volume for the behavior graph to be too high.
VOLUME_UNKNOWN - Indicates that Detective is unable to verify the data volume for the member account. This is usually because the member account is not enrolled in Amazon GuardDuty.
InvitedTime (datetime) --
The date and time that Detective sent the invitation to the member account. The value is in milliseconds since the epoch.
UpdatedTime (datetime) --
The date and time that the member account was last updated. The value is in milliseconds since the epoch.
PercentOfGraphUtilization (float) --
The member account data volume as a percentage of the maximum allowed data volume. 0 indicates 0 percent, and 100 indicates 100 percent.
Note that this is not the percentage of the behavior graph data volume.
For example, the data volume for the behavior graph is 80 GB per day. The maximum data volume is 160 GB per day. If the data volume for the member account is 40 GB per day, then PercentOfGraphUtilization is 25. It represents 25% of the maximum allowed data volume.
PercentOfGraphUtilizationUpdatedTime (datetime) --
The date and time when the graph utilization percentage was last updated.
NextToken (string) --
If there are more behavior graphs remaining in the results, then this is the pagination token to use to request the next page of behavior graphs.
Exceptions
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ValidationException
:return: {
'Invitations': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
"""
pass
def list_members(GraphArn=None, NextToken=None, MaxResults=None):
"""
Retrieves the list of member accounts for a behavior graph. Does not return member accounts that were removed from the behavior graph.
See also: AWS API Documentation
Exceptions
:example: response = client.list_members(
GraphArn='string',
NextToken='string',
MaxResults=123
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph for which to retrieve the list of member accounts.\n
:type NextToken: string
:param NextToken: For requests to retrieve the next page of member account results, the pagination token that was returned with the previous page of results. The initial request does not include a pagination token.
:type MaxResults: integer
:param MaxResults: The maximum number of member accounts to include in the response. The total must be less than the overall limit on the number of results to return, which is currently 200.
:rtype: dict
ReturnsResponse Syntax
{
'MemberDetails': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
MemberDetails (list) --
The list of member accounts in the behavior graph.
The results include member accounts that did not pass verification and member accounts that have not yet accepted the invitation to the behavior graph. The results do not include member accounts that were removed from the behavior graph.
(dict) --
Details about a member account that was invited to contribute to a behavior graph.
AccountId (string) --
The AWS account identifier for the member account.
EmailAddress (string) --
The AWS account root user email address for the member account.
GraphArn (string) --
The ARN of the behavior graph that the member account was invited to.
MasterId (string) --
The AWS account identifier of the master account for the behavior graph.
Status (string) --
The current membership status of the member account. The status can have one of the following values:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
Member accounts that declined an invitation or that were removed from the behavior graph are not included.
DisabledReason (string) --
For member accounts with a status of ACCEPTED_BUT_DISABLED , the reason that the member account is not enabled.
The reason can have one of the following values:
VOLUME_TOO_HIGH - Indicates that adding the member account would cause the data volume for the behavior graph to be too high.
VOLUME_UNKNOWN - Indicates that Detective is unable to verify the data volume for the member account. This is usually because the member account is not enrolled in Amazon GuardDuty.
InvitedTime (datetime) --
The date and time that Detective sent the invitation to the member account. The value is in milliseconds since the epoch.
UpdatedTime (datetime) --
The date and time that the member account was last updated. The value is in milliseconds since the epoch.
PercentOfGraphUtilization (float) --
The member account data volume as a percentage of the maximum allowed data volume. 0 indicates 0 percent, and 100 indicates 100 percent.
Note that this is not the percentage of the behavior graph data volume.
For example, the data volume for the behavior graph is 80 GB per day. The maximum data volume is 160 GB per day. If the data volume for the member account is 40 GB per day, then PercentOfGraphUtilization is 25. It represents 25% of the maximum allowed data volume.
PercentOfGraphUtilizationUpdatedTime (datetime) --
The date and time when the graph utilization percentage was last updated.
NextToken (string) --
If there are more member accounts remaining in the results, then this is the pagination token to use to request the next page of member accounts.
Exceptions
Detective.Client.exceptions.InternalServerException
Detective.Client.exceptions.ResourceNotFoundException
Detective.Client.exceptions.ValidationException
:return: {
'MemberDetails': [
{
'AccountId': 'string',
'EmailAddress': 'string',
'GraphArn': 'string',
'MasterId': 'string',
'Status': 'INVITED'|'VERIFICATION_IN_PROGRESS'|'VERIFICATION_FAILED'|'ENABLED'|'ACCEPTED_BUT_DISABLED',
'DisabledReason': 'VOLUME_TOO_HIGH'|'VOLUME_UNKNOWN',
'InvitedTime': datetime(2015, 1, 1),
'UpdatedTime': datetime(2015, 1, 1),
'PercentOfGraphUtilization': 123.0,
'PercentOfGraphUtilizationUpdatedTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
INVITED - Indicates that the member was sent an invitation but has not yet responded.
VERIFICATION_IN_PROGRESS - Indicates that Detective is verifying that the account identifier and email address provided for the member account match. If they do match, then Detective sends the invitation. If the email address and account identifier don\'t match, then the member cannot be added to the behavior graph.
VERIFICATION_FAILED - Indicates that the account and email address provided for the member account do not match, and Detective did not send an invitation to the account.
ENABLED - Indicates that the member account accepted the invitation to contribute to the behavior graph.
ACCEPTED_BUT_DISABLED - Indicates that the member account accepted the invitation but is prevented from contributing data to the behavior graph. DisabledReason provides the reason why the member account is not enabled.
"""
pass
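# Illustrative only, not part of the generated stub: like the other list_*
# operations, list_members pages through results with NextToken, so callers
# typically loop until no token is returned. `client` is an assumed boto3
# Detective client and `graph_arn` an existing behavior graph ARN.
#
#     members = []
#     next_token = None
#     while True:
#         kwargs = {'GraphArn': graph_arn, 'MaxResults': 100}
#         if next_token:
#             kwargs['NextToken'] = next_token
#         response = client.list_members(**kwargs)
#         members.extend(response['MemberDetails'])
#         next_token = response.get('NextToken')
#         if not next_token:
#             break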
def reject_invitation(GraphArn=None):
"""
Rejects an invitation to contribute the account data to a behavior graph. This operation must be called by a member account that has the INVITED status.
See also: AWS API Documentation
Exceptions
:example: response = client.reject_invitation(
GraphArn='string'
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph to reject the invitation to.\nThe member account\'s current member status in the behavior graph must be INVITED .\n
"""
pass
def start_monitoring_member(GraphArn=None, AccountId=None):
"""
Sends a request to enable data ingest for a member account that has a status of ACCEPTED_BUT_DISABLED .
For valid member accounts, the status is updated as follows.
See also: AWS API Documentation
Exceptions
:example: response = client.start_monitoring_member(
GraphArn='string',
AccountId='string'
)
:type GraphArn: string
:param GraphArn: [REQUIRED]\nThe ARN of the behavior graph.\n
:type AccountId: string
:param AccountId: [REQUIRED]\nThe account ID of the member account to try to enable.\nThe account must be an invited member account with a status of ACCEPTED_BUT_DISABLED .\n
:returns:
GraphArn (string) -- [REQUIRED]
The ARN of the behavior graph.
AccountId (string) -- [REQUIRED]
The account ID of the member account to try to enable.
The account must be an invited member account with a status of ACCEPTED_BUT_DISABLED .
"""
pass
| 38.666047 | 505 | 0.719963 | 5,472 | 41,566 | 5.436952 | 0.079496 | 0.054183 | 0.055393 | 0.021075 | 0.811737 | 0.794091 | 0.780848 | 0.765352 | 0.753286 | 0.734732 | 0 | 0.009006 | 0.219939 | 41,566 | 1,074 | 506 | 38.702048 | 0.908555 | 0.971106 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
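Every function in the pyboto3/detective.py stub above is documentation only (each body is pass); real calls go through a boto3 client. The following sketch strings together the invitation flow the docstrings describe. The region, account ID, and email address are placeholders, and it assumes separate credentials for the master and the invited member account:

import boto3

# Master account side: create a behavior graph and invite a member.
master = boto3.client('detective', region_name='us-east-1')
graph_arn = master.create_graph()['GraphArn']

invited = master.create_members(
    GraphArn=graph_arn,
    Message='Please contribute your data to our behavior graph.',
    Accounts=[{'AccountId': '111122223333',
               'EmailAddress': 'root-user@example.com'}],
)
print(invited['Members'], invited['UnprocessedAccounts'])

# Member account side (different credentials): accept the pending invitation.
member = boto3.client('detective', region_name='us-east-1')
member.accept_invitation(GraphArn=graph_arn)

# Back on the master: confirm the member reached the ENABLED status.
details = master.get_members(GraphArn=graph_arn, AccountIds=['111122223333'])
for m in details['MemberDetails']:
    print(m['AccountId'], m['Status'])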
1c1d02efaa8fe90366f4ac61c61a67b1c6f02959 | 8,129 | py | Python | tests/integrations/test_feature_extraction.py | scottcha/tsfresh | b3395c12d7e25494bdc297a31f6d1136e76c477e | [
"MIT"
] | 1 | 2021-03-16T15:08:04.000Z | 2021-03-16T15:08:04.000Z | tests/integrations/test_feature_extraction.py | scottcha/tsfresh | b3395c12d7e25494bdc297a31f6d1136e76c477e | [
"MIT"
] | null | null | null | tests/integrations/test_feature_extraction.py | scottcha/tsfresh | b3395c12d7e25494bdc297a31f6d1136e76c477e | [
"MIT"
] | null | null | null | # This file as well as the whole tsfresh package are licenced under the MIT licence (see the LICENCE.txt)
# Maximilian Christ (maximilianchrist.com), Blue Yonder Gmbh, 2016
from unittest import TestCase
import dask.dataframe as dd
import pandas as pd
from tsfresh.examples.driftbif_simulation import load_driftbif
from tsfresh import extract_relevant_features, extract_features
from tsfresh.feature_extraction import MinimalFCParameters
class FeatureExtractionTestCase(TestCase):
    def setUp(self):
        df, y = load_driftbif(100, 10, classification=True, seed=42)
        df['my_id'] = df['id'].astype('str')
        del df["id"]
        self.df = df

    def test_pandas(self):
        df = self.df

        # Test shape and a single entry (to see if it works at all)
        X = extract_features(df, column_id="my_id", column_sort="time", column_kind="dimension", column_value="value",
                             default_fc_parameters=MinimalFCParameters())
        self.assertIn("1__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "1__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 18))

        X = extract_features(df, column_id="my_id", column_sort="time", column_kind="dimension",
                             default_fc_parameters=MinimalFCParameters())
        self.assertIn("1__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "1__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 18))

        X = extract_features(df.drop(columns=["dimension"]), column_id="my_id", column_sort="time",
                             default_fc_parameters=MinimalFCParameters())
        self.assertIn("value__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "value__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 9))

        X = extract_features(df.drop(columns=["dimension", "time"]), column_id="my_id",
                             default_fc_parameters=MinimalFCParameters())
        self.assertIn("value__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "value__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 9))

    def test_pandas_no_pivot(self):
        df = self.df

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension", column_value="value",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters())
        X = pd.DataFrame(X, columns=["my_id", "variable", "value"])
        self.assertIn("1__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "1__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*18, 3))

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters())
        X = pd.DataFrame(X, columns=["my_id", "variable", "value"])
        self.assertIn("1__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "1__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*18, 3))

        X = extract_features(df.drop(columns=["dimension"]), column_id="my_id",
                             column_sort="time",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters())
        X = pd.DataFrame(X, columns=["my_id", "variable", "value"])
        self.assertIn("value__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "value__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*9, 3))

        X = extract_features(df.drop(columns=["dimension", "time"]), column_id="my_id",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters())
        X = pd.DataFrame(X, columns=["my_id", "variable", "value"])
        self.assertIn("value__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "value__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*9, 3))

    def test_dask(self):
        df = dd.from_pandas(self.df, npartitions=1)

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension", column_value="value",
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("1__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "1__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 18))

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension",
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("1__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "1__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 18))

        X = extract_features(df.drop(columns=["dimension"]), column_id="my_id",
                             column_sort="time",
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("value__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "value__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 9))

        X = extract_features(df.drop(columns=["dimension", "time"]), column_id="my_id",
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("value__mean", X.columns)
        self.assertAlmostEqual(X.loc["5", "value__mean"], 5.516e-05, 4)
        self.assertIn("11", X.index)
        self.assertEqual(X.shape, (100, 9))

    def test_dask_no_pivot(self):
        df = dd.from_pandas(self.df, npartitions=1)

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension", column_value="value",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("1__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "1__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*18, 3))

        X = extract_features(df, column_id="my_id", column_sort="time",
                             column_kind="dimension",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("1__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "1__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*18, 3))

        X = extract_features(df.drop(columns=["dimension"]), column_id="my_id",
                             column_sort="time",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("value__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "value__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*9, 3))

        X = extract_features(df.drop(columns=["dimension", "time"]), column_id="my_id",
                             pivot=False,
                             default_fc_parameters=MinimalFCParameters()).compute()
        self.assertIn("value__mean", X["variable"].values)
        self.assertAlmostEqual(X[(X["my_id"] == "5") & (X["variable"] == "value__mean")]["value"].iloc[0], 5.516e-05, 4)
        self.assertEqual(X.shape, (100*9, 3))
| 50.80625 | 120 | 0.57953 | 978 | 8,129 | 4.614519 | 0.110429 | 0.025704 | 0.056725 | 0.063816 | 0.8655 | 0.8655 | 0.8655 | 0.8655 | 0.8655 | 0.8655 | 0 | 0.043515 | 0.262148 | 8,129 | 159 | 121 | 51.125786 | 0.708903 | 0.027802 | 0 | 0.852713 | 0 | 0 | 0.11647 | 0 | 0 | 0 | 0 | 0 | 0.434109 | 1 | 0.03876 | false | 0 | 0.046512 | 0 | 0.093023 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
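For readers skimming the test row above, the call pattern under test is tsfresh's extract_features on a long-format frame. A minimal self-contained example, with arbitrary column names and MinimalFCParameters to keep the feature set small:

import pandas as pd
from tsfresh import extract_features
from tsfresh.feature_extraction import MinimalFCParameters

# Two time series (ids "a" and "b") in long format: one row per observation.
df = pd.DataFrame({
    "my_id": ["a"] * 5 + ["b"] * 5,
    "time": list(range(5)) * 2,
    "value": [0.1, 0.2, 0.4, 0.8, 1.6, 1.0, 0.9, 0.7, 0.4, 0.0],
})

# Result: one row per id, one column per extracted feature (e.g. value__mean).
X = extract_features(df, column_id="my_id", column_sort="time",
                     default_fc_parameters=MinimalFCParameters())
print(X[["value__mean", "value__maximum"]])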
1c80752dd38e637f45c51f83e217a2b388bfe6ce | 132 | py | Python | pybamm/models/submodels/thermal/__init__.py | jedgedrudd/PyBaMM | 79c9d34978382d50e09adaf8bf74c8fa4723f759 | [
"BSD-3-Clause"
] | 1 | 2019-10-29T19:06:04.000Z | 2019-10-29T19:06:04.000Z | pybamm/models/submodels/thermal/__init__.py | jedgedrudd/PyBaMM | 79c9d34978382d50e09adaf8bf74c8fa4723f759 | [
"BSD-3-Clause"
] | null | null | null | pybamm/models/submodels/thermal/__init__.py | jedgedrudd/PyBaMM | 79c9d34978382d50e09adaf8bf74c8fa4723f759 | [
"BSD-3-Clause"
] | null | null | null | from .base_thermal import BaseThermal
from . import isothermal
from . import x_full
from . import x_lumped
from . import xyz_lumped
| 22 | 37 | 0.810606 | 20 | 132 | 5.15 | 0.5 | 0.38835 | 0.213592 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151515 | 132 | 5 | 38 | 26.4 | 0.919643 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1c9d3094b46575872373af5e6d0c09d77f017f27 | 5,303 | py | Python | src/train.py | theofpa/continual-object-instances | 630ab4b115e5bf6004a26855a7af24e37372e5bb | [
"Apache-2.0"
] | 13 | 2020-06-05T13:49:53.000Z | 2022-03-14T10:39:39.000Z | src/train.py | theofpa/continual-object-instances | 630ab4b115e5bf6004a26855a7af24e37372e5bb | [
"Apache-2.0"
] | 1 | 2020-09-03T06:37:30.000Z | 2020-11-19T22:43:31.000Z | src/train.py | theofpa/continual-object-instances | 630ab4b115e5bf6004a26855a7af24e37372e5bb | [
"Apache-2.0"
] | 4 | 2020-06-09T13:23:13.000Z | 2020-10-20T10:37:55.000Z | import torch
from tqdm import tqdm
from utils import device, args
from utils import save_model, send_to_device, print_train_progress
from metrics import evaluation
def train(model, criterion, train_loader, query_loader, gallery_loader, optimizer, experiment_name):
    for epoch in range(args.n_epochs):
        train_loss, metric = train_epoch(
            model, criterion, optimizer, train_loader)
        print_train_progress(epoch, train_loss, metric)
        if epoch % args.print_every == 0:
            evaluation(model, query_loader, gallery_loader)
    save_model(model, experiment_name)


def continuous_train(old_model, model, criterion, train_loader, query_loader, gallery_loader, optimizer, experiment_name):
    for epoch in range(args.n_epochs):
        if args.continuous_learning_method == "naive":
            train_loss, metric = train_epoch(
                model, criterion, optimizer, train_loader)
        elif args.continuous_learning_method == "finetune":
            train_loss, metric = train_epoch(
                model, criterion, optimizer, train_loader)
        elif args.continuous_learning_method == "lfl":
            train_loss, metric = train_lfl_epoch(
                old_model, model, criterion, optimizer, train_loader)
        elif args.continuous_learning_method == "lwf":
            train_loss, metric = train_lfl_epoch(
                old_model, model, criterion, optimizer, train_loader)
        elif args.continuous_learning_method == "ewc":
            train_loss, metric = train_ewc_epoch(
                old_model, model, criterion, optimizer, train_loader)
        else:
            raise ValueError(
                "Provided Continual Learning method does not exist")
        print_train_progress(epoch, train_loss, metric)
    save_model(model, experiment_name)


def train_epoch(model, criterion, optimizer, dataloader):
    model.train()
    total_loss = 0
    total_metrics = 0
    for idx, data_items in enumerate(tqdm(dataloader)):
        optimizer.zero_grad()
        data_items = send_to_device(data_items, device)
        b, c, h, w = data_items["neg"].size()
        data_items["neg"] = data_items["neg"].view(
            b*args.neg_samples, int(c/args.neg_samples), h, w)
        anchor, pos, neg = model(
            data_items["anchor"], data_items["pos"], data_items["neg"])
        loss, metric = criterion(
            anchor=anchor, pos=pos, neg=neg, targets=data_items["anchor_target"])
        total_loss += loss.item()
        total_metrics += metric
        loss.backward()
        torch.nn.utils.clip_grad_norm_(model.parameters(), 10)
        optimizer.step()
    total_loss /= len(dataloader)
    if args.task_method == "regression":
        metric = total_metrics/len(dataloader)
    else:
        metric = total_metrics/len(dataloader.dataset)
    return total_loss, metric


def train_lfl_epoch(old_model, model, criterion, optimizer, dataloader):
    old_model.eval()
    model.train()
    total_loss = 0
    total_metrics = 0
    for idx, data_items in enumerate(tqdm(dataloader)):
        optimizer.zero_grad()
        data_items = send_to_device(data_items, device)
        b, c, h, w = data_items["neg"].size()
        data_items["neg"] = data_items["neg"].view(
            b*args.neg_samples, int(c/args.neg_samples), h, w)
        anchor, pos, neg = model(
            data_items["anchor"], data_items["pos"], data_items["neg"])
        with torch.no_grad():
            old_anchor = old_model.get_embedding(data_items["anchor"])
        loss, metric = criterion(old_anchor=old_anchor, anchor=anchor,
                                 pos=pos, neg=neg, targets=data_items["anchor_target"])
        total_loss += loss.item()
        loss.backward()
        total_metrics += metric
        torch.nn.utils.clip_grad_norm_(model.parameters(), 10)
        optimizer.step()
    total_loss /= len(dataloader)
    if args.task_method == "regression":
        metric = total_metrics/len(dataloader)
    else:
        metric = total_metrics/len(dataloader.dataset)
    return total_loss, metric


def train_ewc_epoch(old_model, model, criterion, optimizer, dataloader):
    old_model.eval()
    model.train()
    total_loss = 0
    total_metrics = 0
    criterion.update_models(old_model, model)
    criterion.update_fisher(dataloader)
    data = []
    for idx, data_items in enumerate(tqdm(dataloader)):
        optimizer.zero_grad()
        data_items = send_to_device(data_items, device)
        b, c, h, w = data_items["neg"].size()
        data_items["neg"] = data_items["neg"].view(
            b*args.neg_samples, int(c/args.neg_samples), h, w)
        anchor, pos, neg = model(
            data_items["anchor"], data_items["pos"], data_items["neg"])
        loss, metric = criterion(
            anchor=anchor, pos=pos, neg=neg, targets=data_items["anchor_target"])
        total_loss += loss.item()
        loss.backward()
        total_metrics += metric
        torch.nn.utils.clip_grad_norm_(model.parameters(), 10)
        optimizer.step()
        data.append(data_items)
    total_loss /= len(dataloader)
    if args.task_method == "regression":
        metric = total_metrics/len(dataloader)
    else:
        metric = total_metrics/len(dataloader.dataset)
    return total_loss, metric
| 36.572414 | 122 | 0.648878 | 654 | 5,303 | 5.004587 | 0.146789 | 0.087993 | 0.043996 | 0.047052 | 0.822793 | 0.81271 | 0.793767 | 0.770547 | 0.75191 | 0.75191 | 0 | 0.003244 | 0.244201 | 5,303 | 144 | 123 | 36.826389 | 0.813373 | 0 | 0 | 0.762712 | 0 | 0 | 0.039412 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042373 | false | 0 | 0.042373 | 0 | 0.110169 | 0.033898 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
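The train_lfl_epoch and train_ewc_epoch loops above delegate all continual-learning logic to the criterion object, which lives elsewhere in the repository. As a rough sketch of what the EWC variant adds on top of the base metric-learning loss, here is an illustrative penalty term; the Fisher estimate, the lambda weight, and the class shape are assumptions, and the repo's update_models/update_fisher API may differ:

import torch


class EWCPenalty:
    """Illustrative EWC-style regularizer: a quadratic penalty keeping the new
    model's parameters close to the old task's optimum, weighted by a
    per-parameter (diagonal) Fisher information estimate."""

    def __init__(self, old_model, model, fisher, ewc_lambda=100.0):
        # Snapshot of the parameters learned on the previous task.
        self.old_params = {name: p.detach().clone()
                           for name, p in old_model.named_parameters()}
        self.model = model
        self.fisher = fisher          # dict: parameter name -> Fisher diagonal tensor
        self.ewc_lambda = ewc_lambda  # assumed weighting, tuned per experiment

    def penalty(self):
        total = 0.0
        for name, param in self.model.named_parameters():
            if name in self.fisher:
                diff = param - self.old_params[name]
                total = total + (self.fisher[name] * diff ** 2).sum()
        return (self.ewc_lambda / 2.0) * total

In a loop like train_ewc_epoch, the repository's criterion presumably folds a term like this into the loss it returns after update_models/update_fisher; the sketch shows only the penalty itself, not the base loss or the Fisher computation.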
98c8c6f2b3a24defb62092257eda29aa5dd5be6d | 13,089 | py | Python | bireme/main/migrations/0001_initial.py | rfdeoliveira/fi-admin | c2df084c7e79d587e2273dc222f106fa243b7f6e | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | bireme/main/migrations/0001_initial.py | rfdeoliveira/fi-admin | c2df084c7e79d587e2273dc222f106fa243b7f6e | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | bireme/main/migrations/0001_initial.py | rfdeoliveira/fi-admin | c2df084c7e79d587e2273dc222f106fa243b7f6e | [
"MIT",
"Python-2.0",
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):

    dependencies = [
        ('utils', '0001_initial'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('contenttypes', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='Descriptor',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('object_id', models.PositiveIntegerField()),
                ('text', models.CharField(max_length=255, verbose_name='Text', blank=True)),
                ('code', models.CharField(max_length=50, verbose_name='Code', blank=True)),
                ('status', models.SmallIntegerField(default=0, verbose_name='Status', choices=[(0, 'Pending'), (1, 'Admitted'), (2, 'Refused')])),
                ('content_type', models.ForeignKey(related_name='descriptors', to='contenttypes.ContentType')),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
                ('updated_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
            ],
            options={
                'abstract': False,
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='Keyword',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('object_id', models.PositiveIntegerField()),
                ('text', models.CharField(max_length=255, verbose_name='Text', blank=True)),
                ('status', models.SmallIntegerField(default=0, verbose_name='Status', choices=[(0, 'Pending'), (1, 'Admitted'), (2, 'Refused')])),
                ('user_recomendation', models.BooleanField(verbose_name='User recomendation?')),
                ('content_type', models.ForeignKey(related_name='keywords', to='contenttypes.ContentType')),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
                ('updated_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
            ],
            options={
                'abstract': False,
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='Resource',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('status', models.SmallIntegerField(default=0, null=True, verbose_name='Status', choices=[(0, 'Pending'), (1, 'Admitted'), (2, 'Refused'), (3, 'Deleted')])),
                ('title', models.CharField(max_length=510, verbose_name='Title')),
                ('link', models.TextField(verbose_name='Link')),
                ('originator', models.TextField(verbose_name='Originator')),
                ('author', models.TextField(help_text='Enter one per line', verbose_name='Authors', blank=True)),
                ('abstract', models.TextField(verbose_name='Abstract')),
                ('time_period_textual', models.CharField(max_length=255, verbose_name='Temporal range', blank=True)),
                ('objective', models.TextField(verbose_name='Objective', blank=True)),
                ('cooperative_center_code', models.CharField(max_length=55, verbose_name='Cooperative center', blank=True)),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
                ('originator_location', models.ManyToManyField(to='utils.Country', verbose_name='Originator location')),
            ],
            options={
                'verbose_name': 'Resource',
                'verbose_name_plural': 'Resources',
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='ResourceThematic',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('object_id', models.PositiveIntegerField()),
                ('status', models.SmallIntegerField(default=0, blank=True, verbose_name='Status', choices=[(0, 'Pending'), (1, 'Admitted'), (2, 'Refused')])),
                ('content_type', models.ForeignKey(related_name='thematics', to='contenttypes.ContentType')),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
            ],
            options={
                'verbose_name': 'Thematic area',
                'verbose_name_plural': 'Thematic areas',
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='SourceLanguage',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('acronym', models.CharField(max_length=25, verbose_name='Acronym', blank=True)),
                ('language', models.CharField(max_length=10, verbose_name='Language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
                ('name', models.CharField(max_length=255, verbose_name='Name')),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
                ('updated_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
            ],
            options={
                'verbose_name': 'Source language',
                'verbose_name_plural': 'Source languages',
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='SourceLanguageLocal',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('language', models.CharField(max_length=10, verbose_name='Language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
                ('name', models.CharField(max_length=255, verbose_name='Name')),
                ('source_language', models.ForeignKey(verbose_name='Source language', to='main.SourceLanguage')),
            ],
            options={
                'verbose_name': 'Translation',
                'verbose_name_plural': 'Translations',
            },
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='SourceType',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
                ('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
                ('acronym', models.CharField(max_length=25, verbose_name='Acronym', blank=True)),
                ('language', models.CharField(max_length=10, verbose_name='Language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
                ('name', models.CharField(max_length=255, verbose_name='Name')),
                ('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('updated_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'verbose_name': 'source type',
'verbose_name_plural': 'source types',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='SourceTypeLocal',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('language', models.CharField(max_length=10, verbose_name='language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
('name', models.CharField(max_length=255, verbose_name='name')),
('source_type', models.ForeignKey(verbose_name='Source type', to='main.SourceType')),
],
options={
'verbose_name': 'Translation',
'verbose_name_plural': 'Translations',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='ThematicArea',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created_time', models.DateTimeField(auto_now_add=True, verbose_name='created at')),
('updated_time', models.DateTimeField(auto_now=True, verbose_name='updated', null=True)),
('acronym', models.CharField(max_length=25, verbose_name='Acronym', blank=True)),
('language', models.CharField(max_length=10, verbose_name='Language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
('name', models.CharField(max_length=255, verbose_name='Name')),
('created_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('updated_by', models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'verbose_name': 'Thematic area',
'verbose_name_plural': 'Thematic areas',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='ThematicAreaLocal',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('language', models.CharField(max_length=10, verbose_name='Language', choices=[(b'en', 'English'), (b'pt-br', 'Portuguese'), (b'es', 'Spanish')])),
('name', models.CharField(max_length=255, verbose_name='Name')),
('thematic_area', models.ForeignKey(verbose_name='Thematic area', to='main.ThematicArea')),
],
options={
'verbose_name': 'Translation',
'verbose_name_plural': 'Translations',
},
bases=(models.Model,),
),
migrations.AddField(
model_name='resourcethematic',
name='thematic_area',
field=models.ForeignKey(related_name='+', to='main.ThematicArea'),
preserve_default=True,
),
migrations.AddField(
model_name='resourcethematic',
name='updated_by',
field=models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True),
preserve_default=True,
),
migrations.AddField(
model_name='resource',
name='source_language',
field=models.ManyToManyField(to='main.SourceLanguage', verbose_name='Source language'),
preserve_default=True,
),
migrations.AddField(
model_name='resource',
name='source_type',
field=models.ManyToManyField(to='main.SourceType', verbose_name='Source type'),
preserve_default=True,
),
migrations.AddField(
model_name='resource',
name='updated_by',
field=models.ForeignKey(related_name='+', blank=True, editable=False, to=settings.AUTH_USER_MODEL, null=True),
preserve_default=True,
),
]
| 58.695067 | 173 | 0.59019 | 1,300 | 13,089 | 5.742308 | 0.110769 | 0.113463 | 0.050636 | 0.067515 | 0.817683 | 0.783523 | 0.7499 | 0.738647 | 0.738647 | 0.731413 | 0 | 0.008002 | 0.255329 | 13,089 | 222 | 174 | 58.959459 | 0.757874 | 0.001604 | 0 | 0.703704 | 0 | 0 | 0.182841 | 0.007271 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.013889 | 0 | 0.027778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
98d92a8c8d27ecf1840ff5a07c222bffb4d3c925 | 1,006 | py | Python | NvTK/Model/ResNet.py | JiaqiLiZju/NvTK | 6b887670a03d63c1747d9854ecbbac13cc06461c | [
"BSD-3-Clause"
] | null | null | null | NvTK/Model/ResNet.py | JiaqiLiZju/NvTK | 6b887670a03d63c1747d9854ecbbac13cc06461c | [
"BSD-3-Clause"
] | null | null | null | NvTK/Model/ResNet.py | JiaqiLiZju/NvTK | 6b887670a03d63c1747d9854ecbbac13cc06461c | [
"BSD-3-Clause"
] | null | null | null | from ..Modules import ResidualNet
def cbam_resnet18(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 18, n_features, 'CBAM')
return model
def cbam_resnet34(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 34, n_features, 'CBAM')
return model
def cbam_resnet50(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 50, n_features, 'CBAM')
return model
def cbam_resnet101(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 101, n_features, 'CBAM')
return model
def resnet18(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 18, n_features, None)
return model
def resnet34(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 34, n_features, None)
return model
def resnet50(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 50, n_features, None)
return model
def resnet101(n_features=1000, **kwargs):
model = ResidualNet('ImageNet', 101, n_features, None)
return model
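# Minimal usage sketch (illustrative; assumes NvTK.Modules.ResidualNet
# accepts the arguments exactly as the factories above pass them): the
# factory name fixes the network depth and whether CBAM attention is used.
def _example_backbones(n_features=1000):
    # ResNet-50 with CBAM attention vs. a plain ResNet-18 without it
    return cbam_resnet50(n_features=n_features), resnet18(n_features=n_features)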
| 29.588235 | 60 | 0.703777 | 128 | 1,006 | 5.375 | 0.171875 | 0.209302 | 0.151163 | 0.22093 | 0.949128 | 0.927326 | 0.822674 | 0.726744 | 0.726744 | 0.726744 | 0 | 0.081146 | 0.166998 | 1,006 | 34 | 61 | 29.588235 | 0.739857 | 0 | 0 | 0.32 | 0 | 0 | 0.079444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.32 | false | 0 | 0.04 | 0 | 0.68 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 10 |
c748baf5d68e392c92365ee5a97755f615372283 | 6,373 | py | Python | ceph_deploy/tests/test_keys_equivalent.py | weisongf/ceph-deploy | bcb968a13e0f2643507b06aa8f6249e360e8e742 | [
"MIT"
] | 353 | 2015-01-08T06:25:40.000Z | 2022-03-25T01:13:45.000Z | ceph_deploy/tests/test_keys_equivalent.py | weisongf/ceph-deploy | bcb968a13e0f2643507b06aa8f6249e360e8e742 | [
"MIT"
] | 218 | 2015-01-02T17:45:33.000Z | 2022-02-06T21:40:52.000Z | ceph_deploy/tests/test_keys_equivalent.py | weisongf/ceph-deploy | bcb968a13e0f2643507b06aa8f6249e360e8e742 | [
"MIT"
] | 282 | 2015-01-02T23:02:24.000Z | 2021-12-27T02:31:49.000Z | from ceph_deploy import gatherkeys
from ceph_deploy import new
import os
import tempfile
import shutil
import pytest
def write_key_mon_with_caps(path, secret):
mon_keyring = '[mon.]\nkey = %s\ncaps mon = allow *\n' % secret
    # open()'s third positional argument is the buffer size, not file
    # permissions, so `open(path, 'w', 0o600)` never set the mode; create
    # the file with 0o600 explicitly via os.open instead.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, 'w') as f:
        f.write(mon_keyring)
def write_key_mon_with_caps_with_tab(path, secret):
mon_keyring = '[mon.]\n\tkey = %s\n\tcaps mon = allow *\n' % secret
    # set 0o600 via os.open; open()'s third positional arg is buffering
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, 'w') as f:
        f.write(mon_keyring)
def write_key_mon_with_caps_with_tab_quote(path, secret):
mon_keyring = '[mon.]\n\tkey = %s\n\tcaps mon = "allow *"\n' % secret
    # set 0o600 via os.open; open()'s third positional arg is buffering
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, 'w') as f:
        f.write(mon_keyring)
def write_key_mon_without_caps(path, secret):
mon_keyring = '[mon.]\nkey = %s\n' % secret
    # set 0o600 via os.open; open()'s third positional arg is buffering
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, 'w') as f:
        f.write(mon_keyring)
class TestKeysEquivalent(object):
"""
Since we are testing things that effect the content of the current working
directory we should test in a clean empty directory.
"""
def setup(self):
"""
Make temp directory for tests.
"""
self.test_dir = tempfile.mkdtemp()
def teardown(self):
"""
Remove temp directory and content
"""
shutil.rmtree(self.test_dir)
def test_identical_with_caps(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_with_caps(key_path_02, secret_01)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is True
def test_different_with_caps(self):
secret_01 = new.generate_auth_key()
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_with_caps(key_path_02, secret_02)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is False
def test_identical_without_caps(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_without_caps(key_path_01, secret_01)
write_key_mon_without_caps(key_path_02, secret_01)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is True
def test_different_without_caps(self):
secret_01 = new.generate_auth_key()
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_without_caps(key_path_01, secret_01)
write_key_mon_without_caps(key_path_02, secret_02)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is False
def test_identical_mixed_caps(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_without_caps(key_path_02, secret_01)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is True
def test_different_mixed_caps(self):
secret_01 = new.generate_auth_key()
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_without_caps(key_path_02, secret_02)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is False
def test_identical_caps_mixed_tabs(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_with_caps_with_tab(key_path_02, secret_01)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is True
def test_different_caps_mixed_tabs(self):
secret_01 = new.generate_auth_key()
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps(key_path_01, secret_01)
write_key_mon_with_caps_with_tab(key_path_02, secret_02)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is False
def test_identical_caps_mixed_quote(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps_with_tab(key_path_01, secret_01)
write_key_mon_with_caps_with_tab_quote(key_path_02, secret_01)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is True
def test_different_caps_mixed_quote(self):
secret_01 = new.generate_auth_key()
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps_with_tab(key_path_01, secret_01)
write_key_mon_with_caps_with_tab_quote(key_path_02, secret_02)
same = gatherkeys._keyring_equivalent(key_path_01, key_path_02)
assert same is False
def test_missing_key_1(self):
secret_02 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps_with_tab_quote(key_path_02, secret_02)
with pytest.raises(IOError):
gatherkeys._keyring_equivalent(key_path_01, key_path_02)
def test_missing_key_2(self):
secret_01 = new.generate_auth_key()
key_path_01 = self.test_dir + "/01.keyring"
key_path_02 = self.test_dir + "/02.keyring"
write_key_mon_with_caps_with_tab_quote(key_path_01, secret_01)
with pytest.raises(IOError):
gatherkeys._keyring_equivalent(key_path_01, key_path_02)
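# How to run this suite (illustrative invocation, not part of the original
# file):
#
#     pytest ceph_deploy/tests/test_keys_equivalent.py -v
#
# Each test writes two keyrings into the per-test temp directory and checks
# that gatherkeys._keyring_equivalent ignores formatting-only differences
# (tabs, quoting, presence of caps lines) but never differing secrets.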
| 37.052326 | 78 | 0.691511 | 965 | 6,373 | 4.108808 | 0.089119 | 0.123581 | 0.079445 | 0.071879 | 0.896847 | 0.89256 | 0.88802 | 0.88802 | 0.870618 | 0.862547 | 0 | 0.057212 | 0.221089 | 6,373 | 171 | 79 | 37.269006 | 0.741539 | 0.030127 | 0 | 0.712 | 0 | 0 | 0.067048 | 0 | 0 | 0 | 0 | 0 | 0.08 | 1 | 0.144 | false | 0 | 0.04 | 0 | 0.192 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c74ccf841302b92447d79c7e62a05aff7280aca7 | 47,942 | py | Python | models.py | laphisboy/mvsnerf | ea1aecd7d653b04a7f4bec27ad978f64a038bc92 | [
"MIT"
] | null | null | null | models.py | laphisboy/mvsnerf | ea1aecd7d653b04a7f4bec27ad978f64a038bc92 | [
"MIT"
] | null | null | null | models.py | laphisboy/mvsnerf | ea1aecd7d653b04a7f4bec27ad978f64a038bc92 | [
"MIT"
] | null | null | null | import torch
torch.autograd.set_detect_anomaly(True)
import torch.nn as nn
from utils import *
from utils import homo_warp, homo_warp_debug
from inplace_abn import InPlaceABN
from renderer import run_network_mvs
import sys
import pdb
class ForkedPdb(pdb.Pdb):
"""A Pdb subclass that may be used
from a forked multiprocessing child
"""
def interaction(self, *args, **kwargs):
_stdin = sys.stdin
try:
sys.stdin = open('/dev/stdin')
pdb.Pdb.interaction(self, *args, **kwargs)
finally:
sys.stdin = _stdin
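# Usage sketch (illustrative; `_debug_in_worker` is a hypothetical helper,
# not part of the original module): ForkedPdb reattaches stdin to
# /dev/stdin, so a breakpoint keeps working inside forked children such as
# DataLoader workers, where a plain pdb.set_trace() cannot read input.
def _debug_in_worker(batch):
    ForkedPdb().set_trace()  # interactive prompt works in the forked child
    return batch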
def weights_init(m):
if isinstance(m, nn.Linear):
nn.init.kaiming_normal_(m.weight.data)
if m.bias is not None:
nn.init.zeros_(m.bias.data)
class Embedder:
def __init__(self, **kwargs):
self.kwargs = kwargs
self.create_embedding_fn()
def create_embedding_fn(self):
embed_fns = []
d = self.kwargs['input_dims']
out_dim = 0
if self.kwargs['include_input']:
embed_fns.append(lambda x : x)
out_dim += d
max_freq = self.kwargs['max_freq_log2']
N_freqs = self.kwargs['num_freqs']
if self.kwargs['log_sampling']:
freq_bands = 2.**torch.linspace(0., max_freq, steps=N_freqs)
else:
freq_bands = torch.linspace(2.**0., 2.**max_freq, steps=N_freqs)
self.freq_bands = freq_bands.reshape(1,-1,1).cuda()
for freq in freq_bands:
for p_fn in self.kwargs['periodic_fns']:
embed_fns.append(lambda x, p_fn=p_fn, freq=freq : p_fn(x * freq))
out_dim += d
self.embed_fns = embed_fns
self.out_dim = out_dim
def embed(self, inputs):
repeat = inputs.dim()-1
inputs_scaled = (inputs.unsqueeze(-2) * self.freq_bands.view(*[1]*repeat,-1,1)).reshape(*inputs.shape[:-1],-1)
inputs_scaled = torch.cat((inputs, torch.sin(inputs_scaled), torch.cos(inputs_scaled)),dim=-1)
return inputs_scaled
def get_embedder(multires, i=0, input_dims=3):
if i == -1:
return nn.Identity(), 3
embed_kwargs = {
'include_input' : True,
'input_dims' : input_dims,
'max_freq_log2' : multires-1,
'num_freqs' : multires,
'log_sampling' : True,
'periodic_fns' : [torch.sin, torch.cos],
}
embedder_obj = Embedder(**embed_kwargs)
embed = lambda x, eo=embedder_obj : eo.embed(x)
return embed, embedder_obj.out_dim
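# Dimension sanity-check for the positional encoding (illustrative sketch,
# not part of the original module): with multires=10 and 3-D points the
# embedded size is 3 + 3*2*10 = 63, matching embedder_obj.out_dim. Embedder
# moves freq_bands to CUDA, so inputs are assumed to live on the GPU.
def _embedder_dim_example():
    embed, out_dim = get_embedder(10, input_dims=3)  # out_dim == 63
    pts = torch.rand(4, 3, device='cuda')
    return embed(pts).shape == (4, out_dim)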
class ScaledDotProductAttention(nn.Module):
''' Scaled Dot-Product Attention '''
def __init__(self, temperature, attn_dropout=0.1):
super().__init__()
self.temperature = temperature
# self.dropout = nn.Dropout(attn_dropout)
def forward(self, q, k, v, mask=None):
attn = torch.matmul(q / self.temperature, k.transpose(2, 3))
if mask is not None:
attn = attn.masked_fill(mask == 0, -1e9)
# attn = attn * mask
attn = F.softmax(attn, dim=-1)
# attn = self.dropout(F.softmax(attn, dim=-1))
output = torch.matmul(attn, v)
return output, attn
class MultiHeadAttention(nn.Module):
''' Multi-Head Attention module '''
def __init__(self, n_head, d_model, d_k, d_v, dropout=0.1):
super().__init__()
self.n_head = n_head
self.d_k = d_k
self.d_v = d_v
self.w_qs = nn.Linear(d_model, n_head * d_k, bias=False)
self.w_ks = nn.Linear(d_model, n_head * d_k, bias=False)
self.w_vs = nn.Linear(d_model, n_head * d_v, bias=False)
self.fc = nn.Linear(n_head * d_v, d_model, bias=False)
self.attention = ScaledDotProductAttention(temperature=d_k ** 0.5)
# self.dropout = nn.Dropout(dropout)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
def forward(self, q, k, v, mask=None):
d_k, d_v, n_head = self.d_k, self.d_v, self.n_head
sz_b, len_q, len_k, len_v = q.size(0), q.size(1), k.size(1), v.size(1)
residual = q
# Pass through the pre-attention projection: b x lq x (n*dv)
# Separate different heads: b x lq x n x dv
q = self.w_qs(q).view(sz_b, len_q, n_head, d_k)
k = self.w_ks(k).view(sz_b, len_k, n_head, d_k)
v = self.w_vs(v).view(sz_b, len_v, n_head, d_v)
# Transpose for attention dot product: b x n x lq x dv
q, k, v = q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
if mask is not None:
mask = mask.unsqueeze(1) # For head axis broadcasting.
q, attn = self.attention(q, k, v, mask=mask)
# Transpose to move the head dimension back: b x lq x n x dv
# Combine the last two dimensions to concatenate all the heads together: b x lq x (n*dv)
q = q.transpose(1, 2).contiguous().view(sz_b, len_q, -1)
q = self.fc(q)
q += residual
q = self.layer_norm(q)
return q, attn
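# Shape sketch (illustrative, not part of the original module): the block
# preserves d_model, so (B, L, d_model) in gives (B, L, d_model) out plus a
# (B, n_head, L, L) attention map; Renderer_color_fusion relies on this
# when it self-attends across the three source views of each sample.
def _attention_shape_example():
    mha = MultiHeadAttention(n_head=4, d_model=16, d_k=4, d_v=4)
    x = torch.rand(2, 3, 16)      # (batch, tokens, d_model)
    out, attn = mha(x, x, x)      # self-attention, no mask
    return out.shape == (2, 3, 16) and attn.shape == (2, 4, 3, 3)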
class Renderer_ours(nn.Module):
def __init__(self, D=8, W=256, input_ch=3, input_ch_views=3, output_ch=4, input_ch_feat=8, skips=[4], use_viewdirs=False):
"""
"""
super(Renderer_ours, self).__init__()
self.D = D
self.W = W
self.input_ch = input_ch
self.input_ch_views = input_ch_views
self.skips = skips
self.use_viewdirs = use_viewdirs
self.in_ch_pts, self.in_ch_views, self.in_ch_feat = input_ch, input_ch_views, input_ch_feat
self.pts_linears = nn.ModuleList(
[nn.Linear(self.in_ch_pts, W, bias=True)] + [nn.Linear(W, W, bias=True) if i not in self.skips else nn.Linear(W + self.in_ch_pts, W) for i in range(D-1)])
self.pts_bias = nn.Linear(input_ch_feat, W)
self.views_linears = nn.ModuleList([nn.Linear(input_ch_views + W, W//2)])
if use_viewdirs:
self.feature_linear = nn.Linear(W, W)
self.alpha_linear = nn.Linear(W, 1)
self.rgb_linear = nn.Linear(W//2, 3)
else:
self.output_linear = nn.Linear(W, output_ch)
self.pts_linears.apply(weights_init)
self.views_linears.apply(weights_init)
self.feature_linear.apply(weights_init)
self.alpha_linear.apply(weights_init)
self.rgb_linear.apply(weights_init)
def forward_alpha(self, x):
dim = x.shape[-1]
in_ch_feat = dim-self.in_ch_pts
input_pts, input_feats = torch.split(x, [self.in_ch_pts, in_ch_feat], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats)
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) * bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
alpha = torch.relu(self.alpha_linear(h))
return alpha
def forward(self, x):
dim = x.shape[-1]
in_ch_feat = dim-self.in_ch_pts-self.in_ch_views
input_pts, input_feats, input_views = torch.split(x, [self.in_ch_pts, in_ch_feat, self.in_ch_views], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats)
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) * bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
if self.use_viewdirs:
alpha = torch.relu(self.alpha_linear(h))
feature = self.feature_linear(h)
h = torch.cat([feature, input_views], -1)
for i, l in enumerate(self.views_linears):
h = self.views_linears[i](h)
h = F.relu(h)
rgb = torch.sigmoid(self.rgb_linear(h))
outputs = torch.cat([rgb, alpha], -1)
else:
outputs = self.output_linear(h)
return outputs
class Renderer_color_fusion(nn.Module):
def __init__(self, D=8, W=128, input_ch=3, input_ch_views=3, output_ch=4, input_ch_feat=8, skips=[4],use_viewdirs=False):
"""
"""
super(Renderer_color_fusion, self).__init__()
self.D = D
self.W = W
self.input_ch = input_ch
self.input_ch_views = input_ch_views
self.skips = skips
self.use_viewdirs = use_viewdirs
self.in_ch_pts, self.in_ch_views, self.in_ch_feat = input_ch, input_ch_views, input_ch_feat
self.pts_linears = nn.ModuleList(
[nn.Linear(input_ch, W, bias=True)] + [
nn.Linear(W, W, bias=True) if i not in self.skips else nn.Linear(W + input_ch, W) for i in
range(D - 1)])
self.pts_bias = nn.Linear(input_ch_feat, W)
        attention_dim = 16 + 3 + self.in_ch_views//3  # 16 + rgb dim + angle dim
        self.ray_attention = MultiHeadAttention(4, attention_dim, 4, 4)
        if use_viewdirs:
            self.feature_linear = nn.Sequential(nn.Linear(W, 16), nn.ReLU())
            self.alpha_linear = nn.Sequential(nn.Linear(W, 1), nn.ReLU())
            self.rgb_out = nn.Sequential(nn.Linear(attention_dim, 3), nn.Sigmoid())
else:
self.output_linear = nn.Linear(W, output_ch)
self.pts_linears.apply(weights_init)
self.feature_linear.apply(weights_init)
self.alpha_linear.apply(weights_init)
self.rgb_out.apply(weights_init)
def forward_alpha(self,x):
input_pts, input_feats = torch.split(x, [self.in_ch_pts, self.in_ch_feat], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats)
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) * bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
alpha = self.alpha_linear(h)
return alpha
def forward(self, x):
dim = x.shape[-1]
in_ch_feat = dim - self.in_ch_pts - self.in_ch_views
input_pts, input_feats, input_views = torch.split(x, [self.in_ch_pts, in_ch_feat, self.in_ch_views], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats)
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) * bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
alpha = self.alpha_linear(h)
# color
input_views = input_views.reshape(-1, 3, self.in_ch_views//3)
rgb = input_feats[..., 8:].reshape(-1, 3, 4)
rgb_in = rgb[..., :3]
N = rgb.shape[0]
feature = self.feature_linear(h)
h = feature.reshape(N, 1, -1).expand(-1, 3, -1)
h = torch.cat((h, input_views, rgb_in), dim=-1)
h, _ = self.ray_attention(h, h, h, mask=rgb[...,-1:])
rgb = self.rgb_out(h)
rgb = torch.sum(rgb , dim=1).reshape(*alpha.shape[:2], 3)
outputs = torch.cat([rgb, alpha], -1)
return outputs
class Renderer_attention2(nn.Module):
def __init__(self, D=8, W=256, input_ch=3, input_ch_views=3, output_ch=4, input_ch_feat=8, skips=[4], use_viewdirs=False):
"""
"""
        super(Renderer_attention2, self).__init__()
self.D = D
self.W = W
self.input_ch = input_ch
self.input_ch_views = input_ch_views
self.skips = skips
self.use_viewdirs = use_viewdirs
self.in_ch_pts, self.in_ch_views, self.in_ch_feat = input_ch, input_ch_views, input_ch_feat
        self.attention_dim = 4 + 8
        self.color_attention = MultiHeadAttention(4, self.attention_dim, 4, 4)
        self.weight_out = nn.Linear(self.attention_dim, 3)
self.pts_linears = nn.ModuleList(
[nn.Linear(self.in_ch_pts, W, bias=True)] + [nn.Linear(W, W, bias=True) if i not in self.skips else nn.Linear(W + self.in_ch_pts, W) for i in range(D-1)])
self.pts_bias = nn.Linear(11, W)
self.views_linears = nn.ModuleList([nn.Linear(input_ch_views + W, W//2)])
if use_viewdirs:
self.feature_linear = nn.Linear(W, W)
self.alpha_linear = nn.Linear(W, 1)
self.rgb_linear = nn.Linear(W//2, 3)
else:
self.output_linear = nn.Linear(W, output_ch)
self.pts_linears.apply(weights_init)
self.views_linears.apply(weights_init)
self.feature_linear.apply(weights_init)
self.alpha_linear.apply(weights_init)
self.rgb_linear.apply(weights_init)
def forward(self, x):
N_ray, N_sample, dim = x.shape
in_ch_feat = dim-self.in_ch_pts-self.in_ch_views
input_pts, input_feats, input_views = torch.split(x, [self.in_ch_pts, in_ch_feat, self.in_ch_views], dim=-1)
if input_feats.shape[-1]>8+3:
colors = input_feats[...,8:].view(N_ray*N_sample,-1,4)
weight = torch.cat((colors,input_feats[...,:8].reshape(N_ray*N_sample, 1, -1).expand(-1, colors.shape[-2], -1)),dim=-1)
weight, _ = self.color_attention(weight, weight, weight)
colors = torch.sum(self.weight_out(weight),dim=-2).view(N_ray, N_sample, -1)
# colors = self.weight_out(input_feats)
else:
colors = input_feats[...,-3:]
h = input_pts
# bias = self.pts_bias(colors)
bias = self.pts_bias(torch.cat((input_feats[...,:8],colors),dim=-1))
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) * bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
if self.use_viewdirs:
alpha = torch.relu(self.alpha_linear(h))
feature = self.feature_linear(h)
h = torch.cat([feature, input_views], -1)
for i, l in enumerate(self.views_linears):
h = self.views_linears[i](h)
h = F.relu(h)
rgb = torch.sigmoid(self.rgb_linear(h))
outputs = torch.cat([rgb, alpha], -1)
else:
outputs = self.output_linear(h)
outputs = torch.cat((outputs,colors), dim=-1)
return outputs
class Renderer_attention(nn.Module):
def __init__(self, D=8, W=256, input_ch=3, input_ch_views=3, output_ch=4, input_ch_feat=8, skips=[4], use_viewdirs=False):
"""
"""
super(Renderer_attention, self).__init__()
self.D = D
self.W = W
self.input_ch = input_ch
self.input_ch_views = input_ch_views
self.skips = skips
self.use_viewdirs = use_viewdirs
self.in_ch_pts, self.in_ch_views, self.in_ch_feat = input_ch, input_ch_views, input_ch_feat
        self.attention_dim = 4 + 8
        self.color_attention = MultiHeadAttention(4, self.attention_dim, 4, 4)
        self.weight_out = nn.Linear(self.attention_dim, 3)
# self.weight_out = nn.Linear(self.in_ch_feat, 8)
self.pts_linears = nn.ModuleList(
[nn.Linear(self.in_ch_pts, W, bias=True)] + [nn.Linear(W, W, bias=True)]*(D-1))
self.pts_bias = nn.Linear(11, W)
self.views_linears = nn.ModuleList([nn.Linear(input_ch_views + W, W//2)])
if use_viewdirs:
self.feature_linear = nn.Linear(W, W)
self.alpha_linear = nn.Linear(W, 1)
self.rgb_linear = nn.Linear(W//2, 3)
else:
self.output_linear = nn.Linear(W, output_ch)
self.pts_linears.apply(weights_init)
self.views_linears.apply(weights_init)
self.feature_linear.apply(weights_init)
self.alpha_linear.apply(weights_init)
self.rgb_linear.apply(weights_init)
def forward(self, x):
N_ray, N_sample, dim = x.shape
in_ch_feat = dim-self.in_ch_pts-self.in_ch_views
input_pts, input_feats, input_views = torch.split(x, [self.in_ch_pts, in_ch_feat, self.in_ch_views], dim=-1)
if input_feats.shape[-1]>8+3:
colors = input_feats[...,8:].view(N_ray*N_sample,-1,4)
weight = torch.cat((colors,input_feats[...,:8].reshape(N_ray*N_sample, 1, -1).expand(-1, colors.shape[-2], -1)),dim=-1)
weight, _ = self.color_attention(weight, weight, weight)
colors = torch.sum(torch.sigmoid(self.weight_out(weight)),dim=-2).view(N_ray, N_sample, -1)
# colors = self.weight_out(input_feats)
else:
colors = input_feats[...,-3:]
h = input_pts
# bias = self.pts_bias(colors)
bias = self.pts_bias(torch.cat((input_feats[...,:8],colors),dim=-1))
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) + bias
h = F.relu(h)
# if i in self.skips:
# h = torch.cat([input_pts, h], -1)
if self.use_viewdirs:
alpha = torch.relu(self.alpha_linear(h))
feature = self.feature_linear(h)
h = torch.cat([feature, input_views], -1)
for i, l in enumerate(self.views_linears):
h = self.views_linears[i](h)
h = F.relu(h)
rgb = torch.sigmoid(self.rgb_linear(h))
outputs = torch.cat([rgb, alpha, colors], -1)
else:
outputs = self.output_linear(h)
outputs = torch.cat((outputs,colors), dim=-1)
return outputs
class Renderer_linear(nn.Module):
def __init__(self, D=8, W=256, input_ch=3, input_ch_views=3, output_ch=4, input_ch_feat=8, skips=[4], use_viewdirs=False):
"""
"""
super(Renderer_linear, self).__init__()
self.D = D
self.W = W
self.input_ch = input_ch
self.input_ch_views = input_ch_views
self.skips = skips
self.use_viewdirs = use_viewdirs
self.in_ch_pts, self.in_ch_views, self.in_ch_feat = input_ch, input_ch_views, input_ch_feat
self.pts_linears = nn.ModuleList(
[nn.Linear(input_ch, W, bias=True)] + [nn.Linear(W, W, bias=True) if i not in self.skips else nn.Linear(W + input_ch, W) for i in range(D-1)])
self.pts_bias = nn.Linear(input_ch_feat, W)
self.views_linears = nn.ModuleList([nn.Linear(input_ch_views + W, W//2)])
if use_viewdirs:
self.feature_linear = nn.Linear(W, W)
self.alpha_linear = nn.Linear(W, 1)
self.rgb_linear = nn.Linear(W//2, 3)
else:
self.output_linear = nn.Linear(W, output_ch)
self.pts_linears.apply(weights_init)
self.views_linears.apply(weights_init)
self.feature_linear.apply(weights_init)
self.alpha_linear.apply(weights_init)
self.rgb_linear.apply(weights_init)
def forward_alpha(self,x):
dim = x.shape[-1]
input_pts, input_feats = torch.split(x, [self.in_ch_pts, self.in_ch_feat], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats)
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) + bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
alpha = self.alpha_linear(h)
return alpha
def forward(self, x):
dim = x.shape[-1]
in_ch_feat = dim-self.in_ch_pts-self.in_ch_views
input_pts, input_feats, input_views = torch.split(x, [self.in_ch_pts, in_ch_feat, self.in_ch_views], dim=-1)
h = input_pts
bias = self.pts_bias(input_feats) #if in_ch_feat == self.in_ch_feat else input_feats
for i, l in enumerate(self.pts_linears):
h = self.pts_linears[i](h) + bias
h = F.relu(h)
if i in self.skips:
h = torch.cat([input_pts, h], -1)
if self.use_viewdirs:
alpha = torch.relu(self.alpha_linear(h))
feature = self.feature_linear(h)
h = torch.cat([feature, input_views], -1)
for i, l in enumerate(self.views_linears):
h = self.views_linears[i](h)
h = F.relu(h)
rgb = torch.sigmoid(self.rgb_linear(h))
outputs = torch.cat([rgb, alpha], -1)
else:
outputs = self.output_linear(h)
return outputs
class MVSNeRF(nn.Module):
def __init__(self, D=8, W=256, input_ch_pts=3, input_ch_views=3, input_ch_feat=8, skips=[4], net_type='v2'):
"""
"""
super(MVSNeRF, self).__init__()
self.in_ch_pts, self.in_ch_views,self.in_ch_feat = input_ch_pts, input_ch_views, input_ch_feat
# we provide two version network structure
if 'v0' == net_type:
self.nerf = Renderer_ours(D=D, W=W,input_ch_feat=input_ch_feat,
input_ch=input_ch_pts, output_ch=4, skips=skips,
input_ch_views=input_ch_views, use_viewdirs=True)
elif 'v1' == net_type:
self.nerf = Renderer_attention(D=D, W=W,input_ch_feat=input_ch_feat,
input_ch=input_ch_pts, output_ch=4, skips=skips,
input_ch_views=input_ch_views, use_viewdirs=True)
elif 'v2' == net_type:
self.nerf = Renderer_linear(D=D, W=W,input_ch_feat=input_ch_feat,
input_ch=input_ch_pts, output_ch=4, skips=skips,
input_ch_views=input_ch_views, use_viewdirs=True)
def forward_alpha(self, x):
return self.nerf.forward_alpha(x)
def forward(self, x):
RGBA = self.nerf(x)
return RGBA
def create_nerf_mvs(args, pts_embedder=True, use_mvs=False, dir_embedder=True):
"""Instantiate mvs NeRF's MLP model.
"""
if pts_embedder:
embed_fn, input_ch = get_embedder(args.multires, args.i_embed, input_dims=args.pts_dim)
else:
embed_fn, input_ch = None, args.pts_dim
embeddirs_fn = None
if dir_embedder:
embeddirs_fn, input_ch_views = get_embedder(args.multires_views, args.i_embed, input_dims=args.dir_dim)
else:
embeddirs_fn, input_ch_views = None, args.dir_dim
skips = [4]
model = MVSNeRF(D=args.netdepth, W=args.netwidth,
input_ch_pts=input_ch, skips=skips,
input_ch_views=input_ch_views, input_ch_feat=args.feat_dim, net_type=args.net_type).to(device)
grad_vars = []
grad_vars += list(model.parameters())
model_fine = None
if args.N_importance > 0:
model_fine = MVSNeRF(D=args.netdepth, W=args.netwidth,
input_ch_pts=input_ch, skips=skips,
input_ch_views=input_ch_views, input_ch_feat=args.feat_dim).to(device)
grad_vars += list(model_fine.parameters())
network_query_fn = lambda pts, viewdirs, rays_feats, network_fn: run_network_mvs(pts, viewdirs, rays_feats, network_fn,
embed_fn=embed_fn,
embeddirs_fn=embeddirs_fn,
netchunk=args.netchunk)
EncodingNet = None
if use_mvs:
EncodingNet = MVSNet().to(device)
        grad_vars += list(EncodingNet.parameters())  # MVS encoder parameters are optimized jointly with the NeRF MLP
start = 0
##########################
# Load checkpoints
ckpts = []
if args.ckpt is not None and args.ckpt != 'None':
ckpts = [args.ckpt]
print('Found ckpts', ckpts)
if len(ckpts) > 0 :
ckpt_path = ckpts[-1]
print('Reloading from', ckpt_path)
ckpt = torch.load(ckpt_path)
# Load model
if use_mvs:
state_dict = ckpt['network_mvs_state_dict']
EncodingNet.load_state_dict(state_dict)
model.load_state_dict(ckpt['network_fn_state_dict'])
# if model_fine is not None:
# model_fine.load_state_dict(ckpt['network_fine_state_dict'])
##########################
render_kwargs_train = {
'network_query_fn': network_query_fn,
'perturb': args.perturb,
'N_importance': args.N_importance,
'network_fine': model_fine,
'N_samples': args.N_samples,
'network_fn': model,
'network_mvs': EncodingNet,
'use_viewdirs': args.use_viewdirs,
'white_bkgd': args.white_bkgd,
'raw_noise_std': args.raw_noise_std,
}
render_kwargs_test = {k: render_kwargs_train[k] for k in render_kwargs_train}
render_kwargs_test['perturb'] = False
return render_kwargs_train, render_kwargs_test, start, grad_vars
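# Note: render_kwargs_train and render_kwargs_test differ only in 'perturb'
# (stratified-sampling jitter is disabled at test time), and grad_vars
# collects NeRF-MLP and MVS-encoder parameters into one optimizer group.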
def create_nerf_mvs_debug(args, pts_embedder=True, use_mvs=False, dir_embedder=True):
"""Instantiate mvs NeRF's MLP model.
"""
if pts_embedder:
embed_fn, input_ch = get_embedder(args.multires, args.i_embed, input_dims=args.pts_dim)
else:
embed_fn, input_ch = None, args.pts_dim
embeddirs_fn = None
if dir_embedder:
embeddirs_fn, input_ch_views = get_embedder(args.multires_views, args.i_embed, input_dims=args.dir_dim)
else:
embeddirs_fn, input_ch_views = None, args.dir_dim
skips = [4]
model = MVSNeRF(D=args.netdepth, W=args.netwidth,
input_ch_pts=input_ch, skips=skips,
input_ch_views=input_ch_views, input_ch_feat=args.feat_dim, net_type=args.net_type).to(device)
grad_vars = []
grad_vars += list(model.parameters())
model_fine = None
if args.N_importance > 0:
model_fine = MVSNeRF(D=args.netdepth, W=args.netwidth,
input_ch_pts=input_ch, skips=skips,
input_ch_views=input_ch_views, input_ch_feat=args.feat_dim).to(device)
grad_vars += list(model_fine.parameters())
network_query_fn = lambda pts, viewdirs, rays_feats, network_fn: run_network_mvs(pts, viewdirs, rays_feats, network_fn,
embed_fn=embed_fn,
embeddirs_fn=embeddirs_fn,
netchunk=args.netchunk)
EncodingNet = None
if use_mvs:
EncodingNet = MVSNet_debug().to(device)
        grad_vars += list(EncodingNet.parameters())  # MVS encoder parameters are optimized jointly with the NeRF MLP
start = 0
##########################
# Load checkpoints
ckpts = []
if args.ckpt is not None and args.ckpt != 'None':
ckpts = [args.ckpt]
print('Found ckpts', ckpts)
if len(ckpts) > 0 :
ckpt_path = ckpts[-1]
print('Reloading from', ckpt_path)
ckpt = torch.load(ckpt_path)
# Load model
if use_mvs:
state_dict = ckpt['network_mvs_state_dict']
EncodingNet.load_state_dict(state_dict)
model.load_state_dict(ckpt['network_fn_state_dict'])
# if model_fine is not None:
# model_fine.load_state_dict(ckpt['network_fine_state_dict'])
##########################
render_kwargs_train = {
'network_query_fn': network_query_fn,
'perturb': args.perturb,
'N_importance': args.N_importance,
'network_fine': model_fine,
'N_samples': args.N_samples,
'network_fn': model,
'network_mvs': EncodingNet,
'use_viewdirs': args.use_viewdirs,
'white_bkgd': args.white_bkgd,
'raw_noise_std': args.raw_noise_std,
}
render_kwargs_test = {k: render_kwargs_train[k] for k in render_kwargs_train}
render_kwargs_test['perturb'] = False
return render_kwargs_train, render_kwargs_test, start, grad_vars
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
############################################# MVS Net models ################################################
class ConvBnReLU(nn.Module):
def __init__(self, in_channels, out_channels,
kernel_size=3, stride=1, pad=1,
norm_act=InPlaceABN):
super(ConvBnReLU, self).__init__()
self.conv = nn.Conv2d(in_channels, out_channels,
kernel_size, stride=stride, padding=pad, bias=False)
self.bn = norm_act(out_channels)
def forward(self, x):
return self.bn(self.conv(x))
class ConvBnReLU3D(nn.Module):
def __init__(self, in_channels, out_channels,
kernel_size=3, stride=1, pad=1,
norm_act=InPlaceABN):
super(ConvBnReLU3D, self).__init__()
self.conv = nn.Conv3d(in_channels, out_channels,
kernel_size, stride=stride, padding=pad, bias=False)
self.bn = norm_act(out_channels)
# self.bn = nn.ReLU()
def forward(self, x):
return self.bn(self.conv(x))
################################### feature net ######################################
class FeatureNet(nn.Module):
"""
output 3 levels of features using a FPN structure
"""
def __init__(self, norm_act=InPlaceABN):
super(FeatureNet, self).__init__()
self.conv0 = nn.Sequential(
ConvBnReLU(3, 8, 3, 1, 1, norm_act=norm_act),
ConvBnReLU(8, 8, 3, 1, 1, norm_act=norm_act))
self.conv1 = nn.Sequential(
ConvBnReLU(8, 16, 5, 2, 2, norm_act=norm_act),
ConvBnReLU(16, 16, 3, 1, 1, norm_act=norm_act),
ConvBnReLU(16, 16, 3, 1, 1, norm_act=norm_act))
self.conv2 = nn.Sequential(
ConvBnReLU(16, 32, 5, 2, 2, norm_act=norm_act),
ConvBnReLU(32, 32, 3, 1, 1, norm_act=norm_act),
ConvBnReLU(32, 32, 3, 1, 1, norm_act=norm_act))
self.toplayer = nn.Conv2d(32, 32, 1)
def _upsample_add(self, x, y):
return F.interpolate(x, scale_factor=2,
mode="bilinear", align_corners=True) + y
def forward(self, x):
# x: (B, 3, H, W)
x = self.conv0(x) # (B, 8, H, W)
x = self.conv1(x) # (B, 16, H//2, W//2)
x = self.conv2(x) # (B, 32, H//4, W//4)
x = self.toplayer(x) # (B, 32, H//4, W//4)
return x
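# Shape sketch (illustrative, not part of the original module; assumes the
# inplace_abn CUDA extension is available): two stride-2 convolutions mean
# a (B, 3, H, W) image comes out as a (B, 32, H//4, W//4) feature map.
def _featurenet_shape_example():
    net = FeatureNet().cuda()
    x = torch.rand(1, 3, 128, 160, device='cuda')
    return net(x).shape == (1, 32, 32, 40)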
class CostRegNet(nn.Module):
def __init__(self, in_channels, norm_act=InPlaceABN):
super(CostRegNet, self).__init__()
self.conv0 = ConvBnReLU3D(in_channels, 8, norm_act=norm_act)
self.conv1 = ConvBnReLU3D(8, 16, stride=2, norm_act=norm_act)
self.conv2 = ConvBnReLU3D(16, 16, norm_act=norm_act)
self.conv3 = ConvBnReLU3D(16, 32, stride=2, norm_act=norm_act)
self.conv4 = ConvBnReLU3D(32, 32, norm_act=norm_act)
self.conv5 = ConvBnReLU3D(32, 64, stride=2, norm_act=norm_act)
self.conv6 = ConvBnReLU3D(64, 64, norm_act=norm_act)
self.conv7 = nn.Sequential(
nn.ConvTranspose3d(64, 32, 3, padding=1, output_padding=1,
stride=2, bias=False),
norm_act(32))
self.conv9 = nn.Sequential(
nn.ConvTranspose3d(32, 16, 3, padding=1, output_padding=1,
stride=2, bias=False),
norm_act(16))
self.conv11 = nn.Sequential(
nn.ConvTranspose3d(16, 8, 3, padding=1, output_padding=1,
stride=2, bias=False),
norm_act(8))
# self.conv12 = nn.Conv3d(8, 8, 3, stride=1, padding=1, bias=True)
def forward(self, x):
conv0 = self.conv0(x)
conv2 = self.conv2(self.conv1(conv0))
conv4 = self.conv4(self.conv3(conv2))
x = self.conv6(self.conv5(conv4))
x = conv4 + self.conv7(x)
del conv4
x = conv2 + self.conv9(x)
del conv2
x = conv0 + self.conv11(x)
del conv0
# x = self.conv12(x)
return x
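# Shape sketch (illustrative, not part of the original module): CostRegNet
# is a 3D U-Net with three stride-2 levels, so (D, h, w) must be divisible
# by 8; it keeps the volume resolution while compressing channels to 8.
def _costreg_shape_example():
    reg = CostRegNet(in_channels=41).cuda()              # 9 image + 32 variance channels
    vol = torch.rand(1, 41, 128, 32, 40, device='cuda')  # (B, C, D, h, w)
    return reg(vol).shape == (1, 8, 128, 32, 40)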
class MVSNet(nn.Module):
def __init__(self,
num_groups=1,
norm_act=InPlaceABN,
levels=1):
super(MVSNet, self).__init__()
        self.levels = levels  # number of depth levels; n_depths below holds the per-level hypothesis counts
self.n_depths = [128,32,8]
self.G = num_groups # number of groups in groupwise correlation
self.feature = FeatureNet()
self.N_importance = 0
self.chunk = 1024
self.cost_reg_2 = CostRegNet(32+9, norm_act)
def build_volume_costvar(self, feats, proj_mats, depth_values, pad=0):
# feats: (B, V, C, H, W)
# proj_mats: (B, V, 3, 4)
# depth_values: (B, D, H, W)
# cost_reg: nn.Module of input (B, C, D, h, w) and output (B, 1, D, h, w)
# volume_sum [B, G, D, h, w]
# prob_volume [B D H W]
# volume_feature [B C D H W]
B, V, C, H, W = feats.shape
D = depth_values.shape[1]
ref_feats, src_feats = feats[:, 0], feats[:, 1:]
src_feats = src_feats.permute(1, 0, 2, 3, 4) # (V-1, B, C, h, w)
proj_mats = proj_mats[:, 1:]
proj_mats = proj_mats.permute(1, 0, 2, 3) # (V-1, B, 3, 4)
if pad > 0:
ref_feats = F.pad(ref_feats, (pad, pad, pad, pad), "constant", 0)
ref_volume = ref_feats.unsqueeze(2).repeat(1, 1, D, 1, 1) # (B, C, D, h, w)
volume_sum = ref_volume
volume_sq_sum = ref_volume ** 2
del ref_feats
in_masks = torch.ones((B, 1, D, H + pad * 2, W + pad * 2), device=volume_sum.device)
for i, (src_feat, proj_mat) in enumerate(zip(src_feats, proj_mats)):
warped_volume, grid = homo_warp(src_feat, proj_mat, depth_values, pad=pad)
grid = grid.view(B, 1, D, H + pad * 2, W + pad * 2, 2)
in_mask = ((grid > -1.0) * (grid < 1.0))
in_mask = (in_mask[..., 0] * in_mask[..., 1])
in_masks += in_mask.float()
if self.training:
volume_sum = volume_sum + warped_volume
volume_sq_sum = volume_sq_sum + warped_volume ** 2
else:
volume_sum += warped_volume
volume_sq_sum += warped_volume.pow_(2)
del warped_volume, src_feat, proj_mat
del src_feats, proj_mats
count = 1.0 / in_masks
img_feat = volume_sq_sum * count - (volume_sum * count) ** 2
del volume_sq_sum, volume_sum, count
return img_feat, in_masks
def build_volume_costvar_img(self, imgs, feats, proj_mats, depth_values, pad=0):
# feats: (B, V, C, H, W)
# proj_mats: (B, V, 3, 4)
# depth_values: (B, D, H, W)
# cost_reg: nn.Module of input (B, C, D, h, w) and output (B, 1, D, h, w)
# volume_sum [B, G, D, h, w]
# prob_volume [B D H W]
# volume_feature [B C D H W]
B, V, C, H, W = feats.shape
D = depth_values.shape[1]
ref_feats, src_feats = feats[:, 0], feats[:, 1:]
src_feats = src_feats.permute(1, 0, 2, 3, 4) # (V-1, B, C, h, w)
proj_mats = proj_mats[:, 1:]
proj_mats = proj_mats.permute(1, 0, 2, 3) # (V-1, B, 3, 4)
if pad > 0:
ref_feats = F.pad(ref_feats, (pad, pad, pad, pad), "constant", 0)
img_feat = torch.empty((B, 9 + 32, D, *ref_feats.shape[-2:]), device=feats.device, dtype=torch.float)
imgs = F.interpolate(imgs.view(B * V, *imgs.shape[2:]), (H, W), mode='bilinear', align_corners=False).view(B, V,-1,H,W).permute(1, 0, 2, 3, 4)
img_feat[:, :3, :, pad:H + pad, pad:W + pad] = imgs[0].unsqueeze(2).expand(-1, -1, D, -1, -1)
ref_volume = ref_feats.unsqueeze(2).repeat(1, 1, D, 1, 1) # (B, C, D, h, w)
volume_sum = ref_volume
volume_sq_sum = ref_volume ** 2
del ref_feats
in_masks = torch.ones((B, V, D, H + pad * 2, W + pad * 2), device=volume_sum.device)
for i, (src_img, src_feat, proj_mat) in enumerate(zip(imgs[1:], src_feats, proj_mats)):
warped_volume, grid = homo_warp(src_feat, proj_mat, depth_values, pad=pad)
img_feat[:, (i + 1) * 3:(i + 2) * 3], _ = homo_warp(src_img, proj_mat, depth_values, src_grid=grid, pad=pad)
grid = grid.view(B, 1, D, H + pad * 2, W + pad * 2, 2)
in_mask = ((grid > -1.0) * (grid < 1.0))
in_mask = (in_mask[..., 0] * in_mask[..., 1])
in_masks[:, i + 1] = in_mask.float()
if self.training:
volume_sum = volume_sum + warped_volume
volume_sq_sum = volume_sq_sum + warped_volume ** 2
else:
volume_sum += warped_volume
volume_sq_sum += warped_volume.pow_(2)
del warped_volume, src_feat, proj_mat
del src_feats, proj_mats
count = 1.0 / torch.sum(in_masks, dim=1, keepdim=True)
img_feat[:, -32:] = volume_sq_sum * count - (volume_sum * count) ** 2
del volume_sq_sum, volume_sum, count
return img_feat, in_masks
def forward(self, imgs, proj_mats, near_far, pad=0, return_color=False, lindisp=False):
# imgs: (B, V, 3, H, W)
# proj_mats: (B, V, 3, 4) from fine to coarse
# init_depth_min, depth_interval: (B) or float
# near_far (B, V, 2)
B, V, _, H, W = imgs.shape
imgs = imgs.reshape(B * V, 3, H, W)
feats = self.feature(imgs) # (B*V, 8, H, W), (B*V, 16, H//2, W//2), (B*V, 32, H//4, W//4)
imgs = imgs.view(B, V, 3, H, W)
feats_l = feats # (B*V, C, h, w)
feats_l = feats_l.view(B, V, *feats_l.shape[1:]) # (B, V, C, h, w)
D = 128
        t_vals = torch.linspace(0., 1., steps=D, device=imgs.device, dtype=imgs.dtype)  # (D,)
near, far = near_far # assume batch size==1
if not lindisp:
depth_values = near * (1.-t_vals) + far * (t_vals)
else:
depth_values = 1. / (1. / near * (1. - t_vals) + 1. / far * (t_vals))
depth_values = depth_values.unsqueeze(0)
# volume_feat, in_masks = self.build_volume_costvar(feats_l, proj_mats, depth_values, pad=pad)
volume_feat, in_masks = self.build_volume_costvar_img(imgs, feats_l, proj_mats, depth_values, pad=pad)
if return_color:
feats_l = torch.cat((volume_feat[:,:V*3].view(B, V, 3, *volume_feat.shape[2:]),in_masks.unsqueeze(2)),dim=2)
volume_feat = self.cost_reg_2(volume_feat) # (B, 1, D, h, w)
volume_feat = volume_feat.reshape(1,-1,*volume_feat.shape[2:])
return volume_feat, feats_l, depth_values
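# Output-shape summary for MVSNet.forward (descriptive note, not part of
# the original source; batch size B is assumed to be 1, as in the final
# reshape above):
#   feats_l:      (B, V, 32, H//4, W//4) per-view features, or
#                 (B, V, 4, D, h, w) warped colors + masks when return_color
#   depth_values: (1, 128) hypotheses spaced between near and far
#   volume_feat:  (1, 8, D, h, w) regularized cost volume at quarter input
#                 resolution (plus any pad)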
class MVSNet_debug(nn.Module):
def __init__(self,
num_groups=1,
norm_act=InPlaceABN,
levels=1):
super(MVSNet_debug, self).__init__()
        self.levels = levels  # number of depth levels; n_depths below holds the per-level hypothesis counts
self.n_depths = [128,32,8]
self.G = num_groups # number of groups in groupwise correlation
self.feature = FeatureNet()
self.N_importance = 0
self.chunk = 1024
self.cost_reg_2 = CostRegNet(32+9, norm_act)
def build_volume_costvar(self, feats, proj_mats, depth_values, pad=0):
# feats: (B, V, C, H, W)
# proj_mats: (B, V, 3, 4)
# depth_values: (B, D, H, W)
# cost_reg: nn.Module of input (B, C, D, h, w) and output (B, 1, D, h, w)
# volume_sum [B, G, D, h, w]
# prob_volume [B D H W]
# volume_feature [B C D H W]
B, V, C, H, W = feats.shape
D = depth_values.shape[1]
ref_feats, src_feats = feats[:, 0], feats[:, 1:]
src_feats = src_feats.permute(1, 0, 2, 3, 4) # (V-1, B, C, h, w)
proj_mats = proj_mats[:, 1:]
proj_mats = proj_mats.permute(1, 0, 2, 3) # (V-1, B, 3, 4)
if pad > 0:
ref_feats = F.pad(ref_feats, (pad, pad, pad, pad), "constant", 0)
ref_volume = ref_feats.unsqueeze(2).repeat(1, 1, D, 1, 1) # (B, C, D, h, w)
volume_sum = ref_volume
volume_sq_sum = ref_volume ** 2
del ref_feats
in_masks = torch.ones((B, 1, D, H + pad * 2, W + pad * 2), device=volume_sum.device)
for i, (src_feat, proj_mat) in enumerate(zip(src_feats, proj_mats)):
warped_volume, grid = homo_warp_debug(src_feat, proj_mat, depth_values, pad=pad)
grid = grid.view(B, 1, D, H + pad * 2, W + pad * 2, 2)
in_mask = ((grid > -1.0) * (grid < 1.0))
in_mask = (in_mask[..., 0] * in_mask[..., 1])
in_masks += in_mask.float()
if self.training:
volume_sum = volume_sum + warped_volume
volume_sq_sum = volume_sq_sum + warped_volume ** 2
else:
volume_sum += warped_volume
volume_sq_sum += warped_volume.pow_(2)
del warped_volume, src_feat, proj_mat
del src_feats, proj_mats
count = 1.0 / in_masks
img_feat = volume_sq_sum * count - (volume_sum * count) ** 2
del volume_sq_sum, volume_sum, count
return img_feat, in_masks
def build_volume_costvar_img(self, imgs, feats, proj_mats, depth_values, pad=0):
# feats: (B, V, C, H, W)
# proj_mats: (B, V, 3, 4)
# depth_values: (B, D, H, W)
# cost_reg: nn.Module of input (B, C, D, h, w) and output (B, 1, D, h, w)
# volume_sum [B, G, D, h, w]
# prob_volume [B D H W]
# volume_feature [B C D H W]
B, V, C, H, W = feats.shape
D = depth_values.shape[1]
ref_feats, src_feats = feats[:, 0], feats[:, 1:]
src_feats = src_feats.permute(1, 0, 2, 3, 4) # (V-1, B, C, h, w)
proj_mats = proj_mats[:, 1:]
proj_mats = proj_mats.permute(1, 0, 2, 3) # (V-1, B, 3, 4)
if pad > 0:
ref_feats = F.pad(ref_feats, (pad, pad, pad, pad), "constant", 0)
img_feat = torch.empty((B, 9 + 32, D, *ref_feats.shape[-2:]), device=feats.device, dtype=torch.float)
imgs = F.interpolate(imgs.view(B * V, *imgs.shape[2:]), (H, W), mode='bilinear', align_corners=False).view(B, V,-1,H,W).permute(1, 0, 2, 3, 4)
img_feat[:, :3, :, pad:H + pad, pad:W + pad] = imgs[0].unsqueeze(2).expand(-1, -1, D, -1, -1)
ref_volume = ref_feats.unsqueeze(2).repeat(1, 1, D, 1, 1) # (B, C, D, h, w)
volume_sum = ref_volume
volume_sq_sum = ref_volume ** 2
del ref_feats
in_masks = torch.ones((B, V, D, H + pad * 2, W + pad * 2), device=volume_sum.device)
for i, (src_img, src_feat, proj_mat) in enumerate(zip(imgs[1:], src_feats, proj_mats)):
warped_volume, grid = homo_warp(src_feat, proj_mat, depth_values, pad=pad)
img_feat[:, (i + 1) * 3:(i + 2) * 3], _ = homo_warp(src_img, proj_mat, depth_values, src_grid=grid, pad=pad)
grid = grid.view(B, 1, D, H + pad * 2, W + pad * 2, 2)
in_mask = ((grid > -1.0) * (grid < 1.0))
in_mask = (in_mask[..., 0] * in_mask[..., 1])
in_masks[:, i + 1] = in_mask.float()
if self.training:
volume_sum = volume_sum + warped_volume
volume_sq_sum = volume_sq_sum + warped_volume ** 2
else:
volume_sum += warped_volume
volume_sq_sum += warped_volume.pow_(2)
del warped_volume, src_feat, proj_mat
del src_feats, proj_mats
count = 1.0 / torch.sum(in_masks, dim=1, keepdim=True)
img_feat[:, -32:] = volume_sq_sum * count - (volume_sum * count) ** 2
del volume_sq_sum, volume_sum, count
return img_feat, in_masks
def build_volume_costvar_img_debug(self, imgs, feats, proj_mats, depth_values, pad=0):
# feats: (B, V, C, H, W)
# proj_mats: (B, V, 3, 4)
# depth_values: (B, D, H, W)
# cost_reg: nn.Module of input (B, C, D, h, w) and output (B, 1, D, h, w)
# volume_sum [B, G, D, h, w]
# prob_volume [B D H W]
# volume_feature [B C D H W]
B, V, C, H, W = feats.shape
D = depth_values.shape[1]
ref_feats, src_feats = feats[:, 0], feats[:, 1:]
src_feats = src_feats.permute(1, 0, 2, 3, 4) # (V-1, B, C, h, w)
proj_mats = proj_mats[:, 1:]
proj_mats = proj_mats.permute(1, 0, 2, 3) # (V-1, B, 3, 4)
ForkedPdb().set_trace()
if pad > 0:
ref_feats = F.pad(ref_feats, (pad, pad, pad, pad), "constant", 0)
img_feat = torch.empty((B, 9 + 32, D, *ref_feats.shape[-2:]), device=feats.device, dtype=torch.float)
imgs = F.interpolate(imgs.view(B * V, *imgs.shape[2:]), (H, W), mode='bilinear', align_corners=False).view(B, V,-1,H,W).permute(1, 0, 2, 3, 4)
img_feat[:, :3, :, pad:H + pad, pad:W + pad] = imgs[0].unsqueeze(2).expand(-1, -1, D, -1, -1)
ref_volume = ref_feats.unsqueeze(2).repeat(1, 1, D, 1, 1) # (B, C, D, h, w)
volume_sum = ref_volume
volume_sq_sum = ref_volume ** 2
del ref_feats
in_masks = torch.ones((B, V, D, H + pad * 2, W + pad * 2), device=volume_sum.device)
for i, (src_img, src_feat, proj_mat) in enumerate(zip(imgs[1:], src_feats, proj_mats)):
warped_volume, grid = homo_warp_debug(src_feat, proj_mat, depth_values, pad=pad)
img_feat[:, (i + 1) * 3:(i + 2) * 3], _ = homo_warp_debug(src_img, proj_mat, depth_values, src_grid=grid, pad=pad)
grid = grid.view(B, 1, D, H + pad * 2, W + pad * 2, 2)
in_mask = ((grid > -1.0) * (grid < 1.0))
in_mask = (in_mask[..., 0] * in_mask[..., 1])
in_masks[:, i + 1] = in_mask.float()
if self.training:
volume_sum = volume_sum + warped_volume
volume_sq_sum = volume_sq_sum + warped_volume ** 2
else:
volume_sum += warped_volume
volume_sq_sum += warped_volume.pow_(2)
del warped_volume, src_feat, proj_mat
del src_feats, proj_mats
count = 1.0 / torch.sum(in_masks, dim=1, keepdim=True)
img_feat[:, -32:] = volume_sq_sum * count - (volume_sum * count) ** 2
del volume_sq_sum, volume_sum, count
return img_feat, in_masks
def forward(self, imgs, proj_mats, near_far, pad=0, return_color=False, lindisp=False):
# imgs: (B, V, 3, H, W)
# proj_mats: (B, V, 3, 4) from fine to coarse
# init_depth_min, depth_interval: (B) or float
# near_far (B, V, 2)
B, V, _, H, W = imgs.shape
imgs = imgs.reshape(B * V, 3, H, W)
feats = self.feature(imgs) # (B*V, 8, H, W), (B*V, 16, H//2, W//2), (B*V, 32, H//4, W//4)
imgs = imgs.view(B, V, 3, H, W)
feats_l = feats # (B*V, C, h, w)
feats_l = feats_l.view(B, V, *feats_l.shape[1:]) # (B, V, C, h, w)
D = 128
        t_vals = torch.linspace(0., 1., steps=D, device=imgs.device, dtype=imgs.dtype)  # (D,)
near, far = near_far # assume batch size==1
if not lindisp:
depth_values = near * (1.-t_vals) + far * (t_vals)
else:
depth_values = 1. / (1. / near * (1. - t_vals) + 1. / far * (t_vals))
depth_values = depth_values.unsqueeze(0)
# volume_feat, in_masks = self.build_volume_costvar(feats_l, proj_mats, depth_values, pad=pad)
volume_feat, in_masks = self.build_volume_costvar_img_debug(imgs, feats_l, proj_mats, depth_values, pad=pad)
if return_color:
feats_l = torch.cat((volume_feat[:,:V*3].view(B, V, 3, *volume_feat.shape[2:]),in_masks.unsqueeze(2)),dim=2)
volume_feat = self.cost_reg_2(volume_feat) # (B, 1, D, h, w)
volume_feat = volume_feat.reshape(1,-1,*volume_feat.shape[2:])
return volume_feat, feats_l, depth_values
class RefVolume(nn.Module):
def __init__(self, volume):
super(RefVolume, self).__init__()
self.feat_volume = nn.Parameter(volume)
def forward(self, ray_coordinate_ref):
'''coordinate: [N, 3]
z,x,y
'''
device = self.feat_volume.device
H, W = ray_coordinate_ref.shape[-3:-1]
grid = ray_coordinate_ref.view(-1, 1, H, W, 3).to(device) * 2 - 1.0 # [1 1 H W 3] (x,y,z)
features = F.grid_sample(self.feat_volume, grid, align_corners=True, mode='bilinear')[:, :, 0].permute(2, 3, 0,1).squeeze()
return features
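# Usage sketch (illustrative; `volume_feat` is assumed to come from
# MVSNet.forward): RefVolume stores the encoded volume as a trainable
# parameter so per-scene fine-tuning can optimize it directly; query points
# in normalized reference coordinates (values in [0, 1], remapped to
# [-1, 1] for grid_sample) are trilinearly sampled out of it.
def _refvolume_query_example(volume_feat):
    vol = RefVolume(volume_feat.detach())                        # (1, C, D, h, w)
    coords = torch.rand(64, 128, 3, device=volume_feat.device)   # (H, W, 3)
    return vol(coords)                                           # (H, W, C) features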
| 37.542678 | 166 | 0.572379 | 7,119 | 47,942 | 3.613148 | 0.053238 | 0.030752 | 0.016484 | 0.010691 | 0.822409 | 0.804175 | 0.792434 | 0.784698 | 0.778011 | 0.778011 | 0 | 0.026579 | 0.290559 | 47,942 | 1,276 | 167 | 37.5721 | 0.729684 | 0.082183 | 0 | 0.742232 | 0 | 0 | 0.014368 | 0.001977 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055236 | false | 0 | 0.01496 | 0.004603 | 0.124281 | 0.004603 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c75327b1b3f6610aa7f75279ad1cc406be5f6e28 | 25,682 | py | Python | src/datashare/azext_datashare/manual/custom.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 207 | 2017-11-29T06:59:41.000Z | 2022-03-31T10:00:53.000Z | src/datashare/azext_datashare/manual/custom.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 4,061 | 2017-10-27T23:19:56.000Z | 2022-03-31T23:18:30.000Z | src/datashare/azext_datashare/manual/custom.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 802 | 2017-10-11T17:36:26.000Z | 2022-03-31T22:24:32.000Z | # --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
# pylint: disable=line-too-long
# pylint: disable=too-many-lines
# pylint: disable=unused-argument
from azure.cli.core.util import sdk_no_wait
def datashare_account_list(cmd, client,
resource_group_name=None):
if resource_group_name:
return client.list_by_resource_group(resource_group_name=resource_group_name)
return client.list_by_subscription()
def datashare_account_show(cmd, client,
resource_group_name,
account_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name)
def datashare_account_create(cmd, client,
resource_group_name,
account_name,
identity=None,
location=None,
tags=None,
no_wait=False):
if identity is None:
identity = {'type': 'SystemAssigned'}
return sdk_no_wait(no_wait,
client.begin_create,
resource_group_name=resource_group_name,
account_name=account_name,
location=location,
tags=tags,
identity=identity)
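# Editor's note: `sdk_no_wait(no_wait, client.begin_create, ...)` starts the
# long-running create; when `no_wait` is truthy it asks the SDK not to poll
# for completion (roughly, forwarding `polling=False` to `begin_create`), so
# the CLI returns the unpolled poller immediately. This paraphrases
# azure-cli-core behavior; it is not code from this file.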
def datashare_account_update(cmd, client,
resource_group_name,
account_name,
tags=None):
return client.update(resource_group_name=resource_group_name, account_name=account_name, tags=tags)
def datashare_account_delete(cmd, client,
resource_group_name,
account_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name)
def datashare_consumer_invitation_list(cmd, client):
return client.list_invitation()
def datashare_consumer_invitation_show(cmd, client,
location,
invitation_id):
return client.get(location=location, invitation_id=invitation_id)
def datashare_consumer_invitation_reject_invitation(cmd, client,
location,
invitation_id):
return client.reject_invitation(location=location, invitation_id=invitation_id)
def datashare_data_set_list(cmd, client,
resource_group_name,
account_name,
share_name):
return client.list_by_share(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_data_set_show(cmd, client,
resource_group_name,
account_name,
share_name,
data_set_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, data_set_name=data_set_name)
def datashare_data_set_create(cmd, client,
resource_group_name,
account_name,
share_name,
data_set_name,
data_set):
from azure.cli.core.commands.client_factory import get_subscription_id
if 'resource_group' not in data_set:
data_set['resource_group'] = resource_group_name
if 'subscription_id' not in data_set:
data_set['subscription_id'] = get_subscription_id(cmd.cli_ctx)
return client.create(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
data_set_name=data_set_name,
data_set=data_set)
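# Editor's sketch: a hypothetical `data_set` payload for a blob-backed data
# set; the exact keys depend on the data-set kind and are not confirmed by
# this file. Note how the function above back-fills 'resource_group' and
# 'subscription_id' when the caller omits them:
#
#     data_set = {
#         'kind': 'Blob',
#         'storage_account_name': 'mystorageaccount',
#         'container_name': 'mycontainer',
#         'file_path': 'products.csv',
#     }
#     datashare_data_set_create(cmd, client, 'MyRG', 'MyAccount', 'MyShare',
#                               'Dataset1', data_set)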
def datashare_data_set_delete(cmd, client,
resource_group_name,
account_name,
share_name,
data_set_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
data_set_name=data_set_name)
def datashare_data_set_mapping_list(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.list_by_share_subscription(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
def datashare_data_set_mapping_show(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
data_set_mapping_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name, data_set_mapping_name=data_set_mapping_name)
def datashare_data_set_mapping_create(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
data_set_mapping_name,
data_set_mapping):
from azure.cli.core.commands.client_factory import get_subscription_id
if 'resource_group' not in data_set_mapping:
data_set_mapping['resource_group'] = resource_group_name
if 'subscription_id' not in data_set_mapping:
data_set_mapping['subscription_id'] = get_subscription_id(cmd.cli_ctx)
return client.create(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
data_set_mapping_name=data_set_mapping_name,
data_set_mapping=data_set_mapping)
def datashare_data_set_mapping_delete(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
data_set_mapping_name):
return client.delete(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name, data_set_mapping_name=data_set_mapping_name)
def datashare_invitation_list(cmd, client,
resource_group_name,
account_name,
share_name):
return client.list_by_share(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_invitation_show(cmd, client,
resource_group_name,
account_name,
share_name,
invitation_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, invitation_name=invitation_name)
def datashare_invitation_create(cmd, client,
resource_group_name,
account_name,
share_name,
invitation_name,
target_active_directory_id=None,
target_email=None,
target_object_id=None):
return client.create(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, invitation_name=invitation_name, target_active_directory_id=target_active_directory_id, target_email=target_email, target_object_id=target_object_id)
def datashare_invitation_delete(cmd, client,
resource_group_name,
account_name,
share_name,
invitation_name):
return client.delete(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, invitation_name=invitation_name)
def datashare_share_list(cmd, client,
resource_group_name,
account_name):
return client.list_by_account(resource_group_name=resource_group_name, account_name=account_name)
def datashare_share_show(cmd, client,
resource_group_name,
account_name,
share_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name)
def datashare_share_create(cmd, client,
resource_group_name,
account_name,
share_name,
description=None,
share_kind=None,
terms=None):
return client.create(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, description=description, share_kind=share_kind, terms=terms)
def datashare_share_delete(cmd, client,
resource_group_name,
account_name,
share_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_share_list_synchronization_detail(cmd, client,
resource_group_name,
account_name,
share_name,
synchronization_id=None):
return client.list_synchronization_detail(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
synchronization_id=synchronization_id)
def datashare_share_list_synchronization(cmd, client,
resource_group_name,
account_name,
share_name):
return client.list_synchronization(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_provider_share_subscription_list(cmd, client,
resource_group_name,
account_name,
share_name):
return client.list_by_share(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_provider_share_subscription_show(cmd, client,
resource_group_name,
account_name,
share_name,
provider_share_subscription_id):
return client.get_by_share(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, provider_share_subscription_id=provider_share_subscription_id)
def datashare_provider_share_subscription_revoke(cmd, client,
resource_group_name,
account_name,
share_name,
provider_share_subscription_id,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_revoke,
resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
provider_share_subscription_id=provider_share_subscription_id)
def datashare_provider_share_subscription_reinstate(cmd, client,
resource_group_name,
account_name,
share_name,
provider_share_subscription_id):
return client.reinstate(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, provider_share_subscription_id=provider_share_subscription_id)
def datashare_share_subscription_list(cmd, client,
resource_group_name,
account_name):
return client.list_by_account(resource_group_name=resource_group_name, account_name=account_name)
def datashare_share_subscription_show(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name)
def datashare_share_subscription_create(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
invitation_id,
source_share_location):
return client.create(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name, invitation_id=invitation_id, source_share_location=source_share_location)
def datashare_share_subscription_delete(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
def datashare_share_subscription_list_synchronization_detail(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
synchronization_id):
return client.list_synchronization_detail(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
synchronization_id=synchronization_id)
def datashare_share_subscription_synchronize(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
synchronization_mode=None,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_synchronize,
resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
synchronization_mode=synchronization_mode)
def datashare_share_subscription_cancel_synchronization(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
synchronization_id,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_cancel_synchronization,
resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
synchronization_id=synchronization_id)
def datashare_share_subscription_list_source_share_synchronization_setting(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.list_source_share_synchronization_setting(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
def datashare_share_subscription_list_synchronization(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.list_synchronization(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
def _datashare_share_subscription_get_synchronization(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
synchronization_id):
from knack.util import todict
from azure.cli.core.commands import AzCliCommandInvoker
result = client.list_synchronization(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
result = todict(list(result), AzCliCommandInvoker.remove_additional_prop_layer)
return next((x for x in result if x['synchronizationId'] == synchronization_id), None)
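# Editor's note: the SDK exposes no direct GET for a single synchronization,
# so the helper above lists them all, converts the SDK models to camelCase
# dicts via `todict` (with `remove_additional_prop_layer` stripping the
# additional-properties wrapper, as the CLI does elsewhere), and filters on
# 'synchronizationId' client-side.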
def datashare_consumer_source_data_set_list(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.list_by_share_subscription(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name)
def datashare_synchronization_setting_list(cmd, client,
resource_group_name,
account_name,
share_name):
return client.list_by_share(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name)
def datashare_synchronization_setting_show(cmd, client,
resource_group_name,
account_name,
share_name,
synchronization_setting_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_name=share_name, synchronization_setting_name=synchronization_setting_name)
# def _format_datetime(date_string):
# from dateutil.parser import parse
# try:
# return parse(date_string).strftime("%Y-%m-%dT%H:%M:%SZ")
# except ValueError:
# # logger.debug("Unable to parse date_string '%s'", date_string)
# return date_string or ' '
def datashare_synchronization_setting_create(cmd, client,
resource_group_name,
account_name,
share_name,
synchronization_setting_name,
recurrence_interval,
synchronization_time,
kind=None):
synchronization_setting = {
'synchronizationTime': synchronization_time,
'recurrenceInterval': recurrence_interval,
'kind': kind
}
return client.create(resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
synchronization_setting_name=synchronization_setting_name,
synchronization_setting=synchronization_setting)
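# Editor's sketch: a hypothetical invocation. The 'Day'/'Hour' recurrence
# values and the 'ScheduleBased' kind follow the Data Share scheduled-trigger
# model and are assumptions, not values taken from this file.
#
#     datashare_synchronization_setting_create(
#         cmd, client, 'MyRG', 'MyAccount', 'MyShare', 'DailySync',
#         recurrence_interval='Day',
#         synchronization_time='2020-04-05T10:50:00Z',
#         kind='ScheduleBased')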
def datashare_synchronization_setting_delete(cmd, client,
resource_group_name,
account_name,
share_name,
synchronization_setting_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name,
share_name=share_name,
synchronization_setting_name=synchronization_setting_name)
def datashare_trigger_list(cmd, client,
resource_group_name,
account_name,
share_subscription_name):
return client.list_by_share_subscription(resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name)
def datashare_trigger_show(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
trigger_name):
return client.get(resource_group_name=resource_group_name, account_name=account_name, share_subscription_name=share_subscription_name, trigger_name=trigger_name)
def datashare_trigger_create(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
trigger_name,
recurrence_interval,
synchronization_time,
kind=None,
no_wait=False):
synchronization_setting = {
'synchronizationTime': synchronization_time,
'recurrenceInterval': recurrence_interval,
'kind': kind
}
return sdk_no_wait(no_wait,
client.begin_create,
resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
trigger_name=trigger_name,
trigger=synchronization_setting)
def datashare_trigger_delete(cmd, client,
resource_group_name,
account_name,
share_subscription_name,
trigger_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
account_name=account_name,
share_subscription_name=share_subscription_name,
trigger_name=trigger_name)
| 49.293666 | 265 | 0.522584 | 2,165 | 25,682 | 5.709007 | 0.061432 | 0.15356 | 0.193932 | 0.174757 | 0.866667 | 0.845146 | 0.830987 | 0.804693 | 0.781311 | 0.778641 | 0 | 0 | 0.429172 | 25,682 | 520 | 266 | 49.388462 | 0.843111 | 0.027412 | 0 | 0.662562 | 0 | 0 | 0.009334 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12069 | false | 0 | 0.012315 | 0.103448 | 0.256158 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 |
c76515004e8897acf0fb29e946078fd3d098a35f | 160 | py | Python | frappe/patches/v12_0/setup_email_linking.py | erpnext-tm/frappe | 7b470f28e1cf00b0659c01e06a2d0a4693b28d98 | [
"MIT"
] | null | null | null | frappe/patches/v12_0/setup_email_linking.py | erpnext-tm/frappe | 7b470f28e1cf00b0659c01e06a2d0a4693b28d98 | [
"MIT"
] | null | null | null | frappe/patches/v12_0/setup_email_linking.py | erpnext-tm/frappe | 7b470f28e1cf00b0659c01e06a2d0a4693b28d98 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
from frappe.desk.page.setup_wizard.install_fixtures import setup_email_linking
def execute():
setup_email_linking()
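# Editor's note: Frappe runs a patch module's execute() once per site during
# migration (typically via `bench migrate`), so this patch simply replays the
# email-linking fixture setup.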
| 20 | 78 | 0.85 | 22 | 160 | 5.681818 | 0.727273 | 0.16 | 0.272 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 160 | 7 | 79 | 22.857143 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c77ef079855052dd4e2bc1d8b3bd490d1f490c97 | 64,275 | py | Python | remodet_repository_wdh_part/Projects/DAP_XJX2.bak/mPoseNet_ResidualNet.py | UrwLee/Remo_experience | a59d5b9d6d009524672e415c77d056bc9dd88c72 | [
"MIT"
] | null | null | null | remodet_repository_wdh_part/Projects/DAP_XJX2.bak/mPoseNet_ResidualNet.py | UrwLee/Remo_experience | a59d5b9d6d009524672e415c77d056bc9dd88c72 | [
"MIT"
] | null | null | null | remodet_repository_wdh_part/Projects/DAP_XJX2.bak/mPoseNet_ResidualNet.py | UrwLee/Remo_experience | a59d5b9d6d009524672e415c77d056bc9dd88c72 | [
"MIT"
] | null | null | null | import caffe
from caffe import layers as L
from caffe import params as P
from PyLib.NetLib.MultiScaleLayer import *
from PyLib.NetLib.ConvBNLayer import *
from PyLib.NetLib.PoseNet import *
from PyLib.NetLib.VggNet import *
from mPoseBaseNet import *
from mPoseNet_Reduce import *
from mPoseNet_DarkNet import *
from solverParam_pose import flag_TX2_global
from BaseNet import *
def ResNet_UnitA(net, base_layer, name_prefix, stride, num_channel,bridge = False,num_channel_change = 0,
flag_hasresid = True,channel_scale = 4,check_macc = False,flag_withparamname = False):
add_layer = name_prefix + '_1x1Conv1'
ConvBNUnitLayer(net, base_layer, add_layer, use_bn=True, use_relu=True,
num_output=num_channel/channel_scale, kernel_size=1, pad=0, stride=1, use_scale=True, leaky=False,
check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
from_layer = add_layer
add_layer = name_prefix + '_3x3Conv'
ConvBNUnitLayer(net, from_layer, add_layer, use_bn=True, use_relu=False, leaky=False,
num_output=num_channel / channel_scale, kernel_size=3, pad=1, stride=stride, use_scale=True,
n_group=1,check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
from_layer = add_layer
add_layer = name_prefix + '_1x1Conv2'
if num_channel_change != 0:
num_channel = num_channel_change
ConvBNUnitLayer(net, from_layer, add_layer, use_bn=True, use_relu=False, leaky=False,
num_output=num_channel, kernel_size=1, pad=0, stride=1, use_scale=True,
check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
if flag_hasresid:
from_layer = add_layer
if stride == 2:
feature_layers = []
feature_layers.append(net[from_layer])
add_layer = name_prefix + '_AVEPool'
net[add_layer] = L.Pooling(net[base_layer], pool=P.Pooling.AVE, kernel_size=2, stride=2, pad=0)
feature_layers.append(net[add_layer])
add_layer = name_prefix + '_Concat'
net[add_layer] = L.Concat(*feature_layers, axis=1)
else:
add_layer1 = from_layer
if bridge:
from_layer = base_layer
add_layer = name_prefix + '_bridge'
ConvBNUnitLayer(net, from_layer, add_layer, use_bn=True, use_relu=False, leaky=False,
num_output=num_channel, kernel_size=1, pad=0, stride=1, use_scale=True,check_macc=check_macc,
flag_withparamname = flag_withparamname,pose_string=pose_string)
add_layer2 = add_layer
else:
add_layer2 = base_layer
add_layer = name_prefix + '_Add'
net[add_layer] = L.Eltwise(net[add_layer1], net[add_layer2], eltwise_param=dict(operation=P.Eltwise.SUM))
from_layer = add_layer
add_layer = name_prefix + '_relu'
net[add_layer] = L.ReLU(net[from_layer], in_place=True)
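# Editor's note: ResNet_UnitA above (and ResNetTwoLayers_UnitA below) read the
# module-level global `pose_string`, which is only assigned inside
# ResidualVariant_Base_A further down; set it first (e.g. `pose_string = ''`)
# if you call these units directly, or the ConvBNUnitLayer calls raise a
# NameError.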
def ResNetTwoLayers_UnitA(net, base_layer, name_prefix, stride, num_channel,bridge = False,num_channel_change = 0,
flag_hasresid = True,channel_scale = 4,check_macc = False,lr_mult=0.1,decay_mult=1.0,flag_withparamname=False):
add_layer = name_prefix + '_1x1Conv'
ConvBNUnitLayer(net, base_layer, add_layer, use_bn=True, use_relu=True,lr_mult=lr_mult, decay_mult=decay_mult,
num_output=num_channel/channel_scale, kernel_size=1, pad=0, stride=1, use_scale=True, leaky=False,
check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
from_layer = add_layer+pose_string
add_layer = name_prefix + '_3x3Conv'
if num_channel_change != 0:
num_channel = num_channel_change
ConvBNUnitLayer(net, from_layer, add_layer, use_bn=True, use_relu=False, leaky=False,lr_mult=lr_mult, decay_mult=decay_mult,
num_output=num_channel, kernel_size=3, pad=1, stride=stride, use_scale=True,
n_group=1,check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
# for old_name in net.keys():
# print old_name,'$$$$$'
if flag_hasresid:
from_layer = add_layer+pose_string
if stride == 2:
feature_layers = []
feature_layers.append(net[from_layer])
add_layer = name_prefix + '_AVEPool'+pose_string
net[add_layer] = L.Pooling(net[base_layer], pool=P.Pooling.AVE, kernel_size=2, stride=2, pad=0)
feature_layers.append(net[add_layer])
add_layer = name_prefix + '_Concat'+pose_string
net[add_layer] = L.Concat(*feature_layers, axis=1)
# for old_name in net.keys():
# print old_name,'^^^'
else:
add_layer1 = from_layer
if bridge:
from_layer = base_layer
add_layer = name_prefix + '_bridge'
# for old_name in net.keys():
# print old_name,'!!!'
ConvBNUnitLayer(net, from_layer, add_layer, use_bn=True, use_relu=False, leaky=False,lr_mult=lr_mult, decay_mult=decay_mult,
num_output=num_channel, kernel_size=1, pad=0, stride=1, use_scale=True,check_macc=check_macc,flag_withparamname=flag_withparamname,pose_string=pose_string)
# for old_name in net.keys():
# print old_name,'~~~'
add_layer2 = add_layer+pose_string
else:
add_layer2 = base_layer
add_layer = name_prefix + '_Add'+pose_string
net[add_layer] = L.Eltwise(net[add_layer1], net[add_layer2], eltwise_param=dict(operation=P.Eltwise.SUM))
from_layer = add_layer
add_layer = name_prefix + '_relu'+pose_string
print from_layer, add_layer # debug: trace the residual-unit output naming
net[add_layer] = L.ReLU(net[from_layer], in_place=True)
def resnext_block(bottom, base_output=64, card=32, kernel_size=3):
"""
input: 4*base_output x n x n
output: 4*base_output x n x n
:param bottom: bottom layer
:param base_output: base num_output of branch2
:param card: cardinality, i.e. the group count of the middle grouped convolution
:param kernel_size: kernel size of the middle (grouped) convolution
:return: layers
"""
conv1 = L.Convolution(bottom, num_output=base_output * (card / 16), kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv1_bn = L.BatchNorm(conv1, use_global_stats=False, in_place=True)
conv1_scale = L.Scale(conv1, scale_param=dict(bias_term=True), in_place=True)
conv1_relu = L.ReLU(conv1, in_place=True)
conv2 = L.Convolution(conv1, num_output=base_output * (card / 16), kernel_size=kernel_size, stride=1, pad=(kernel_size - 1)/2, group=card,
bias_term=False, param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'),engine=P.Convolution.CAFFE)
conv2_bn = L.BatchNorm(conv2, use_global_stats=False, in_place=True)
conv2_scale = L.Scale(conv2, scale_param=dict(bias_term=True), in_place=True)
conv2_relu = L.ReLU(conv2, in_place=True)
conv3 = L.Convolution(conv2, num_output=base_output * 4, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv3_bn = L.BatchNorm(conv3, use_global_stats=False, in_place=True)
conv3_scale = L.Scale(conv3, scale_param=dict(bias_term=True), in_place=True)
eltwise = L.Eltwise(bottom, conv3, eltwise_param=dict(operation=1))
eltwise_relu = L.ReLU(eltwise, in_place=True)
return conv1, conv1_bn, conv1_scale, conv1_relu, conv2, conv2_bn, conv2_scale, conv2_relu, \
conv3, conv3_bn, conv3_scale, eltwise, eltwise_relu
def resnet_block(bottom, base_output=64, kernel_size=3):
"""
input: 4*base_output x n x n
output: 4*base_output x n x n
:param bottom: bottom layer
:param base_output: base num_output of branch2
:param kernel_size: kernel size of the middle convolution
:return: layers
"""
conv1 = L.Convolution(bottom, num_output=base_output, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv1_bn = L.BatchNorm(conv1, use_global_stats=False, in_place=True)
conv1_scale = L.Scale(conv1, scale_param=dict(bias_term=True), in_place=True)
conv1_relu = L.ReLU(conv1, in_place=True)
conv2 = L.Convolution(conv1, num_output=base_output , kernel_size=kernel_size, stride=1, pad=(kernel_size - 1)/2,
bias_term=False, param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv2_bn = L.BatchNorm(conv2, use_global_stats=False, in_place=True)
conv2_scale = L.Scale(conv2, scale_param=dict(bias_term=True), in_place=True)
conv2_relu = L.ReLU(conv2, in_place=True)
conv3 = L.Convolution(conv2, num_output=base_output*4, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv3_bn = L.BatchNorm(conv3, use_global_stats=False, in_place=True)
conv3_scale = L.Scale(conv3, scale_param=dict(bias_term=True), in_place=True)
eltwise = L.Eltwise(bottom, conv3, eltwise_param=dict(operation=1))
eltwise_relu = L.ReLU(eltwise, in_place=True)
return conv1, conv1_bn, conv1_scale, conv1_relu, conv2, conv2_bn, conv2_scale, conv2_relu, \
conv3, conv3_bn, conv3_scale, eltwise, eltwise_relu
def match_block(bottom, base_output=64, stride=2, card=32, kernel_size=3):
"""
input: C x n x n
output: 4*base_output x (n/stride) x (n/stride); a 1x1 'match' branch
projects the input so the residual sum is valid
:param bottom: bottom layer
:param base_output: base num_output of branch2
:param stride: stride of the middle convolution (and of the match branch)
:param card: cardinality of the middle grouped convolution
:param kernel_size: kernel size of the middle (grouped) convolution
:return: layers
"""
conv1 = L.Convolution(bottom, num_output=base_output * (card / 16), kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv1_bn = L.BatchNorm(conv1, use_global_stats=False, in_place=True)
conv1_scale = L.Scale(conv1, scale_param=dict(bias_term=True), in_place=True)
conv1_relu = L.ReLU(conv1, in_place=True)
conv2 = L.Convolution(conv1, num_output=base_output * (card / 16), kernel_size=kernel_size, stride=stride, pad=(kernel_size-1)/2, group=card,
bias_term=False, param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'),engine=P.Convolution.CAFFE)
conv2_bn = L.BatchNorm(conv2, use_global_stats=False, in_place=True)
conv2_scale = L.Scale(conv2, scale_param=dict(bias_term=True), in_place=True)
conv2_relu = L.ReLU(conv2, in_place=True)
conv3 = L.Convolution(conv2, num_output=base_output * 4, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv3_bn = L.BatchNorm(conv3, use_global_stats=False, in_place=True)
conv3_scale = L.Scale(conv3, scale_param=dict(bias_term=True), in_place=True)
match = L.Convolution(bottom, num_output=base_output * 4, kernel_size=1, stride=stride, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
match_bn = L.BatchNorm(match, use_global_stats=False, in_place=True)
match_scale = L.Scale(match, scale_param=dict(bias_term=True), in_place=True)
eltwise = L.Eltwise(match, conv3, eltwise_param=dict(operation=1))
eltwise_relu = L.ReLU(eltwise, in_place=True)
return conv1, conv1_bn, conv1_scale, conv1_relu, conv2, conv2_bn, conv2_scale, conv2_relu, \
conv3, conv3_bn, conv3_scale, match, match_bn, match_scale, eltwise, eltwise_relu
def match_block_stage(bottom, base_output=64, stride=2, card=32, kernel_size=3):
"""
Same as match_block but without grouping in the middle convolution; `card`
is only used as the group count of the 1x1 match branch.
:param bottom: bottom layer
:param base_output: base num_output of branch2
:return: layers
"""
conv1 = L.Convolution(bottom, num_output=base_output, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv1_bn = L.BatchNorm(conv1, use_global_stats=False, in_place=True)
conv1_scale = L.Scale(conv1, scale_param=dict(bias_term=True), in_place=True)
conv1_relu = L.ReLU(conv1, in_place=True)
conv2 = L.Convolution(conv1, num_output=base_output, kernel_size=kernel_size, stride=stride, pad=(kernel_size-1)/2,
bias_term=False, param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'),engine=P.Convolution.CAFFE)
conv2_bn = L.BatchNorm(conv2, use_global_stats=False, in_place=True)
conv2_scale = L.Scale(conv2, scale_param=dict(bias_term=True), in_place=True)
conv2_relu = L.ReLU(conv2, in_place=True)
conv3 = L.Convolution(conv2, num_output=base_output * 4, kernel_size=1, stride=1, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
conv3_bn = L.BatchNorm(conv3, use_global_stats=False, in_place=True)
conv3_scale = L.Scale(conv3, scale_param=dict(bias_term=True), in_place=True)
match = L.Convolution(bottom, num_output=base_output * 4, kernel_size=1, stride=stride, pad=0, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'),group=card)
match_bn = L.BatchNorm(match, use_global_stats=False, in_place=True)
match_scale = L.Scale(match, scale_param=dict(bias_term=True), in_place=True)
eltwise = L.Eltwise(match, conv3, eltwise_param=dict(operation=1))
eltwise_relu = L.ReLU(eltwise, in_place=True)
return conv1, conv1_bn, conv1_scale, conv1_relu, conv2, conv2_bn, conv2_scale, conv2_relu, \
conv3, conv3_bn, conv3_scale, match, match_bn, match_scale, eltwise, eltwise_relu
def conv_bn_scale_relu(bottom, num_output=64, kernel_size=3, stride=1, pad=0):
conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size, stride=stride, pad=pad,
param=[dict(lr_mult=1, decay_mult=1)],
weight_filler=dict(type='xavier', std=0.01),
bias_term = False)
conv_bn = L.BatchNorm(conv, use_global_stats=False, in_place=True)
conv_scale = L.Scale(conv, scale_param=dict(bias_term=True), in_place=True)
conv_relu = L.ReLU(conv, in_place=True)
return conv, conv_bn, conv_scale, conv_relu
def conv_bn_scale(bottom, num_output=64, kernel_size=3, stride=1, pad=0):
conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size, stride=stride, pad=pad,
param=[dict(lr_mult=1, decay_mult=1)],
weight_filler=dict(type='xavier', std=0.01),
bias_term=False)
conv_bn = L.BatchNorm(conv, use_global_stats=False, in_place=True)
conv_scale = L.Scale(conv, scale_param=dict(bias_term=True), in_place=True)
return conv, conv_bn, conv_scale
def eltwize_relu(bottom1, bottom2):
residual_eltwise = L.Eltwise(bottom1, bottom2, eltwise_param=dict(operation=1))
residual_eltwise_relu = L.ReLU(residual_eltwise, in_place=True)
return residual_eltwise, residual_eltwise_relu
def residual_branch(bottom, base_output=64):
"""
input:4*base_output x n x n
output:4*base_output x n x n
:param base_output: base num_output of branch2
:param bottom: bottom layer
:return: layers
"""
branch2a, branch2a_bn, branch2a_scale, branch2a_relu = \
conv_bn_scale_relu(bottom, num_output=base_output, kernel_size=1) # base_output x n x n
branch2b, branch2b_bn, branch2b_scale, branch2b_relu = \
conv_bn_scale_relu(branch2a, num_output=base_output, kernel_size=3, pad=1) # base_output x n x n
branch2c, branch2c_bn, branch2c_scale = \
conv_bn_scale(branch2b, num_output=4 * base_output, kernel_size=1) # 4*base_output x n x n
residual, residual_relu = \
eltwize_relu(bottom, branch2c) # 4*base_output x n x n
return branch2a, branch2a_bn, branch2a_scale, branch2a_relu, branch2b, branch2b_bn, branch2b_scale, branch2b_relu, \
branch2c, branch2c_bn, branch2c_scale, residual, residual_relu
def residual_branch_shortcut(bottom, stride=2, base_output=64):
"""
:param stride: stride
:param base_output: base num_output of branch2
:param bottom: bottom layer
:return: layers
"""
branch1, branch1_bn, branch1_scale = \
conv_bn_scale(bottom, num_output=4 * base_output, kernel_size=1, stride=stride)
branch2a, branch2a_bn, branch2a_scale, branch2a_relu = \
conv_bn_scale_relu(bottom, num_output=base_output, kernel_size=1, stride=stride)
branch2b, branch2b_bn, branch2b_scale, branch2b_relu = \
conv_bn_scale_relu(branch2a, num_output=base_output, kernel_size=3, pad=1)
branch2c, branch2c_bn, branch2c_scale = \
conv_bn_scale(branch2b, num_output=4 * base_output, kernel_size=1)
residual, residual_relu = \
eltwize_relu(branch1, branch2c) # 4*base_output x n x n
return branch1, branch1_bn, branch1_scale, branch2a, branch2a_bn, branch2a_scale, branch2a_relu, branch2b, \
branch2b_bn, branch2b_scale, branch2b_relu, branch2c, branch2c_bn, branch2c_scale, residual, residual_relu
branch_shortcut_string = 'net.res(stage)a_branch1, net.res(stage)a_branch1_bn, net.res(stage)a_branch1_scale, \
net.res(stage)a_branch2a, net.res(stage)a_branch2a_bn, net.res(stage)a_branch2a_scale, net.res(stage)a_branch2a_relu, \
net.res(stage)a_branch2b, net.res(stage)a_branch2b_bn, net.res(stage)a_branch2b_scale, net.res(stage)a_branch2b_relu, \
net.res(stage)a_branch2c, net.res(stage)a_branch2c_bn, net.res(stage)a_branch2c_scale, net.res(stage)a, net.res(stage)a_relu = \
residual_branch_shortcut((bottom), stride=(stride), base_output=(num))'
branch_string = 'net.res(stage)b(order)_branch2a, net.res(stage)b(order)_branch2a_bn, net.res(stage)b(order)_branch2a_scale, \
net.res(stage)b(order)_branch2a_relu, net.res(stage)b(order)_branch2b, net.res(stage)b(order)_branch2b_bn, \
net.res(stage)b(order)_branch2b_scale, net.res(stage)b(order)_branch2b_relu, net.res(stage)b(order)_branch2c, \
net.res(stage)b(order)_branch2c_bn, net.res(stage)b(order)_branch2c_scale, net.res(stage)b(order), net.res(stage)b(order)_relu = \
residual_branch((bottom), base_output=(num))'
resnext_string = 'net.resx(n)_conv1, net.resx(n)_conv1_bn, net.resx(n)_conv1_scale, net.resx(n)_conv1_relu, \
net.resx(n)_conv2, net.resx(n)_conv2_bn, net.resx(n)_conv2_scale, net.resx(n)_conv2_relu, net.resx(n)_conv3, \
net.resx(n)_conv3_bn, net.resx(n)_conv3_scale, net.resx(n)_elewise, net.resx(n)_elewise_relu = \
resnext_block((bottom), base_output=(base), card=(c), kernel_size=(k))'
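# Editor's note: these template strings are expanded with str.replace and run
# via exec. For example, resnext_string with (bottom)='net.pool1', (base)=64,
# (n)=1, (c)=32, (k)=3 executes as (abbreviated):
#
#     net.resx1_conv1, net.resx1_conv1_bn, ..., net.resx1_elewise_relu = \
#         resnext_block(net.pool1, base_output=64, card=32, kernel_size=3)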
resnext_string_stage = 'net.stage#(n)_(m)_conv1, net.stage#(n)_(m)_conv1_bn, net.stage#(n)_(m)_conv1_scale, net.stage#(n)_(m)_conv1_relu, \
net.stage#(n)_(m)_conv2, net.stage#(n)_(m)_conv2_bn, net.stage#(n)_(m)_conv2_scale, net.stage#(n)_(m)_conv2_relu, net.stage#(n)_(m)_conv3, \
net.stage#(n)_(m)_conv3_bn, net.stage#(n)_(m)_conv3_scale, net.stage#(n)_(m)_elewise, net.stage#(n)_(m)_elewise_relu = \
resnext_block((bottom), base_output=(base), card=(c), kernel_size=(k))'
resnet_string_stage = 'net.stage#(n)_(m)_conv1, net.stage#(n)_(m)_conv1_bn, net.stage#(n)_(m)_conv1_scale, net.stage#(n)_(m)_conv1_relu, \
net.stage#(n)_(m)_conv2, net.stage#(n)_(m)_conv2_bn, net.stage#(n)_(m)_conv2_scale, net.stage#(n)_(m)_conv2_relu, net.stage#(n)_(m)_conv3, \
net.stage#(n)_(m)_conv3_bn, net.stage#(n)_(m)_conv3_scale, net.stage#(n)_(m)_elewise, net.stage#(n)_(m)_elewise_relu = \
resnet_block((bottom), base_output=(base), kernel_size=(k))'
match_string = 'net.resx(n)_conv1, net.resx(n)_conv1_bn, net.resx(n)_conv1_scale, net.resx(n)_conv1_relu, \
net.resx(n)_conv2, net.resx(n)_conv2_bn, net.resx(n)_conv2_scale, net.resx(n)_conv2_relu, net.resx(n)_conv3, \
net.resx(n)_conv3_bn, net.resx(n)_conv3_scale, net.resx(n)_match_conv, net.resx(n)_match_conv_bn, net.resx(n)_match_conv_scale,\
net.resx(n)_elewise, net.resx(n)_elewise_relu = match_block((bottom), base_output=(base), stride=(s), card=(c), kernel_size=(k))'
match_string_stage = 'net.stage#(n)_(m)_conv1, net.stage#(n)_(m)_conv1_bn, net.stage#(n)_(m)_conv1_scale, net.stage#(n)_(m)_conv1_relu, \
net.stage#(n)_(m)_conv2, net.stage#(n)_(m)_conv2_bn, net.stage#(n)_(m)_conv2_scale, net.stage#(n)_(m)_conv2_relu, net.stage#(n)_(m)_conv3, \
net.stage#(n)_(m)_conv3_bn, net.stage#(n)_(m)_conv3_scale, net.stage#(n)_(m)_match_conv, net.stage#(n)_(m)_match_conv_bn, net.stage#(n)_(m)_match_conv_scale,\
net.stage#(n)_(m)_elewise, net.stage#(n)_(m)_elewise_relu = match_block_stage((bottom), base_output=(base), stride=(s), card=1, kernel_size=(k))'
def ResNeXt_layers(net, from_layer, card=32, stages=(3, 4, 6, 3)):
"""
:param net: the NetSpec being built
:param from_layer: name of the input blob
:param card: cardinality of the grouped convolutions
:param stages: the num of layers = 2 + 3*sum(stages); layers would better be chosen from [50, 101, 152]
{every stage is composed of 1 match_block module and stages[i]-1 resnext_block
modules, each module consists of 3 conv layers}
(3, 4, 6, 3) for 50 layers; (3, 4, 23, 3) for 101 layers; (3, 8, 36, 3) for 152 layers
"""
net.conv1 = L.Convolution(net[from_layer], num_output=64, kernel_size=7, stride=2, pad=3, bias_term=False,
param=[dict(lr_mult=1, decay_mult=1)], weight_filler=dict(type='xavier'))
net.conv1_bn = L.BatchNorm(net.conv1, use_global_stats=False, in_place=True)
net.conv1_scale = L.Scale(net.conv1, scale_param=dict(bias_term=True), in_place=True)
net.conv1_relu = L.ReLU(net.conv1, in_place=True) # 64x112x112
net.pool1 = L.Pooling(net.conv1, kernel_size=3, stride=2, pad=0, pool=P.Pooling.MAX) # 64x56x56
for num in xrange(len(stages)): # num = 0, 1, 2, 3
for i in xrange(stages[num]):
if i == 0:
stage_string = match_string
bottom_string = ['net.pool1', 'net.resx{}_elewise'.format(str(sum(stages[:1]))),
'net.resx{}_elewise'.format(str(sum(stages[:2]))),
'net.resx{}_elewise'.format(str(sum(stages[:3])))][num]
else:
stage_string = resnext_string
bottom_string = 'net.resx{}_elewise'.format(str(sum(stages[:num]) + i))
print num, i
exec (stage_string.replace('(bottom)', bottom_string).
replace('(base)', str(2 ** num * 64)).
replace('(n)', str(sum(stages[:num]) + i + 1)).
replace('(s)', str(int(num > 0) + 1)).
replace('(c)', str(card)).
replace('(k)', str(3)))
return net
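# Editor's sketch: building the ResNeXt backbone standalone. The input shape
# is an assumption for illustration; the rest uses only what this file
# already imports.
#
#     net = caffe.NetSpec()
#     net.data = L.Input(input_param=dict(shape=dict(dim=[1, 3, 224, 224])))
#     net = ResNeXt_layers(net, 'data', card=32, stages=(3, 4, 6, 3))
#     print net.to_proto()  # Python 2, matching this file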
def ResNet_layers(net, from_layer, stages=(3, 4, 6, 3)):
"""
:param net: the NetSpec being built
:param from_layer: name of the input blob
:param stages: the num of layers = 2 + 3*sum(stages); layers would better be chosen from [50, 101, 152]
{every stage is composed of 1 residual_branch_shortcut module and stages[i]-1 residual_branch
modules, each module consists of 3 conv layers}
(3, 4, 6, 3) for 50 layers; (3, 4, 23, 3) for 101 layers; (3, 8, 36, 3) for 152 layers
"""
net.conv1, net.conv1_bn, net.conv1_scale, net.conv1_relu = \
conv_bn_scale_relu(net[from_layer], num_output=64, kernel_size=7, stride=2, pad=3) # 64x112x112
net.pool1 = L.Pooling(net.conv1, kernel_size=3, stride=2, pool=P.Pooling.MAX) # 64x56x56
for num in xrange(len(stages)): # num = 0, 1, 2, 3
for i in xrange(stages[num]):
if i == 0:
stage_string = branch_shortcut_string
bottom_string = ['net.pool1', 'net.res2b%s' % str(stages[0] - 1), 'net.res3b%s' % str(stages[1] - 1),
'net.res4b%s' % str(stages[2] - 1)][num]
else:
stage_string = branch_string
if i == 1:
bottom_string = 'net.res%sa' % str(num + 2)
else:
bottom_string = 'net.res%sb%s' % (str(num + 2), str(i - 1))
exec (stage_string.replace('(stage)', str(num + 2)).replace('(bottom)', bottom_string).
replace('(num)', str(2 ** num * 64)).replace('(order)', str(i)).
replace('(stride)', str(int(num > 0) + 1)))
return net
def mPoseNet_ResNeXt_MultiStages_Train(net, data_layer="data", label_layer="label", train=True, lr = 1, decay = 1,**pose_test_kwargs):
kwargs = {'param': [dict(lr_mult=lr, decay_mult=decay), dict(lr_mult=2 * lr, decay_mult=0)],
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0)}
# input
if train:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp = \
L.Slice(net[label_layer], ntop=4, slice_param=dict(slice_point=[34,52,86], axis=1))
else:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp, net.gt = \
L.Slice(net[label_layer], ntop=5, slice_param=dict(slice_point=[34,52,86,104], axis=1))
# label
net.vec_label = L.Eltwise(net.vec_mask, net.vec_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_label = L.Eltwise(net.heat_mask, net.heat_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
stages = (3, 4, 8)
net = ResNeXt_layers(net, from_layer=data_layer, card=32, stages=stages)
from_layer = 'resx{}_elewise'.format(str(sum(stages)))
add_layer = 'upsample'
net[add_layer] = L.Reorg(net[from_layer], reorg_param=dict(up_down=P.Reorg.UP))
base_layer = add_layer
bottom_string = 'net.{}'.format(base_layer)
use_stages = 4
use_sub_layers = 5
num_output = 32
kernel_size = 5
for i_stage in xrange(use_stages):
for i_sub in xrange(use_sub_layers):
for str_i in ['vec', 'heat']:
if i_sub == 0:
stage_string = match_string_stage.replace('#', str_i)
else:
stage_string = resnext_string_stage.replace('#', str_i)
if i_sub != 0:
bottom_string = 'net.stage#(n)_(m)_elewise'.\
replace('#', str_i).replace('(n)',str(i_stage + 1)).\
replace('(m)', str(i_sub))
exec (stage_string.replace('(bottom)', bottom_string).
replace('(base)', str(num_output)).
replace('(n)', str(i_stage+1)).
replace('(m)', str(i_sub+1)).
replace('(s)', str(1)).
replace('(c)', str(32)).
replace('(k)', str(kernel_size)))
from1_layer = 'stage#(n)_(m)_elewise'.replace('#', 'vec').replace('(n)', str(i_stage + 1)).replace(
'(m)', str(use_sub_layers))
conv_vec = "stage{}_conv{}_vec".format(i_stage + 1, use_sub_layers + 1)
net[conv_vec] = L.Convolution(net[from1_layer], num_output=34, pad=1, kernel_size=3, **kwargs)
from2_layer = 'stage#(n)_(m)_elewise'.replace('#', 'heat').replace('(n)', str(i_stage + 1)).replace(
'(m)', str(use_sub_layers))
conv_heat = "stage{}_conv{}_heat".format(i_stage + 1, use_sub_layers + 1)
net[conv_heat] = L.Convolution(net[from2_layer], num_output=18, pad=1, kernel_size=3, **kwargs)
weight_vec = "weight_stage{}_vec".format(i_stage+ 1)
weight_heat = "weight_stage{}_heat".format(i_stage+1)
loss_vec = "loss_stage{}_vec".format(i_stage+1)
loss_heat = "loss_stage{}_heat".format(i_stage+1)
net[weight_vec] = L.Eltwise(net[conv_vec], net.vec_mask, eltwise_param=dict(operation=P.Eltwise.PROD))
net[loss_vec] = L.EuclideanLoss(net[weight_vec], net.vec_label, loss_weight=1)
net[weight_heat] = L.Eltwise(net[conv_heat], net.heat_mask, eltwise_param=dict(operation=P.Eltwise.PROD))
net[loss_heat] = L.EuclideanLoss(net[weight_heat], net.heat_label, loss_weight=1)
if i_stage != use_stages - 1:
out_layer = 'concat_stage{}'.format(str(i_stage + 1))
fea_layers = []
fea_layers.append(net[conv_vec])
fea_layers.append(net[conv_heat])
assert base_layer in net.keys()
fea_layers.append(net[base_layer])
net[out_layer] = L.Concat(*fea_layers, axis=1)
bottom_string = 'net.{}'.format(out_layer)
if not train:
print(net.keys())
conv_vec = "stage{}_conv{}_vec".format(use_stages,use_sub_layers + 1)
conv_heat = "stage{}_conv{}_heat".format(use_stages,use_sub_layers + 1)
net.vec_out = L.Eltwise(net.vec_mask, net[conv_vec], eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_out = L.Eltwise(net.heat_mask, net[conv_heat], eltwise_param=dict(operation=P.Eltwise.PROD))
feaLayers = []
feaLayers.append(net.heat_out)
feaLayers.append(net.vec_out)
outlayer = "concat_stage{}".format(use_stages)
net[outlayer] = L.Concat(*feaLayers, axis=1)
# Resize
resize_kwargs = {
'factor': pose_test_kwargs.get("resize_factor", 8),
'scale_gap': pose_test_kwargs.get("resize_scale_gap", 0.3),
'start_scale': pose_test_kwargs.get("resize_start_scale", 1.0),
}
net.resized_map = L.ImResize(net[outlayer], name="resize", imresize_param=resize_kwargs)
# Nms
nms_kwargs = {
'threshold': pose_test_kwargs.get("nms_threshold", 0.05),
'max_peaks': pose_test_kwargs.get("nms_max_peaks", 100),
'num_parts': pose_test_kwargs.get("nms_num_parts", 18),
}
net.joints = L.Nms(net.resized_map, name="nms", nms_param=nms_kwargs)
# ConnectLimbs
connect_kwargs = {
'is_type_coco': pose_test_kwargs.get("conn_is_type_coco", True),
'max_person': pose_test_kwargs.get("conn_max_person", 10),
'max_peaks_use': pose_test_kwargs.get("conn_max_peaks_use", 20),
'iters_pa_cal': pose_test_kwargs.get("conn_iters_pa_cal", 10),
'connect_inter_threshold': pose_test_kwargs.get("conn_connect_inter_threshold", 0.05),
'connect_inter_min_nums': pose_test_kwargs.get("conn_connect_inter_min_nums", 8),
'connect_min_subset_cnt': pose_test_kwargs.get("conn_connect_min_subset_cnt", 3),
'connect_min_subset_score': pose_test_kwargs.get("conn_connect_min_subset_score", 0.4),
}
net.limbs = L.Connectlimb(net.resized_map, net.joints, connect_limb_param=connect_kwargs)
# Eval
eval_kwargs = {
'stride': 8,
'area_thre': pose_test_kwargs.get("eval_area_thre", 64*64),
'oks_thre': pose_test_kwargs.get("eval_oks_thre", [0.5,0.55,0.6,0.65,0.7,0.75,0.8,0.85,0.9]),
}
net.eval = L.PoseEval(net.limbs, net.gt, pose_eval_param=eval_kwargs)
return net
def mPoseNet_ResNet_MultiStages_Train(net, data_layer="data", label_layer="label", train=True, lr = 1, decay = 1,**pose_test_kwargs):
kwargs = {'param': [dict(lr_mult=lr, decay_mult=decay), dict(lr_mult=2 * lr, decay_mult=0)],
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0)}
# input
if train:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp = \
L.Slice(net[label_layer], ntop=4, slice_param=dict(slice_point=[34,52,86], axis=1))
else:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp, net.gt = \
L.Slice(net[label_layer], ntop=5, slice_param=dict(slice_point=[34,52,86,104], axis=1))
# label
net.vec_label = L.Eltwise(net.vec_mask, net.vec_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_label = L.Eltwise(net.heat_mask, net.heat_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
stages = (3, 4, 6)
net = ResNet_layers(net, from_layer=data_layer, stages=stages)
from_layer = 'res%sb%s' % (str(len(stages) + 1), str(stages[-1] - 1))
add_layer = 'upsample'
net[add_layer] = L.Reorg(net[from_layer], reorg_param=dict(up_down=P.Reorg.UP))
base_layer = add_layer
bottom_string = 'net.{}'.format(base_layer)
use_stages = 4
use_sub_layers = 5
num_output = 32
kernel_size = 5
for i_stage in xrange(use_stages):
for i_sub in xrange(use_sub_layers):
for str_i in ['vec', 'heat']:
print i_stage, i_sub, str_i
if i_sub == 0:
stage_string = match_string_stage.replace('#', str_i)
else:
stage_string = resnet_string_stage.replace('#', str_i)
if i_sub != 0:
bottom_string = 'net.stage#(n)_(m)_elewise'.\
replace('#', str_i).replace('(n)',str(i_stage + 1)).\
replace('(m)', str(i_sub))
exec (stage_string.replace('(bottom)', bottom_string).
replace('(base)', str(num_output)).
replace('(n)', str(i_stage+1)).
replace('(m)', str(i_sub+1)).
replace('(s)', str(1)).
replace('(k)', str(kernel_size)))
from1_layer = 'stage#(n)_(m)_elewise'.replace('#', 'vec').replace('(n)', str(i_stage + 1)).replace(
'(m)', str(use_sub_layers))
conv_vec = "stage{}_conv{}_vec".format(i_stage + 1, use_sub_layers + 1)
net[conv_vec] = L.Convolution(net[from1_layer], num_output=34, pad=1, kernel_size=3, **kwargs)
from2_layer = 'stage#(n)_(m)_elewise'.replace('#', 'heat').replace('(n)', str(i_stage + 1)).replace(
'(m)', str(use_sub_layers))
conv_heat = "stage{}_conv{}_heat".format(i_stage + 1, use_sub_layers + 1)
net[conv_heat] = L.Convolution(net[from2_layer], num_output=18, pad=1, kernel_size=3, **kwargs)
weight_vec = "weight_stage{}_vec".format(i_stage+ 1)
weight_heat = "weight_stage{}_heat".format(i_stage+1)
loss_vec = "loss_stage{}_vec".format(i_stage+1)
loss_heat = "loss_stage{}_heat".format(i_stage+1)
net[weight_vec] = L.Eltwise(net[conv_vec], net.vec_mask, eltwise_param=dict(operation=P.Eltwise.PROD))
net[loss_vec] = L.EuclideanLoss(net[weight_vec], net.vec_label, loss_weight=1)
net[weight_heat] = L.Eltwise(net[conv_heat], net.heat_mask, eltwise_param=dict(operation=P.Eltwise.PROD))
net[loss_heat] = L.EuclideanLoss(net[weight_heat], net.heat_label, loss_weight=1)
if i_stage != use_stages - 1:
out_layer = 'concat_stage{}'.format(str(i_stage + 1))
fea_layers = []
fea_layers.append(net[conv_vec])
fea_layers.append(net[conv_heat])
assert base_layer in net.keys()
fea_layers.append(net[base_layer])
net[out_layer] = L.Concat(*fea_layers, axis=1)
bottom_string = 'net.{}'.format(out_layer)
for key in net.keys(): # debug: dump the layer names built so far
print key
if not train:
print(net.keys())
conv_vec = "stage{}_conv{}_vec".format(use_stages,use_sub_layers + 1)
conv_heat = "stage{}_conv{}_heat".format(use_stages,use_sub_layers + 1)
net.vec_out = L.Eltwise(net.vec_mask, net[conv_vec], eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_out = L.Eltwise(net.heat_mask, net[conv_heat], eltwise_param=dict(operation=P.Eltwise.PROD))
feaLayers = []
feaLayers.append(net.heat_out)
feaLayers.append(net.vec_out)
outlayer = "concat_stage{}".format(use_stages)
net[outlayer] = L.Concat(*feaLayers, axis=1)
# Resize
resize_kwargs = {
'factor': pose_test_kwargs.get("resize_factor", 8),
'scale_gap': pose_test_kwargs.get("resize_scale_gap", 0.3),
'start_scale': pose_test_kwargs.get("resize_start_scale", 1.0),
}
net.resized_map = L.ImResize(net[outlayer], name="resize", imresize_param=resize_kwargs)
# Nms
nms_kwargs = {
'threshold': pose_test_kwargs.get("nms_threshold", 0.05),
'max_peaks': pose_test_kwargs.get("nms_max_peaks", 100),
'num_parts': pose_test_kwargs.get("nms_num_parts", 18),
}
net.joints = L.Nms(net.resized_map, name="nms", nms_param=nms_kwargs)
# ConnectLimbs
connect_kwargs = {
'is_type_coco': pose_test_kwargs.get("conn_is_type_coco", True),
'max_person': pose_test_kwargs.get("conn_max_person", 10),
'max_peaks_use': pose_test_kwargs.get("conn_max_peaks_use", 20),
'iters_pa_cal': pose_test_kwargs.get("conn_iters_pa_cal", 10),
'connect_inter_threshold': pose_test_kwargs.get("conn_connect_inter_threshold", 0.05),
'connect_inter_min_nums': pose_test_kwargs.get("conn_connect_inter_min_nums", 8),
'connect_min_subset_cnt': pose_test_kwargs.get("conn_connect_min_subset_cnt", 3),
'connect_min_subset_score': pose_test_kwargs.get("conn_connect_min_subset_score", 0.4),
}
net.limbs = L.Connectlimb(net.resized_map, net.joints, connect_limb_param=connect_kwargs)
# Eval
eval_kwargs = {
'stride': 8,
'area_thre': pose_test_kwargs.get("eval_area_thre", 64*64),
'oks_thre': pose_test_kwargs.get("eval_oks_thre", [0.5,0.55,0.6,0.65,0.7,0.75,0.8,0.85,0.9]),
}
net.eval = L.PoseEval(net.limbs, net.gt, pose_eval_param=eval_kwargs)
return net
def ResidualReduce_Base_A(net, data_layer="data",use_sub_layers = (2, 6, 7),num_channels = (128, 144, 288),output_channels = (0, 0, 0, 0),
channel_scale = 3,num_channel_deconv = (128,128),lr=1,decay=1,add_strs=""):
num_output = 32
out_layer = 'conv1' + add_strs
ConvBNUnitLayer(net, data_layer, out_layer, use_bn=True, use_relu=True,num_output=num_output,
kernel_size=7, pad=3, stride=4, use_scale=True, leaky=False, lr_mult=lr,decay_mult=decay,pose_string=pose_string)
from_layer = out_layer
out_layer = 'pool1' + add_strs
net[out_layer] = L.Pooling(net[from_layer], pool=P.Pooling.AVE, kernel_size=3, stride=2, pad=0)
num_output = 64
kernel_size = 3
out_layer = "conv2_1" + add_strs
ConvBNUnitLayer(net, from_layer, out_layer, use_bn=True, use_relu=True,
num_output=num_output, kernel_size=kernel_size, pad=(kernel_size - 1) / 2, stride=2, use_scale=True,
leaky=False, lr_mult=lr, decay_mult=decay,pose_string=pose_string)
from_layer = out_layer
feat_layers = []
feat_layers.append(net["pool1" + add_strs])
feat_layers.append(net[from_layer])
out_layer = "conv2_1_concat" + add_strs
net[out_layer] = L.Concat(*feat_layers, axis=1)
for sublayer in xrange(use_sub_layers[0]):
base_layer = out_layer
name_prefix = 'conv2_{}'.format(sublayer + 2) + add_strs
ResNet_UnitA(net, base_layer, name_prefix, 1, num_channels[0], bridge=True,
num_channel_change=0, flag_hasresid=True,channel_scale=channel_scale)
out_layer = name_prefix + '_relu'
for layer in xrange(1, len(use_sub_layers)):
num_output_layer = num_channels[layer]
output_channel_layer = output_channels[layer]
for sublayer in xrange(use_sub_layers[layer]):
base_layer = out_layer
name_prefix = 'conv{}_{}'.format(layer + 2, sublayer + 1) + add_strs
if sublayer == 0:
stride = 2
else:
stride = 1
if sublayer == 1:
bridge = True
else:
bridge = False
if not output_channel_layer == 0 and sublayer == use_sub_layers[layer] - 1:
num_channel_change = output_channel_layer
bridge = True
else:
num_channel_change = 0
ResNet_UnitA(net, base_layer, name_prefix, stride, num_output_layer,bridge = bridge,
num_channel_change = num_channel_change,flag_hasresid = True,channel_scale=channel_scale)
out_layer = name_prefix + '_relu'
bn_kwargs = {
'param': [dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0)],
'eps': 0.001,
}
sb_kwargs = {
'bias_term': True,
'param': [dict(lr_mult=1, decay_mult=0), dict(lr_mult=1, decay_mult=0)],
'filler': dict(type='constant', value=1.0),
'bias_filler': dict(type='constant', value=0.2),
}
if len(num_channel_deconv) == 2:
deconv_param = {
'num_output': num_channel_deconv[0],
'kernel_size': 2,
'pad': 0,
'stride': 2,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0),
'group': 1,
}
kwargs_deconv = {
'param': [dict(lr_mult=1, decay_mult=1)],
'convolution_param': deconv_param
}
from_layer = "conv3_6{}_Add".format(add_strs)
add_layer = from_layer + "_deconv"
net[add_layer] = L.Deconvolution(net[from_layer], **kwargs_deconv)
bn_name = add_layer + '_bn'
net[bn_name] = L.BatchNorm(net[add_layer], in_place=True, **bn_kwargs)
sb_name = add_layer + '_scale'
net[sb_name] = L.Scale(net[add_layer], in_place=True, **sb_kwargs)
relu_name = add_layer + '_relu'
net[relu_name] = L.ReLU(net[add_layer], in_place=True)
deconv_param1 = {
'num_output': num_channel_deconv[-1],
'kernel_size': 4,
'pad': 0,
'stride': 4,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0),
'group': 1,
}
kwargs_deconv1 = {
'param': [dict(lr_mult=1, decay_mult=1)],
'convolution_param': deconv_param1
}
from_layer = "conv4_7{}_Add".format(add_strs)
add_layer = from_layer + "_deconv"
net[add_layer] = L.Deconvolution(net[from_layer], **kwargs_deconv1)
bn_name = add_layer + '_bn'
net[bn_name] = L.BatchNorm(net[add_layer], in_place=True, **bn_kwargs)
sb_name = add_layer + '_scale'
net[sb_name] = L.Scale(net[add_layer], in_place=True, **sb_kwargs)
relu_name = add_layer + '_relu'
net[relu_name] = L.ReLU(net[add_layer], in_place=True)
return net
def ResidualShuffleVariant_Base_A(net, data_layer="data",use_sub_layers = (2, 6, 7),num_channels = (128, 144, 288),output_channels = (0, 256,128),
channel_scale = 4,num_channel_deconv = 128,lr=1,decay=1,flag_deconvwithrelu = True,add_strs=""):
out_layer = 'conv1' + add_strs
ConvBNUnitLayer(net, data_layer, out_layer, use_bn=True, use_relu=True,
num_output=32, kernel_size=3, pad=1, stride=2, use_scale=True, leaky=False, lr_mult=lr,
decay_mult=decay,pose_string=pose_string)
from_layer = out_layer
out_layer = 'pool1' + add_strs
net[out_layer] = L.Pooling(net[from_layer], pool=P.Pooling.MAX, kernel_size=3, stride=2, pad=0)
for layer in xrange(0, len(use_sub_layers)):
num_channel_layer = num_channels[layer]
output_channel_layer = output_channels[layer]
for sublayer in xrange(use_sub_layers[layer]):
base_layer = out_layer
name_prefix = 'conv{}_{}'.format(layer + 2, sublayer + 1) + add_strs
if sublayer == 0:
stride = 2
else:
stride = 1
if sublayer == 1:
bridge = True
else:
bridge = False
if not output_channel_layer == 0 and sublayer == use_sub_layers[layer] - 1:
num_channel_change = output_channel_layer
bridge = True
else:
num_channel_change = 0
ResNet_UnitA(net, base_layer, name_prefix, stride, num_channel_layer, bridge=bridge, num_channel_change=num_channel_change,
flag_hasresid=True, channel_scale=channel_scale, check_macc=False)
out_layer = name_prefix + '_relu'
bn_kwargs = {
'param': [dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0)],
'eps': 0.001,
}
sb_kwargs = {
'bias_term': True,
'param': [dict(lr_mult=1, decay_mult=0), dict(lr_mult=1, decay_mult=0)],
'filler': dict(type='constant', value=1.0),
'bias_filler': dict(type='constant', value=0.2),
}
deconv_param = {
'num_output': num_channel_deconv,
'kernel_size': 2,
'pad': 0,
'stride': 2,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0),
'group': 1,
}
kwargs_deconv = {
'param': [dict(lr_mult=1, decay_mult=1)],
'convolution_param': deconv_param
}
from_layer = "conv3_{}{}_Add".format(use_sub_layers[-1],add_strs)
add_layer = from_layer + "_deconv"
net[add_layer] = L.Deconvolution(net[from_layer], **kwargs_deconv)
if flag_deconvwithrelu:
bn_name = add_layer + '_bn'
net[bn_name] = L.BatchNorm(net[add_layer], in_place=True, **bn_kwargs)
sb_name = add_layer + '_scale'
net[sb_name] = L.Scale(net[add_layer], in_place=True, **sb_kwargs)
relu_name = add_layer + '_relu'
net[relu_name] = L.ReLU(net[add_layer], in_place=True)
return net
def ResidualVariant_Base_A(net, data_layer="data",use_sub_layers = (2, 6, 7),num_channels = (128, 144, 288),output_channels = (0, 256,128),
channel_scale = 4,num_channel_deconv = 128,lr=0.1,decay=1.0,flag_deconvwithrelu = True,add_strs="",flag_withparamname=False):
# switch the pose-branch layer names to the '_pose' suffix
global pose_string
pose_string = '_pose'
net = ResidualVariant_Base_A_base(net, data_layer=data_layer, use_sub_layers=use_sub_layers, num_channels=num_channels,
output_channels=output_channels,channel_scale=channel_scale,lr=lr, decay=1, add_strs=add_strs,flag_withparamname=flag_withparamname,pose_string=pose_string)
bn_kwargs = {
'param': [dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0)],
'eps': 0.001,
}
sb_kwargs = {
'bias_term': True,
'param': [dict(lr_mult=1, decay_mult=0), dict(lr_mult=1, decay_mult=0)],
'filler': dict(type='constant', value=1.0),
'bias_filler': dict(type='constant', value=0.2),
}
deconv_param = {
'num_output': num_channel_deconv,
'kernel_size': 2,
'pad': 0,
'stride': 2,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0),
'group': 1,
}
kwargs_deconv = {
'param': [dict(lr_mult=1, decay_mult=1)],
'convolution_param': deconv_param
}
from_layer = "conv3_{}{}_Add".format(use_sub_layers[-1],add_strs)
add_layer = from_layer + "_deconv"
from_layer= "conv3_{}{}_Add".format(use_sub_layers[-1],add_strs)+pose_string
net[add_layer] = L.Deconvolution(net[from_layer], **kwargs_deconv)
if flag_deconvwithrelu:
bn_name = add_layer + '_bn'
net[bn_name] = L.BatchNorm(net[add_layer], in_place=True, **bn_kwargs)
sb_name = add_layer + '_scale'
net[sb_name] = L.Scale(net[add_layer], in_place=True, **sb_kwargs)
relu_name = add_layer + '_relu'
net[relu_name] = L.ReLU(net[add_layer], in_place=True)
return net
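# The Deconvolution -> BatchNorm -> Scale -> ReLU tail above appears verbatim in
# several builders in this file. A minimal sketch of a shared helper they could
# delegate to (a hypothetical refactor, not part of the original API):
def _attach_deconv_bn_relu(net, from_layer, num_output=128, with_bn_relu=True):
    """Append a 2x2/stride-2 Deconvolution to `from_layer`, optionally followed
    by in-place BatchNorm + Scale + ReLU, and return the deconv layer name."""
    add_layer = from_layer + "_deconv"
    net[add_layer] = L.Deconvolution(
        net[from_layer],
        param=[dict(lr_mult=1, decay_mult=1)],
        convolution_param=dict(num_output=num_output, kernel_size=2, pad=0, stride=2,
                               weight_filler=dict(type='gaussian', std=0.01),
                               bias_filler=dict(type='constant', value=0), group=1))
    if with_bn_relu:
        net[add_layer + '_bn'] = L.BatchNorm(
            net[add_layer], in_place=True, eps=0.001,
            param=[dict(lr_mult=0, decay_mult=0)] * 3)
        net[add_layer + '_scale'] = L.Scale(
            net[add_layer], in_place=True, bias_term=True,
            param=[dict(lr_mult=1, decay_mult=0)] * 2,
            filler=dict(type='constant', value=1.0),
            bias_filler=dict(type='constant', value=0.2))
        net[add_layer + '_relu'] = L.ReLU(net[add_layer], in_place=True)
    return add_layer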
def mPoseNet_COCO_ShuffleVariant_ReconBase_Train(net, data_layer="data", flag_withTea=True, loss_weight=0.2):
use_sub_layers = (6, 7)
num_channels = (144, 288)
output_channels = (128, 0)
channel_scale = 4
num_channel_deconv = 128
lr = 0.1
decay = 1.0
add_strs = "_recon"
flag_deconvwithrelu = False
flag_withparamname=True
    pose_string = '_pose'  # local and unused here; ResidualVariant_Base_A sets the module-level value itself
    # NOTE: select the base trunk here: ResidualVariant_Base_A is active; the commented ResidualShuffleVariant_Base_A call below is the alternative.
net = ResidualVariant_Base_A(net, data_layer=data_layer, use_sub_layers=use_sub_layers, num_channels=num_channels,
output_channels=output_channels,channel_scale=channel_scale, num_channel_deconv=num_channel_deconv,
lr=lr, decay=decay, add_strs=add_strs,flag_deconvwithrelu=flag_deconvwithrelu,flag_withparamname=flag_withparamname)
# net = ResidualShuffleVariant_Base_A(net, data_layer=data_layer, use_sub_layers=use_sub_layers, num_channels=num_channels,
# output_channels=output_channels,channel_scale=channel_scale, num_channel_deconv=num_channel_deconv,
# lr=lrdecay, decay=lrdecay, add_strs=add_strs,flag_deconvwithrelu=flag_deconvwithrelu)
recon_layer1 = "conv2_{}{}_Add".format(use_sub_layers[0], add_strs)
recon_layer2 = "conv3_{}{}_Add".format(use_sub_layers[1], add_strs) + "_deconv"
strid_convs = [1, 1, 1, 0, 0]
if flag_withTea:
## Teacher 15F
# net = YoloNetPartCompress(net, from_layer="data", use_bn=True, use_layers=5, use_sub_layers=5,
# strid_conv=strid_convs, final_pool=False, lr=0, decay=0, leaky=True)
# add_layer = 'conv5_5_upsample'
# net[add_layer] = L.Reorg(net["conv5_5"], reorg_param=dict(up_down=P.Reorg.UP))
## Teach DarkTea8B
leaky = False
ChangeNameAndChannel = {"conv4_3": 128, "conv5_1": 512}
net = YoloNetPart(net, from_layer=data_layer, use_bn=True, use_layers=5, use_sub_layers=5, final_pool=False,
leaky=leaky, lr=0, decay=0, ChangeNameAndChannel=ChangeNameAndChannel)
### Teacher DarkNetTea4A
# num_sublayers_tea = [1, 1, 2, 3]
# num_channels_tea = [512, 256,512, 256,128]
# alpha = 1
# net = YoloNetPart_StrideRemove1x1(net, num_sublayers=num_sublayers_tea, num_channels=num_channels_tea,
# from_layer=data_layer,lr=0, decay=0,alpha=alpha,fix_layer=5,fix_sublayer=1)
####Both Teach DarkTea8B and DarkTea4A use the following deconv
conv_param = {
'num_output': 128,
'kernel_size': 2,
'pad': 0,
'stride': 2,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_term': False,
'group': 1,
}
# conv_param = {"kernel_size": 4, "stride": 2, "num_output": 128, "group": 128, "pad": 1,
# "weight_filler": dict(type="bilinear"), "bias_term": False}
kwargs = {
'param': [dict(lr_mult=0, decay_mult=0)],
'convolution_param': conv_param
}
from_layer = "conv5_5"
out_layer = from_layer + "_Upsample"
net[out_layer] = L.Deconvolution(net[from_layer], **kwargs)
ref_layer1 = "conv4_3"
ref_layer2 = "conv5_5_Upsample"
net['loss1'] = L.EuclideanLoss(net[recon_layer1], net[ref_layer1], loss_weight=loss_weight)
net['loss2'] = L.EuclideanLoss(net[recon_layer2], net[ref_layer2], loss_weight=loss_weight)
return net, recon_layer1, recon_layer2
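# Typical call, sketched: build the student trunk plus the frozen DarkTea8B
# teacher and the two feature-regression losses in one pass, keeping the two
# tap names for later concatenation (loss_weight as above):
#   net, tap1, tap2 = mPoseNet_COCO_ShuffleVariant_ReconBase_Train(net, flag_withTea=True, loss_weight=0.2)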
def mPoseNet_VGGDarkNet_Base_Train(net, data_layer="data", pose_string=""):
flag_withparamname = True
pool_last = (False,False,False,True,False)
net = VGGDarkNet(net, data_layer=data_layer, pool_last=pool_last,flag_withparamname=flag_withparamname,pose_string=pose_string)
bn_kwargs = {
'param': [dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0), dict(lr_mult=0, decay_mult=0)],
'eps': 0.001,
}
sb_kwargs = {
'bias_term': True,
'param': [dict(lr_mult=0.1, decay_mult=0), dict(lr_mult=0.1, decay_mult=0)],
'filler': dict(type='constant', value=1.0),
'bias_filler': dict(type='constant', value=0.2),
}
deconv_param = {
'num_output': 128,
'kernel_size': 2,
'pad': 0,
'stride': 2,
'weight_filler': dict(type='gaussian', std=0.01),
'bias_filler': dict(type='constant', value=0),
'group': 1,
}
kwargs_deconv = {
'param': [dict(lr_mult=1, decay_mult=1)],
'convolution_param': deconv_param
}
from_layer = "conv5_5" + pose_string
add_layer = from_layer + "_deconv"
net[add_layer] = L.Deconvolution(net[from_layer], **kwargs_deconv)
bn_name = add_layer + '_bn'
net[bn_name] = L.BatchNorm(net[add_layer], in_place=True, **bn_kwargs)
sb_name = add_layer + '_scale'
net[sb_name] = L.Scale(net[add_layer], in_place=True, **sb_kwargs)
relu_name = add_layer + '_relu'
net[relu_name] = L.ReLU(net[add_layer], in_place=True)
return net,"conv4_5","conv5_5" + pose_string + "_deconv"
def mPoseNet_COCO_ShuffleVariant_ReconStage_Train(net, data_layer="data", loss_weight=1.0):
use_sub_layers = (6, 7)
num_channels = (128, 256)
output_channels = (256, 0)
channel_scale = 4
num_channel_deconv = 128
lrdecay = 1
add_strs = "_recon"
net = ResidualShuffleVariant_Base_A(net, data_layer=data_layer, use_sub_layers=use_sub_layers, num_channels=num_channels,
output_channels=output_channels,channel_scale=channel_scale, num_channel_deconv=num_channel_deconv,
lr=lrdecay, decay=lrdecay, add_strs=add_strs)
recon_layer1 = "conv2_{}{}_Add".format(use_sub_layers[0], add_strs)
recon_layer2 = "conv3_{}{}_Add".format(use_sub_layers[1], add_strs) + "_deconv"
concat_layer = []
concat_layer.append(net[recon_layer1])
concat_layer.append(net[recon_layer2])
baselayer = "convf" + add_strs
net[baselayer] = L.Concat(*concat_layer, axis=1)
use_3_layers = 5
use_1_layers = 0
n_channel = 64
lrdecay = 1.0
kernel_size = 3
flag_output_sigmoid = False
net = mPose_StageX_Train(net, from_layer=baselayer, stage=1,use_3_layers=use_3_layers, use_1_layers=use_1_layers, short_cut=False,
base_layer=baselayer, lr=lrdecay, decay=lrdecay, num_channels=n_channel,kernel_size=kernel_size,
flag_sigmoid=flag_output_sigmoid,flag_hasoutput=False,addstrs=add_strs,flag_hasloss=False)
############################### Teacher
strid_convs = [1, 1, 1, 0, 0]
net = YoloNetPartCompress(net, from_layer=data_layer, use_bn=True, use_layers=5, use_sub_layers=5,
strid_conv=strid_convs, final_pool=False, lr=0, decay=0, leaky=False)
add_layer = 'conv5_5_upsample'
net[add_layer] = L.Reorg(net["conv5_5"], reorg_param=dict(up_down=P.Reorg.UP))
concat_layer = []
concat_layer.append(net['conv4_3'])
concat_layer.append(net['conv5_5_upsample'])
baselayer = "convf"
net[baselayer] = L.Concat(*concat_layer, axis=1)
use_stage = 3
use_3_layers = 5
use_1_layers = 0
n_channel = 64
kernel_size = 3
flag_output_sigmoid = False
for stage in xrange(use_stage):
if stage == 0:
from_layer = baselayer
else:
from_layer = "concat_stage{}".format(stage)
outlayer = "concat_stage{}".format(stage + 1)
if stage == use_stage - 1:
flag_hasoutput = False
short_cut = False
else:
flag_hasoutput = True
short_cut = True
net = mPose_StageX_Train(net, from_layer=from_layer, out_layer=outlayer, stage=stage + 1,mask_vec="vec_mask", mask_heat="heat_mask", \
label_vec="vec_label", label_heat="heat_label",use_3_layers=use_3_layers, use_1_layers=use_1_layers,
short_cut=short_cut,base_layer=baselayer, lr=0, decay=0, num_channels=n_channel,
kernel_size=kernel_size, flag_sigmoid=flag_output_sigmoid,flag_hasoutput=flag_hasoutput,flag_hasloss=False)
recon_layer1 = "stage1_conv{}_heat".format(use_3_layers-1) + add_strs
recon_layer2 = "stage1_conv{}_vec".format(use_3_layers - 1) + add_strs
ref_layer1 = "stage3_conv{}_heat".format(use_3_layers-1)
ref_layer2 = "stage3_conv{}_vec".format(use_3_layers - 1)
net['loss1'] = L.EuclideanLoss(net[recon_layer1], net[ref_layer1], loss_weight=loss_weight)
net['loss2'] = L.EuclideanLoss(net[recon_layer2], net[ref_layer2], loss_weight=loss_weight)
return net
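# The two EuclideanLoss terms above regress the stage-1 student maps onto the
# frozen stage-3 teacher maps (feature distillation). A hypothetical helper that
# generalizes the pairing, sketched under the same NetSpec conventions:
def _add_distill_losses(net, pairs, loss_weight=1.0):
    """pairs: iterable of (student_layer, teacher_layer) name tuples."""
    for idx, (student, teacher) in enumerate(pairs):
        net['loss{}'.format(idx + 1)] = L.EuclideanLoss(
            net[student], net[teacher], loss_weight=loss_weight)
    return net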
def mPoseNet_COCO_ShuffleVariant_PoseFromReconBase_Train(net, data_layer="data", label_layer="label", train=True, **pose_test_kwargs):
    # input slices: masks and label templates packed along the channel axis
if train:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp = \
L.Slice(net[label_layer], ntop=4, slice_param=dict(slice_point=[34, 52, 86], axis=1))
else:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp, net.gt = \
L.Slice(net[label_layer], ntop=5, slice_param=dict(slice_point=[34, 52, 86, 104], axis=1))
# label
net.vec_label = L.Eltwise(net.vec_mask, net.vec_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_label = L.Eltwise(net.heat_mask, net.heat_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
flag_concat = True
    net, ref_layer1, ref_layer2 = mPoseNet_COCO_ShuffleVariant_ReconBase_Train(net, data_layer=data_layer, flag_withTea=False)
    ref_layer1 = ref_layer1 + pose_string  # pose_string is the module-level suffix set inside ResidualVariant_Base_A
if flag_concat:
feaLayers = []
feaLayers.append(net[ref_layer1])
feaLayers.append(net[ref_layer2])
baselayer = "convf"
net[baselayer] = L.Concat(*feaLayers, axis=1)
else:
baselayer = ref_layer2
use_stage = 3
use_3_layers = 5
use_1_layers = 0
n_channel = 64
lrdecay = 1.0
kernel_size = 3
flag_output_sigmoid = False
for stage in xrange(use_stage):
if stage == 0:
from_layer = baselayer
else:
from_layer = "concat_stage{}".format(stage)
outlayer = "concat_stage{}".format(stage + 1)
if stage == use_stage - 1:
short_cut = False
else:
short_cut = True
net = mPose_StageX_Train(net, from_layer=from_layer, out_layer=outlayer, stage=stage + 1,
mask_vec="vec_mask", mask_heat="heat_mask", \
label_vec="vec_label", label_heat="heat_label", \
use_3_layers=use_3_layers, use_1_layers=use_1_layers, short_cut=short_cut, \
base_layer=baselayer, lr=0.1, decay=lrdecay, num_channels=n_channel,
kernel_size=kernel_size, flag_sigmoid=flag_output_sigmoid)
# for Test
if not train:
if flag_output_sigmoid:
conv_vec = "stage{}_conv{}_vec".format(use_stage, use_3_layers + use_1_layers) + "_sig"
conv_heat = "stage{}_conv{}_heat".format(use_stage, use_3_layers + use_1_layers) + "_sig"
else:
conv_vec = "stage{}_conv{}_vec".format(use_stage, use_3_layers + use_1_layers)
conv_heat = "stage{}_conv{}_heat".format(use_stage, use_3_layers + use_1_layers)
net.vec_out = L.Eltwise(net.vec_mask, net[conv_vec], eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_out = L.Eltwise(net.heat_mask, net[conv_heat], eltwise_param=dict(operation=P.Eltwise.PROD))
feaLayers = []
feaLayers.append(net.heat_out)
feaLayers.append(net.vec_out)
outlayer = "concat_stage{}".format(3)
net[outlayer] = L.Concat(*feaLayers, axis=1)
# Resize
resize_kwargs = {
'factor': pose_test_kwargs.get("resize_factor", 8),
'scale_gap': pose_test_kwargs.get("resize_scale_gap", 0.3),
'start_scale': pose_test_kwargs.get("resize_start_scale", 1.0),
}
net.resized_map = L.ImResize(net[outlayer], name="resize", imresize_param=resize_kwargs)
# Nms
nms_kwargs = {
'threshold': pose_test_kwargs.get("nms_threshold", 0.05),
'max_peaks': pose_test_kwargs.get("nms_max_peaks", 100),
'num_parts': pose_test_kwargs.get("nms_num_parts", 18),
}
net.joints = L.Nms(net.resized_map, name="nms", nms_param=nms_kwargs)
# ConnectLimbs
connect_kwargs = {
'is_type_coco': pose_test_kwargs.get("conn_is_type_coco", True),
'max_person': pose_test_kwargs.get("conn_max_person", 10),
'max_peaks_use': pose_test_kwargs.get("conn_max_peaks_use", 20),
'iters_pa_cal': pose_test_kwargs.get("conn_iters_pa_cal", 10),
'connect_inter_threshold': pose_test_kwargs.get("conn_connect_inter_threshold", 0.05),
'connect_inter_min_nums': pose_test_kwargs.get("conn_connect_inter_min_nums", 8),
'connect_min_subset_cnt': pose_test_kwargs.get("conn_connect_min_subset_cnt", 3),
'connect_min_subset_score': pose_test_kwargs.get("conn_connect_min_subset_score", 0.4),
}
net.limbs = L.Connectlimb(net.resized_map, net.joints, connect_limb_param=connect_kwargs)
# Eval
eval_kwargs = {
'stride': 8,
'area_thre': pose_test_kwargs.get("eval_area_thre", 64 * 64),
'oks_thre': pose_test_kwargs.get("eval_oks_thre", [0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9]),
}
net.eval = L.PoseEval(net.limbs, net.gt, pose_eval_param=eval_kwargs)
return net
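# A minimal test-time usage sketch: the keyword names mirror the
# pose_test_kwargs lookups above, and `net` is assumed to already carry the
# "data" and "label" inputs (the function name is illustrative only):
def _example_pose_test_net(net):
    return mPoseNet_COCO_ShuffleVariant_PoseFromReconBase_Train(
        net, data_layer="data", label_layer="label", train=False,
        resize_factor=8, nms_threshold=0.05, nms_num_parts=18,
        conn_max_person=10, eval_area_thre=64 * 64)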
def mPoseNet_VGGDarkNet_Train(net, data_layer="data", label_layer="label", train=True, **pose_test_kwargs):
    # input slices: masks and label templates packed along the channel axis
pose_string = "_pose"
if train:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp = \
L.Slice(net[label_layer], ntop=4, slice_param=dict(slice_point=[34, 52, 86], axis=1))
else:
net.vec_mask, net.heat_mask, net.vec_temp, net.heat_temp, net.gt = \
L.Slice(net[label_layer], ntop=5, slice_param=dict(slice_point=[34, 52, 86, 104], axis=1))
# label
net.vec_label = L.Eltwise(net.vec_mask, net.vec_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
net.heat_label = L.Eltwise(net.heat_mask, net.heat_temp, eltwise_param=dict(operation=P.Eltwise.PROD))
flag_concat = True
    net, ref_layer1, ref_layer2 = mPoseNet_VGGDarkNet_Base_Train(net, data_layer=data_layer, pose_string=pose_string)
    ref_layer1 = ref_layer1 + pose_string
if flag_concat:
feaLayers = []
feaLayers.append(net[ref_layer1])
feaLayers.append(net[ref_layer2])
baselayer = "convf"
net[baselayer] = L.Concat(*feaLayers, axis=1)
else:
baselayer = ref_layer2
use_stage = 3
use_3_layers = 5
use_1_layers = 0
n_channel = 64
lrdecay = 1.0
kernel_size = 3
flag_output_sigmoid = False
for stage in xrange(use_stage):
if stage == 0:
from_layer = baselayer
else:
from_layer = "concat_stage{}".format(stage)
outlayer = "concat_stage{}".format(stage + 1)
if stage == use_stage - 1:
short_cut = False
else:
short_cut = True
net = mPose_StageX_Train(net, from_layer=from_layer, out_layer=outlayer, stage=stage + 1,
mask_vec="vec_mask", mask_heat="heat_mask", \
label_vec="vec_label", label_heat="heat_label", \
use_3_layers=use_3_layers, use_1_layers=use_1_layers, short_cut=short_cut, \
base_layer=baselayer, lr=0.1, decay=lrdecay, num_channels=n_channel,
kernel_size=kernel_size, flag_sigmoid=flag_output_sigmoid)
    return net
| 52.171266 | 187 | 0.64375 | 9,241 | 64,275 | 4.164701 | 0.036684 | 0.019124 | 0.018864 | 0.021203 | 0.913579 | 0.877696 | 0.85296 | 0.830744 | 0.821753 | 0.809125 | 0 | 0.032177 | 0.225406 | 64,275 | 1,232 | 188 | 52.171266 | 0.740836 | 0.028627 | 0 | 0.714428 | 0 | 0.024876 | 0.073885 | 0.012295 | 0 | 0 | 0 | 0 | 0.00199 | 0 | null | null | 0 | 0.01194 | null | null | 0.00597 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8
c7c30288783d7596fdd95c35721b31d5d9d32a1c | 16,461 | py | Python | device_model/migrations/0001_initial.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | ["MIT"] | null | null | null | device_model/migrations/0001_initial.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | ["MIT"] | 6 | 2021-03-19T16:01:33.000Z | 2022-03-12T00:54:23.000Z | device_model/migrations/0001_initial.py | weirdaze/tauto | e5a635628cd92998212cf3ae74aef2f0436430f5 | ["MIT"] | null | null | null |
# Generated by Django 2.2 on 2019-06-04 14:49
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('contenttypes', '0002_remove_content_type_name'),
]
operations = [
migrations.CreateModel(
name='ChipModelNo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='ChipType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='DeviceModelNo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='DeviceState',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='DeviceType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='Interface',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('description', models.TextField(blank=True, default='Interface Description', null=True)),
('slot', models.PositiveIntegerField(blank=True, default=1, null=True)),
('number', models.PositiveIntegerField(default=1)),
('verified', models.BooleanField(default=False)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='InterfaceType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='Mac',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
],
),
migrations.CreateModel(
name='ModuleBuildModelNo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='ModuleSerial',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='ModuleState',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='ModuleType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='SerdesSpeed',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
],
),
migrations.CreateModel(
name='SerdesType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='SlotModelNo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='SlotType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
],
),
migrations.CreateModel(
name='State',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('type', models.CharField(max_length=300)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='Speed',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('speed', models.FloatField(default=0)),
('unit', models.CharField(default='G', max_length=5)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='Slot',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('number', models.CharField(max_length=300)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('model', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.SlotModelNo')),
('type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.SlotType')),
],
),
migrations.CreateModel(
name='Serial',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('number', models.CharField(max_length=300)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='Serdes',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('speed', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.SerdesSpeed')),
('type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.SerdesType')),
],
),
migrations.CreateModel(
name='ModuleBuildPorts',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('num_phy_ports', models.PositiveIntegerField(default=0)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='ModuleBuild',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('fqdn', models.URLField(blank=True, null=True)),
('model', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.ModuleBuildModelNo')),
],
),
migrations.CreateModel(
name='Module',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('number', models.PositiveIntegerField()),
('name', models.CharField(blank=True, max_length=300, null=True)),
('slot', models.PositiveIntegerField(default=1)),
('module_build', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.ModuleBuild')),
('module_type', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.ModuleType')),
('serial', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.ModuleSerial')),
('state', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.ModuleState')),
],
),
migrations.CreateModel(
name='ModelNo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('number', models.CharField(max_length=300)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='MacAddr',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('address', models.CharField(max_length=100)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.CreateModel(
name='Link',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('side_a', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='side_a', to='device_model.Interface')),
('side_z', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='side_z', to='device_model.Interface')),
],
),
migrations.CreateModel(
name='IPAddress',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('address', models.GenericIPAddressField()),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
migrations.AddField(
model_name='interface',
name='type',
field=models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.InterfaceType'),
),
migrations.CreateModel(
name='DeviceModel',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('hostname', models.CharField(max_length=300)),
('chassis', models.PositiveIntegerField(default=1)),
('fqdn', models.URLField()),
('num_slots', models.PositiveIntegerField(default=1)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('model', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceModelNo')),
('state', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceState')),
('type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceType')),
],
),
migrations.CreateModel(
name='Device',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('hostname', models.CharField(max_length=300)),
('chassis', models.PositiveIntegerField(default=1)),
('fqdn', models.URLField()),
('num_slots', models.PositiveIntegerField(default=1)),
('object_id', models.PositiveIntegerField(blank=True, null=True)),
('content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('model', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceModelNo')),
('state', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceState')),
('type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.DeviceType')),
],
),
migrations.CreateModel(
name='Chip',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=300)),
('serdes_num_front', models.PositiveIntegerField(default=0)),
('serdes_num_fabric', models.PositiveIntegerField(default=0)),
('serdes_speed_front', models.PositiveIntegerField(default=0)),
('serdes_speed_fabric', models.PositiveIntegerField(default=0)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
('macs', models.ManyToManyField(to='device_model.Mac')),
('model', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='device_model.ChipModelNo')),
('type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.DO_NOTHING, to='device_model.ChipType')),
],
),
migrations.CreateModel(
name='BandwidthGigabits',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('bw', models.FloatField(default=0)),
('object_id', models.PositiveIntegerField()),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
],
),
]
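# The repeated (content_type, object_id) column pairs above back Django generic
# relations. A sketch of the model-side declaration this schema implies
# (illustrative; the project's actual models.py is not part of this file):
#
#   from django.contrib.contenttypes.fields import GenericForeignKey
#   from django.contrib.contenttypes.models import ContentType
#
#   class Interface(models.Model):
#       type = models.ForeignKey('InterfaceType', on_delete=models.DO_NOTHING)
#       content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
#       object_id = models.PositiveIntegerField()
#       parent = GenericForeignKey('content_type', 'object_id')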
| 52.759615 | 151 | 0.585687 | 1,581 | 16,461 | 5.943707 | 0.077799 | 0.039161 | 0.052144 | 0.081941 | 0.849952 | 0.840587 | 0.825689 | 0.825689 | 0.825689 | 0.825263 | 0 | 0.009744 | 0.270518 | 16,461 | 311 | 152 | 52.92926 | 0.772818 | 0.002612 | 0 | 0.743421 | 1 | 0 | 0.125366 | 0.051474 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006579 | 0 | 0.019737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c7f147d4e9ee0ad278472a073eb307aa2cf324ab | 7,738 | py | Python | presets.py | puffyboa/game-of-life | fb5285367747010ce30b6c6402b6ba06fcf89e94 | ["MIT"] | 1 | 2017-09-03T23:24:17.000Z | 2017-09-03T23:24:17.000Z | presets.py | puffyboa/game-of-life | fb5285367747010ce30b6c6402b6ba06fcf89e94 | ["MIT"] | null | null | null | presets.py | puffyboa/game-of-life | fb5285367747010ce30b6c6402b6ba06fcf89e94 | ["MIT"] | null | null | null |
Presets = {
"blinker": [
[1, 1, 1]
],
"toad": [
[1, 1, 1, 0],
[0, 1, 1, 1]
],
"glider": [
[1, 0, 0],
[0, 1, 1],
[1, 1, 0]
],
"unbounded": [
[1, 1, 1, 0, 1],
[1, 0, 0, 0, 0],
[0, 0, 0, 1, 1],
[0, 1, 1, 0, 1],
[1, 0, 1, 0, 1]
],
"glider_gun": [
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1],
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1],
[1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[1,1,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
],
"diehard": [
[0, 0, 0, 0, 0, 0, 1, 0],
[1, 1, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 1, 1, 1]
],
"boat": [
[1, 1, 0],
[1, 0, 1],
[0, 1, 0]
],
"r_pentomino": [
[0, 1, 1],
[1, 1, 0],
[0, 1, 0]
],
"beacon": [
[0, 0, 1, 1],
[0, 0, 1, 1],
[1, 1, 0, 0],
[1, 1, 0, 0]
],
"acorn": [
[0, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0],
[1, 1, 0, 0, 1, 1, 1]
],
"spaceship": [
[0, 0, 1, 1, 0],
[1, 1, 0, 1, 1],
[1, 1, 1, 1, 0],
[0, 1, 1, 0, 0]
],
"block_switch_engine": [
[0, 0, 0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 1, 0, 1, 1],
[0, 0, 0, 0, 1, 0, 1, 0],
[0, 0, 0, 0, 1, 0, 0, 0],
[0, 0, 1, 0, 0, 0, 0, 0],
[1, 0, 1, 0, 0, 0, 0, 0]
],
"pentadecathlon": [
[0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,0,0,0,0,1,0,0,0],
[0,1,1,0,1,1,1,1,0,1,1,0],
[0,0,0,1,0,0,0,0,1,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0]
],
"pulsar": [
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,1,1,0,0,0,1,1,1,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,0,0,1,1,1,0,0,0,1,1,1,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,1,1,0,0,0,1,1,1,0,0,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,1,0,0,0,0,1,0,1,0,0,0,0,1,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
[0,0,0,1,1,1,0,0,0,1,1,1,0,0,0],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
],
"copperhead": [
[0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
[0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0],
[0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0],
[0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0]
],
"fireship": [
[0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0],
[0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0],
[0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0],
[0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0],
[0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0],
[0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0],
[0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0],
[1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1],
[0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0],
[1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
[0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0]
],
"simkin_glider_gun": [
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 0, 1, 1],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
],
"what is this": [
[1, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 1],
[0, 1, 0, 0, 0],
[0, 0, 1, 0, 0]
]
}
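# A small usage sketch: stamp a named preset onto a 2-D grid of 0/1 cells.
# It assumes the surrounding game code indexes cells as grid[row][col]; the
# helper name is illustrative, not part of the original module.
def apply_preset(grid, name, top=0, left=0):
    """Copy Presets[name] into grid with its upper-left corner at (top, left)."""
    for r, row in enumerate(Presets[name]):
        for c, cell in enumerate(row):
            grid[top + r][left + c] = cell
    return grid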
| 45.517647 | 120 | 0.304084 | 2,211 | 7,738 | 1.061511 | 0.011759 | 1.438432 | 1.962079 | 2.404772 | 0.936941 | 0.936941 | 0.93481 | 0.930976 | 0.919898 | 0.915211 | 0 | 0.446717 | 0.368183 | 7,738 | 169 | 121 | 45.786982 | 0.03334 | 0 | 0 | 0.52381 | 0 | 0 | 0.021197 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
400a25ba56720e0ffe60d87f416ab12d0fceb74c | 32,396 | py | Python | Drone_project/simulation_ws/devel/lib/python2.7/dist-packages/ardrone_as/msg/_ArdroneAction.py | nikku1234/ROS- | 8fa78a78e7f2350d3e35152f8dd979c4fe8aa18e | ["MIT"] | 1 | 2020-07-02T06:06:36.000Z | 2020-07-02T06:06:36.000Z | Drone_project/simulation_ws/devel/lib/python2.7/dist-packages/ardrone_as/msg/_ArdroneAction.py | nikku1234/ROS | 8fa78a78e7f2350d3e35152f8dd979c4fe8aa18e | ["MIT"] | null | null | null | Drone_project/simulation_ws/devel/lib/python2.7/dist-packages/ardrone_as/msg/_ArdroneAction.py | nikku1234/ROS | 8fa78a78e7f2350d3e35152f8dd979c4fe8aa18e | ["MIT"] | null | null | null |
# This Python file uses the following encoding: utf-8
"""autogenerated by genpy from ardrone_as/ArdroneAction.msg. Do not edit."""
import sys
python3 = True if sys.hexversion > 0x03000000 else False
import genpy
import struct
import actionlib_msgs.msg
import sensor_msgs.msg
import genpy
import ardrone_as.msg
import std_msgs.msg
class ArdroneAction(genpy.Message):
_md5sum = "6edcd96c5f3b653a5f6894b456244926"
_type = "ardrone_as/ArdroneAction"
_has_header = False #flag to mark the presence of a Header object
_full_text = """# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
ArdroneActionGoal action_goal
ArdroneActionResult action_result
ArdroneActionFeedback action_feedback
================================================================================
MSG: ardrone_as/ArdroneActionGoal
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
Header header
actionlib_msgs/GoalID goal_id
ArdroneGoal goal
================================================================================
MSG: std_msgs/Header
# Standard metadata for higher-level stamped data types.
# This is generally used to communicate timestamped data
# in a particular coordinate frame.
#
# sequence ID: consecutively increasing ID
uint32 seq
#Two-integer timestamp that is expressed as:
# * stamp.sec: seconds (stamp_secs) since epoch (in Python the variable is called 'secs')
# * stamp.nsec: nanoseconds since stamp_secs (in Python the variable is called 'nsecs')
# time-handling sugar is provided by the client library
time stamp
#Frame this data is associated with
# 0: no frame
# 1: global frame
string frame_id
================================================================================
MSG: actionlib_msgs/GoalID
# The stamp should store the time at which this goal was requested.
# It is used by an action server when it tries to preempt all
# goals that were requested before a certain time
time stamp
# The id provides a way to associate feedback and
# result message with specific goal requests. The id
# specified must be unique.
string id
================================================================================
MSG: ardrone_as/ArdroneGoal
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
#goal for the drone
int32 nseconds # the number of seconds the drone will be taking pictures
================================================================================
MSG: ardrone_as/ArdroneActionResult
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
Header header
actionlib_msgs/GoalStatus status
ArdroneResult result
================================================================================
MSG: actionlib_msgs/GoalStatus
GoalID goal_id
uint8 status
uint8 PENDING = 0 # The goal has yet to be processed by the action server
uint8 ACTIVE = 1 # The goal is currently being processed by the action server
uint8 PREEMPTED = 2 # The goal received a cancel request after it started executing
# and has since completed its execution (Terminal State)
uint8 SUCCEEDED = 3 # The goal was achieved successfully by the action server (Terminal State)
uint8 ABORTED = 4 # The goal was aborted during execution by the action server due
# to some failure (Terminal State)
uint8 REJECTED = 5 # The goal was rejected by the action server without being processed,
# because the goal was unattainable or invalid (Terminal State)
uint8 PREEMPTING = 6 # The goal received a cancel request after it started executing
# and has not yet completed execution
uint8 RECALLING = 7 # The goal received a cancel request before it started executing,
# but the action server has not yet confirmed that the goal is canceled
uint8 RECALLED = 8 # The goal received a cancel request before it started executing
# and was successfully cancelled (Terminal State)
uint8 LOST = 9 # An action client can determine that a goal is LOST. This should not be
# sent over the wire by an action server
#Allow for the user to associate a string with GoalStatus for debugging
string text
================================================================================
MSG: ardrone_as/ArdroneResult
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
#result
sensor_msgs/CompressedImage[] allPictures # an array containing all the pictures taken along the nseconds
================================================================================
MSG: sensor_msgs/CompressedImage
# This message contains a compressed image
Header header # Header timestamp should be acquisition time of image
# Header frame_id should be optical frame of camera
# origin of frame should be optical center of cameara
# +x should point to the right in the image
# +y should point down in the image
# +z should point into to plane of the image
string format # Specifies the format of the data
# Acceptable values:
# jpeg, png
uint8[] data # Compressed image buffer
================================================================================
MSG: ardrone_as/ArdroneActionFeedback
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
Header header
actionlib_msgs/GoalStatus status
ArdroneFeedback feedback
================================================================================
MSG: ardrone_as/ArdroneFeedback
# ====== DO NOT MODIFY! AUTOGENERATED FROM AN ACTION DEFINITION ======
#feedback
sensor_msgs/CompressedImage lastImage # the last image taken
"""
__slots__ = ['action_goal','action_result','action_feedback']
_slot_types = ['ardrone_as/ArdroneActionGoal','ardrone_as/ArdroneActionResult','ardrone_as/ArdroneActionFeedback']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
set to None will be assigned a default value. The recommend
use is keyword arguments as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
action_goal,action_result,action_feedback
:param args: complete set of field values, in .msg order
:param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(ArdroneAction, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.action_goal is None:
self.action_goal = ardrone_as.msg.ArdroneActionGoal()
if self.action_result is None:
self.action_result = ardrone_as.msg.ArdroneActionResult()
if self.action_feedback is None:
self.action_feedback = ardrone_as.msg.ArdroneActionFeedback()
else:
self.action_goal = ardrone_as.msg.ArdroneActionGoal()
self.action_result = ardrone_as.msg.ArdroneActionResult()
self.action_feedback = ardrone_as.msg.ArdroneActionFeedback()
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
:param buff: buffer, ``StringIO``
"""
try:
_x = self
buff.write(_get_struct_3I().pack(_x.action_goal.header.seq, _x.action_goal.header.stamp.secs, _x.action_goal.header.stamp.nsecs))
_x = self.action_goal.header.frame_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_2I().pack(_x.action_goal.goal_id.stamp.secs, _x.action_goal.goal_id.stamp.nsecs))
_x = self.action_goal.goal_id.id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_i3I().pack(_x.action_goal.goal.nseconds, _x.action_result.header.seq, _x.action_result.header.stamp.secs, _x.action_result.header.stamp.nsecs))
_x = self.action_result.header.frame_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_2I().pack(_x.action_result.status.goal_id.stamp.secs, _x.action_result.status.goal_id.stamp.nsecs))
_x = self.action_result.status.goal_id.id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
buff.write(_get_struct_B().pack(self.action_result.status.status))
_x = self.action_result.status.text
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
length = len(self.action_result.result.allPictures)
buff.write(_struct_I.pack(length))
for val1 in self.action_result.result.allPictures:
_v1 = val1.header
buff.write(_get_struct_I().pack(_v1.seq))
_v2 = _v1.stamp
_x = _v2
buff.write(_get_struct_2I().pack(_x.secs, _x.nsecs))
_x = _v1.frame_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = val1.format
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = val1.data
length = len(_x)
# - if encoded as a list instead, serialize as bytes instead of string
if type(_x) in [list, tuple]:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_3I().pack(_x.action_feedback.header.seq, _x.action_feedback.header.stamp.secs, _x.action_feedback.header.stamp.nsecs))
_x = self.action_feedback.header.frame_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_2I().pack(_x.action_feedback.status.goal_id.stamp.secs, _x.action_feedback.status.goal_id.stamp.nsecs))
_x = self.action_feedback.status.goal_id.id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
buff.write(_get_struct_B().pack(self.action_feedback.status.status))
_x = self.action_feedback.status.text
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self
buff.write(_get_struct_3I().pack(_x.action_feedback.feedback.lastImage.header.seq, _x.action_feedback.feedback.lastImage.header.stamp.secs, _x.action_feedback.feedback.lastImage.header.stamp.nsecs))
_x = self.action_feedback.feedback.lastImage.header.frame_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.action_feedback.feedback.lastImage.format
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.action_feedback.feedback.lastImage.data
length = len(_x)
# - if encoded as a list instead, serialize as bytes instead of string
if type(_x) in [list, tuple]:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(locals().get('_x', self)))))
except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(locals().get('_x', self)))))
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
:param str: byte array of serialized message, ``str``
"""
try:
if self.action_goal is None:
self.action_goal = ardrone_as.msg.ArdroneActionGoal()
if self.action_result is None:
self.action_result = ardrone_as.msg.ArdroneActionResult()
if self.action_feedback is None:
self.action_feedback = ardrone_as.msg.ArdroneActionFeedback()
end = 0
_x = self
start = end
end += 12
(_x.action_goal.header.seq, _x.action_goal.header.stamp.secs, _x.action_goal.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_goal.header.frame_id = str[start:end].decode('utf-8')
else:
self.action_goal.header.frame_id = str[start:end]
_x = self
start = end
end += 8
(_x.action_goal.goal_id.stamp.secs, _x.action_goal.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_goal.goal_id.id = str[start:end].decode('utf-8')
else:
self.action_goal.goal_id.id = str[start:end]
_x = self
start = end
end += 16
(_x.action_goal.goal.nseconds, _x.action_result.header.seq, _x.action_result.header.stamp.secs, _x.action_result.header.stamp.nsecs,) = _get_struct_i3I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_result.header.frame_id = str[start:end].decode('utf-8')
else:
self.action_result.header.frame_id = str[start:end]
_x = self
start = end
end += 8
(_x.action_result.status.goal_id.stamp.secs, _x.action_result.status.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_result.status.goal_id.id = str[start:end].decode('utf-8')
else:
self.action_result.status.goal_id.id = str[start:end]
start = end
end += 1
(self.action_result.status.status,) = _get_struct_B().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_result.status.text = str[start:end].decode('utf-8')
else:
self.action_result.status.text = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
self.action_result.result.allPictures = []
for i in range(0, length):
val1 = sensor_msgs.msg.CompressedImage()
_v3 = val1.header
start = end
end += 4
(_v3.seq,) = _get_struct_I().unpack(str[start:end])
_v4 = _v3.stamp
_x = _v4
start = end
end += 8
(_x.secs, _x.nsecs,) = _get_struct_2I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
_v3.frame_id = str[start:end].decode('utf-8')
else:
_v3.frame_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
val1.format = str[start:end].decode('utf-8')
else:
val1.format = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
val1.data = str[start:end]
self.action_result.result.allPictures.append(val1)
_x = self
start = end
end += 12
(_x.action_feedback.header.seq, _x.action_feedback.header.stamp.secs, _x.action_feedback.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_feedback.header.frame_id = str[start:end].decode('utf-8')
else:
self.action_feedback.header.frame_id = str[start:end]
_x = self
start = end
end += 8
(_x.action_feedback.status.goal_id.stamp.secs, _x.action_feedback.status.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_feedback.status.goal_id.id = str[start:end].decode('utf-8')
else:
self.action_feedback.status.goal_id.id = str[start:end]
start = end
end += 1
(self.action_feedback.status.status,) = _get_struct_B().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_feedback.status.text = str[start:end].decode('utf-8')
else:
self.action_feedback.status.text = str[start:end]
_x = self
start = end
end += 12
(_x.action_feedback.feedback.lastImage.header.seq, _x.action_feedback.feedback.lastImage.header.stamp.secs, _x.action_feedback.feedback.lastImage.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_feedback.feedback.lastImage.header.frame_id = str[start:end].decode('utf-8')
else:
self.action_feedback.feedback.lastImage.header.frame_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.action_feedback.feedback.lastImage.format = str[start:end].decode('utf-8')
else:
self.action_feedback.feedback.lastImage.format = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.action_feedback.feedback.lastImage.data = str[start:end]
return self
except struct.error as e:
raise genpy.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
:param buff: buffer, ``StringIO``
    :param numpy: numpy python module
    """
    try:
      _x = self
      buff.write(_get_struct_3I().pack(_x.action_goal.header.seq, _x.action_goal.header.stamp.secs, _x.action_goal.header.stamp.nsecs))
      _x = self.action_goal.header.frame_id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_2I().pack(_x.action_goal.goal_id.stamp.secs, _x.action_goal.goal_id.stamp.nsecs))
      _x = self.action_goal.goal_id.id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_i3I().pack(_x.action_goal.goal.nseconds, _x.action_result.header.seq, _x.action_result.header.stamp.secs, _x.action_result.header.stamp.nsecs))
      _x = self.action_result.header.frame_id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_2I().pack(_x.action_result.status.goal_id.stamp.secs, _x.action_result.status.goal_id.stamp.nsecs))
      _x = self.action_result.status.goal_id.id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      buff.write(_get_struct_B().pack(self.action_result.status.status))
      _x = self.action_result.status.text
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      length = len(self.action_result.result.allPictures)
      buff.write(_struct_I.pack(length))
      for val1 in self.action_result.result.allPictures:
        _v5 = val1.header
        buff.write(_get_struct_I().pack(_v5.seq))
        _v6 = _v5.stamp
        _x = _v6
        buff.write(_get_struct_2I().pack(_x.secs, _x.nsecs))
        _x = _v5.frame_id
        length = len(_x)
        if python3 or type(_x) == unicode:
          _x = _x.encode('utf-8')
          length = len(_x)
        buff.write(struct.pack('<I%ss'%length, length, _x))
        _x = val1.format
        length = len(_x)
        if python3 or type(_x) == unicode:
          _x = _x.encode('utf-8')
          length = len(_x)
        buff.write(struct.pack('<I%ss'%length, length, _x))
        _x = val1.data
        length = len(_x)
        # - if encoded as a list instead, serialize as bytes instead of string
        if type(_x) in [list, tuple]:
          buff.write(struct.pack('<I%sB'%length, length, *_x))
        else:
          buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_3I().pack(_x.action_feedback.header.seq, _x.action_feedback.header.stamp.secs, _x.action_feedback.header.stamp.nsecs))
      _x = self.action_feedback.header.frame_id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_2I().pack(_x.action_feedback.status.goal_id.stamp.secs, _x.action_feedback.status.goal_id.stamp.nsecs))
      _x = self.action_feedback.status.goal_id.id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      buff.write(_get_struct_B().pack(self.action_feedback.status.status))
      _x = self.action_feedback.status.text
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self
      buff.write(_get_struct_3I().pack(_x.action_feedback.feedback.lastImage.header.seq, _x.action_feedback.feedback.lastImage.header.stamp.secs, _x.action_feedback.feedback.lastImage.header.stamp.nsecs))
      _x = self.action_feedback.feedback.lastImage.header.frame_id
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self.action_feedback.feedback.lastImage.format
      length = len(_x)
      if python3 or type(_x) == unicode:
        _x = _x.encode('utf-8')
        length = len(_x)
      buff.write(struct.pack('<I%ss'%length, length, _x))
      _x = self.action_feedback.feedback.lastImage.data
      length = len(_x)
      # - if encoded as a list instead, serialize as bytes instead of string
      if type(_x) in [list, tuple]:
        buff.write(struct.pack('<I%sB'%length, length, *_x))
      else:
        buff.write(struct.pack('<I%ss'%length, length, _x))
    except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(locals().get('_x', self)))))
    except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(locals().get('_x', self)))))

  def deserialize_numpy(self, str, numpy):
    """
    unpack serialized message in str into this message instance using numpy for array types
    :param str: byte array of serialized message, ``str``
    :param numpy: numpy python module
    """
    try:
      if self.action_goal is None:
        self.action_goal = ardrone_as.msg.ArdroneActionGoal()
      if self.action_result is None:
        self.action_result = ardrone_as.msg.ArdroneActionResult()
      if self.action_feedback is None:
        self.action_feedback = ardrone_as.msg.ArdroneActionFeedback()
      end = 0
      _x = self
      start = end
      end += 12
      (_x.action_goal.header.seq, _x.action_goal.header.stamp.secs, _x.action_goal.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_goal.header.frame_id = str[start:end].decode('utf-8')
      else:
        self.action_goal.header.frame_id = str[start:end]
      _x = self
      start = end
      end += 8
      (_x.action_goal.goal_id.stamp.secs, _x.action_goal.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_goal.goal_id.id = str[start:end].decode('utf-8')
      else:
        self.action_goal.goal_id.id = str[start:end]
      _x = self
      start = end
      end += 16
      (_x.action_goal.goal.nseconds, _x.action_result.header.seq, _x.action_result.header.stamp.secs, _x.action_result.header.stamp.nsecs,) = _get_struct_i3I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_result.header.frame_id = str[start:end].decode('utf-8')
      else:
        self.action_result.header.frame_id = str[start:end]
      _x = self
      start = end
      end += 8
      (_x.action_result.status.goal_id.stamp.secs, _x.action_result.status.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_result.status.goal_id.id = str[start:end].decode('utf-8')
      else:
        self.action_result.status.goal_id.id = str[start:end]
      start = end
      end += 1
      (self.action_result.status.status,) = _get_struct_B().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_result.status.text = str[start:end].decode('utf-8')
      else:
        self.action_result.status.text = str[start:end]
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      self.action_result.result.allPictures = []
      for i in range(0, length):
        val1 = sensor_msgs.msg.CompressedImage()
        _v7 = val1.header
        start = end
        end += 4
        (_v7.seq,) = _get_struct_I().unpack(str[start:end])
        _v8 = _v7.stamp
        _x = _v8
        start = end
        end += 8
        (_x.secs, _x.nsecs,) = _get_struct_2I().unpack(str[start:end])
        start = end
        end += 4
        (length,) = _struct_I.unpack(str[start:end])
        start = end
        end += length
        if python3:
          _v7.frame_id = str[start:end].decode('utf-8')
        else:
          _v7.frame_id = str[start:end]
        start = end
        end += 4
        (length,) = _struct_I.unpack(str[start:end])
        start = end
        end += length
        if python3:
          val1.format = str[start:end].decode('utf-8')
        else:
          val1.format = str[start:end]
        start = end
        end += 4
        (length,) = _struct_I.unpack(str[start:end])
        start = end
        end += length
        val1.data = str[start:end]
        self.action_result.result.allPictures.append(val1)
      _x = self
      start = end
      end += 12
      (_x.action_feedback.header.seq, _x.action_feedback.header.stamp.secs, _x.action_feedback.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_feedback.header.frame_id = str[start:end].decode('utf-8')
      else:
        self.action_feedback.header.frame_id = str[start:end]
      _x = self
      start = end
      end += 8
      (_x.action_feedback.status.goal_id.stamp.secs, _x.action_feedback.status.goal_id.stamp.nsecs,) = _get_struct_2I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_feedback.status.goal_id.id = str[start:end].decode('utf-8')
      else:
        self.action_feedback.status.goal_id.id = str[start:end]
      start = end
      end += 1
      (self.action_feedback.status.status,) = _get_struct_B().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_feedback.status.text = str[start:end].decode('utf-8')
      else:
        self.action_feedback.status.text = str[start:end]
      _x = self
      start = end
      end += 12
      (_x.action_feedback.feedback.lastImage.header.seq, _x.action_feedback.feedback.lastImage.header.stamp.secs, _x.action_feedback.feedback.lastImage.header.stamp.nsecs,) = _get_struct_3I().unpack(str[start:end])
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_feedback.feedback.lastImage.header.frame_id = str[start:end].decode('utf-8')
      else:
        self.action_feedback.feedback.lastImage.header.frame_id = str[start:end]
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      if python3:
        self.action_feedback.feedback.lastImage.format = str[start:end].decode('utf-8')
      else:
        self.action_feedback.feedback.lastImage.format = str[start:end]
      start = end
      end += 4
      (length,) = _struct_I.unpack(str[start:end])
      start = end
      end += length
      self.action_feedback.feedback.lastImage.data = str[start:end]
      return self
    except struct.error as e:
      raise genpy.DeserializationError(e) #most likely buffer underfill
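# A minimal round-trip sketch for the two methods above (illustrative only:
# `msg` stands for an instance of this generated action message, and BytesIO
# stands in for the buffer that rospy normally supplies):
#
#   from io import BytesIO
#   import numpy
#   buff = BytesIO()
#   msg.serialize_numpy(buff, numpy)                 # pack every field into buff
#   clone = type(msg)()
#   clone.deserialize_numpy(buff.getvalue(), numpy)  # rebuild the message from raw bytes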
_struct_I = genpy.struct_I
def _get_struct_I():
    global _struct_I
    return _struct_I
_struct_3I = None
def _get_struct_3I():
    global _struct_3I
    if _struct_3I is None:
        _struct_3I = struct.Struct("<3I")
    return _struct_3I
_struct_B = None
def _get_struct_B():
    global _struct_B
    if _struct_B is None:
        _struct_B = struct.Struct("<B")
    return _struct_B
_struct_2I = None
def _get_struct_2I():
    global _struct_2I
    if _struct_2I is None:
        _struct_2I = struct.Struct("<2I")
    return _struct_2I
_struct_i3I = None
def _get_struct_i3I():
    global _struct_i3I
    if _struct_i3I is None:
        _struct_i3I = struct.Struct("<i3I")
    return _struct_i3I
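# The _get_struct_* helpers above lazily compile each struct.Struct once and then
# reuse it across (de)serialization calls. The same caching idiom, sketched with
# an illustrative format string that is not part of this message:
#
#   _struct_2f = None
#   def _get_struct_2f():
#       global _struct_2f
#       if _struct_2f is None:
#           _struct_2f = struct.Struct("<2f")  # two little-endian 32-bit floats
#       return _struct_2f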
| 39.411192 | 214 | 0.619768 | 4,378 | 32,396 | 4.374143 | 0.079032 | 0.076867 | 0.059739 | 0.051802 | 0.78564 | 0.78564 | 0.766841 | 0.759478 | 0.739217 | 0.730548 | 0 | 0.012996 | 0.23756 | 32,396 | 821 | 215 | 39.459196 | 0.762308 | 0.047753 | 0 | 0.799197 | 1 | 0.001339 | 0.199094 | 0.043305 | 0 | 0 | 0.000326 | 0 | 0 | 1 | 0.014726 | false | 0 | 0.01071 | 0 | 0.045515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
40108b84616c2d3eddb2d22f6b6acff479c9050b | 52,671 | py | Python | third_party/gsutil/gslib/tests/test_rsync.py | ravitejavalluri/catapult | 246a39a82c2213d913a96fff020a263838dc76e6 | [
"BSD-3-Clause"
] | 8 | 2016-02-08T11:59:31.000Z | 2020-05-31T15:19:54.000Z | third_party/gsutil/gslib/tests/test_rsync.py | ravitejavalluri/catapult | 246a39a82c2213d913a96fff020a263838dc76e6 | [
"BSD-3-Clause"
] | 1 | 2021-02-23T22:20:14.000Z | 2021-02-23T22:20:14.000Z | third_party/gsutil/gslib/tests/test_rsync.py | ravitejavalluri/catapult | 246a39a82c2213d913a96fff020a263838dc76e6 | [
"BSD-3-Clause"
] | 7 | 2016-02-09T09:28:14.000Z | 2020-07-25T19:03:36.000Z | # -*- coding: utf-8 -*-
# Copyright 2014 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Integration tests for rsync command."""
import os
import crcmod
import gslib.tests.testcase as testcase
from gslib.tests.testcase.integration_testcase import SkipForS3
from gslib.tests.util import ObjectToURI as suri
from gslib.tests.util import SequentialAndParallelTransfer
from gslib.tests.util import SetBotoConfigForTest
from gslib.tests.util import unittest
from gslib.util import IS_WINDOWS
from gslib.util import Retry
from gslib.util import UsingCrcmodExtension
NO_CHANGES = 'Building synchronization state...\nStarting synchronization\n'
def _TailSet(start_point, listing):
"""Returns set of object name tails.
Tails can be compared between source and dest, past the point at which rsync
was done. For example, if the test ran rsync gs://bucket1/dir gs://bucket2/dir2,
the tails for listings from bucket1 would start after "dir", while the tails
for listings from bucket2 would start after "dir2".
Args:
start_point: The target of the rsync command, e.g., for the above command it
would be gs://bucket1/dir for the bucket1 listing results and
gs://bucket2/dir2 for the bucket2 listing results.
listing: The listing over which to compute tail.
Returns:
Object name tails.
"""
return set(l[len(start_point):] for l in listing.strip().split('\n'))
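# Illustration with hypothetical listings: after `rsync gs://bucket1/dir
# gs://bucket2/dir2`, _TailSet('gs://bucket1/dir',
# 'gs://bucket1/dir/obj1\ngs://bucket1/dir/sub/obj3\n') returns
# set(['/obj1', '/sub/obj3']), directly comparable against the tail set computed
# from the gs://bucket2/dir2 listing.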
# TODO: Add inspection to the retry wrappers in this test suite where the state
# at the end of a retry block is depended upon by subsequent tests (since
# listing content can vary depending on which backend server is reached until
# eventual consistency is reached).
# TODO: Remove retry wrappers and AssertNObjectsInBucket calls if GCS ever
# supports strong listing consistency.
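# The pattern used throughout this suite to tolerate eventually consistent
# listings looks like this (sketch only; Retry is imported from gslib.util above):
#
#   @Retry(AssertionError, tries=3, timeout_secs=1)
#   def _Check():
#     self.assertEquals(expected, actual)  # re-runs on AssertionError until tries expire
#   _Check()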
class TestRsync(testcase.GsUtilIntegrationTestCase):
"""Integration tests for rsync command."""
@staticmethod
def _FlatListDir(directory):
"""Perform a flat listing over directory.
Args:
directory: The directory to list
Returns:
Listings with path separators canonicalized to '/', to make assertions
easier for Linux vs Windows.
"""
result = []
for dirpath, _, filenames in os.walk(directory):
for f in filenames:
result.append(os.path.join(dirpath, f))
return '\n'.join(result).replace('\\', '/')
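# e.g., on Windows a hypothetical _FlatListDir(r'C:\tmp\d') returns
# 'C:/tmp/d/obj1\nC:/tmp/d/subdir/obj3', with backslashes canonicalized to '/'.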
def _FlatListBucket(self, bucket_url_string):
"""Perform a flat listing over bucket_url_string."""
return self.RunGsUtil(['ls', suri(bucket_url_string, '**')],
return_stdout=True)
def test_invalid_args(self):
"""Tests various invalid argument cases."""
bucket_uri = self.CreateBucket()
obj1 = self.CreateObject(bucket_uri=bucket_uri, object_name='obj1',
contents='obj1')
tmpdir = self.CreateTempDir()
# rsync object to bucket.
self.RunGsUtil(['rsync', suri(obj1), suri(bucket_uri)], expected_status=1)
# rsync bucket to object.
self.RunGsUtil(['rsync', suri(bucket_uri), suri(obj1)], expected_status=1)
# rsync bucket to non-existent bucket.
self.RunGsUtil(['rsync', suri(bucket_uri), self.nonexistent_bucket_name],
expected_status=1)
# rsync object to dir.
self.RunGsUtil(['rsync', suri(obj1), tmpdir], expected_status=1)
# rsync dir to object.
self.RunGsUtil(['rsync', tmpdir, suri(obj1)], expected_status=1)
# rsync dir to non-existent bucket.
self.RunGsUtil(['rsync', tmpdir, suri(obj1), self.nonexistent_bucket_name],
expected_status=1)
# Note: The tests below exercise the cases
# {src_dir, src_bucket} X {dst_dir, dst_bucket}. We use gsutil rsync -d for
# all the cases but then have just one test without -d (test_bucket_to_bucket)
# as representative of handling without the -d option. This provides
# reasonable test coverage because the -d handling is src/dest URI-type
# independent, and keeps the test case combinations more manageable.
def test_bucket_to_bucket(self):
"""Tests that flat and recursive rsync between 2 buckets works correctly."""
# Create 2 buckets with 1 overlapping object, 1 extra object at root level
# in each, and 1 extra object 1 level down in each, where one of the objects
# starts with "." to test that we don't skip those objects. Make the
# overlapping objects named the same but with different content, to test
# that we detect and properly copy in that case.
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket1_uri, object_name='.obj2',
contents='.obj2')
self.CreateObject(bucket_uri=bucket1_uri, object_name='subdir/obj3',
contents='subdir/obj3')
self.CreateObject(bucket_uri=bucket2_uri, object_name='.obj2',
contents='.OBJ2')
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj4',
contents='obj4')
self.CreateObject(bucket_uri=bucket2_uri, object_name='subdir/obj5',
contents='subdir/obj5')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Second bucket should have new objects added from source bucket (without
# removing the extraneous object found in dest bucket), and without the
# subdir objects synchronized.
self.assertEquals(listing2,
set(['/obj1', '/.obj2', '/obj4', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were correctly synchronized (bucket to bucket sync uses
# checksums).
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket1_uri, '.obj2')], return_stdout=True))
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket2_uri, '.obj2')], return_stdout=True))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', suri(bucket1_uri), suri(bucket2_uri)], return_stderr=True))
_Check2()
# Now add and remove some objects in each bucket and test rsync -r.
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj6',
contents='obj6')
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj7',
contents='obj7')
self.RunGsUtil(['rm', suri(bucket1_uri, 'obj1')])
self.RunGsUtil(['rm', suri(bucket2_uri, '.obj2')])
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
self.RunGsUtil(['rsync', '-r', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/.obj2', '/obj6', '/subdir/obj3']))
# Second bucket should have objects that were newly added to first bucket
# (without removing extraneous dest bucket objects), and without the
# subdir objects synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/obj4', '/obj6',
'/obj7', '/subdir/obj3',
'/subdir/obj5']))
_Check3()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check4():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-r', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check4()
def test_bucket_to_bucket_minus_d(self):
"""Tests that flat and recursive rsync between 2 buckets works correctly."""
# Create 2 buckets with 1 overlapping object, 1 extra object at root level
# in each, and 1 extra object 1 level down in each, where one of the objects
# starts with "." to test that we don't skip those objects. Make the
# overlapping objects named the same but with different content, to test
# that we detect and properly copy in that case.
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket1_uri, object_name='.obj2',
contents='.obj2')
self.CreateObject(bucket_uri=bucket1_uri, object_name='subdir/obj3',
contents='subdir/obj3')
self.CreateObject(bucket_uri=bucket2_uri, object_name='.obj2',
contents='.OBJ2')
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj4',
contents='obj4')
self.CreateObject(bucket_uri=bucket2_uri, object_name='subdir/obj5',
contents='subdir/obj5')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Second bucket should have content like first bucket but without the
# subdir objects synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were correctly synchronized (bucket to bucket sync uses
# checksums).
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket1_uri, '.obj2')], return_stdout=True))
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket2_uri, '.obj2')], return_stdout=True))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
# Now add and remove some objects in each bucket and test rsync -r.
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj6',
contents='obj6')
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj7',
contents='obj7')
self.RunGsUtil(['rm', suri(bucket1_uri, 'obj1')])
self.RunGsUtil(['rm', suri(bucket2_uri, '.obj2')])
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
self.RunGsUtil(['rsync', '-d', '-r',
suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/.obj2', '/obj6', '/subdir/obj3']))
# Second bucket should have content like first bucket but without the
# subdir objects synchronized.
self.assertEquals(listing2, set(['/.obj2', '/obj6', '/subdir/obj3']))
_Check3()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check4():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-r', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check4()
# Test sequential upload as well as parallel composite upload case.
@SequentialAndParallelTransfer
@unittest.skipUnless(UsingCrcmodExtension(crcmod),
'Test requires fast crcmod.')
def test_dir_to_bucket_minus_d(self):
"""Tests that flat and recursive rsync dir to bucket works correctly."""
# Create dir and bucket with 1 overlapping object, 1 extra object at root
# level in each, and 1 extra object 1 level down in each, where one of the
# objects starts with "." to test that we don't skip those objects. Make the
# overlapping objects named the same but with different content, to test
# that we detect and properly copy in that case.
tmpdir = self.CreateTempDir()
subdir = os.path.join(tmpdir, 'subdir')
os.mkdir(subdir)
bucket_uri = self.CreateBucket()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
self.CreateTempFile(tmpdir=subdir, file_name='obj3', contents='subdir/obj3')
self.CreateObject(bucket_uri=bucket_uri, object_name='.obj2',
contents='.OBJ2')
self.CreateObject(bucket_uri=bucket_uri, object_name='obj4',
contents='obj4')
self.CreateObject(bucket_uri=bucket_uri, object_name='subdir/obj5',
contents='subdir/obj5')
# Need to make sure the bucket listing is caught-up, otherwise the
# first rsync may not see .obj2 and overwrite it.
self.AssertNObjectsInBucket(bucket_uri, 3)
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-d', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Bucket should have content like dir but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were not synchronized (dir to bucket sync doesn't use checksums
# unless you specify -c).
with open(os.path.join(tmpdir, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
self.assertEquals('.OBJ2', self.RunGsUtil(
['cat', suri(bucket_uri, '.obj2')], return_stdout=True))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', tmpdir, suri(bucket_uri)], return_stderr=True))
_Check2()
# Now rerun the sync with the -c option.
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
"""Tests rsync -c works as expected."""
self.RunGsUtil(['rsync', '-d', '-c', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Bucket should have content like dir but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were synchronized (dir to bucket sync with -c uses checksums).
with open(os.path.join(tmpdir, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket_uri, '.obj2')], return_stdout=True))
_Check3()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check4():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-c', tmpdir, suri(bucket_uri)], return_stderr=True))
_Check4()
# Now add and remove some objects in dir and bucket and test rsync -r.
self.CreateTempFile(tmpdir=tmpdir, file_name='obj6', contents='obj6')
self.CreateObject(bucket_uri=bucket_uri, object_name='obj7',
contents='obj7')
os.unlink(os.path.join(tmpdir, 'obj1'))
self.RunGsUtil(['rm', suri(bucket_uri, '.obj2')])
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check5():
self.RunGsUtil(['rsync', '-d', '-r', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/.obj2', '/obj6', '/subdir/obj3']))
# Bucket should have content like dir but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/.obj2', '/obj6', '/subdir/obj3']))
_Check5()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check6():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-r', tmpdir, suri(bucket_uri)], return_stderr=True))
_Check6()
@unittest.skipUnless(UsingCrcmodExtension(crcmod),
'Test requires fast crcmod.')
def test_dir_to_dir_minus_d(self):
"""Tests that flat and recursive rsync dir to dir works correctly."""
# Create 2 dirs with 1 overlapping file, 1 extra file at root
# level in each, and 1 extra file 1 level down in each, where one of the
# objects starts with "." to test that we don't skip those objects. Make the
# overlapping files named the same but with different content, to test
# that we detect and properly copy in that case.
tmpdir1 = self.CreateTempDir()
tmpdir2 = self.CreateTempDir()
subdir1 = os.path.join(tmpdir1, 'subdir1')
subdir2 = os.path.join(tmpdir2, 'subdir2')
os.mkdir(subdir1)
os.mkdir(subdir2)
self.CreateTempFile(tmpdir=tmpdir1, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir1, file_name='.obj2', contents='.obj2')
self.CreateTempFile(
tmpdir=subdir1, file_name='obj3', contents='subdir1/obj3')
self.CreateTempFile(tmpdir=tmpdir2, file_name='.obj2', contents='.OBJ2')
self.CreateTempFile(tmpdir=tmpdir2, file_name='obj4', contents='obj4')
self.CreateTempFile(
tmpdir=subdir2, file_name='obj5', contents='subdir2/obj5')
self.RunGsUtil(['rsync', '-d', tmpdir1, tmpdir2])
listing1 = _TailSet(tmpdir1, self._FlatListDir(tmpdir1))
listing2 = _TailSet(tmpdir2, self._FlatListDir(tmpdir2))
# dir1 should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir1/obj3']))
# dir2 should have content like dir1 but without the subdir1 objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir2/obj5']))
# Assert that the src/dest objects that had same length but different
# checksums were not synchronized (dir to dir sync doesn't use checksums
# unless you specify -c).
with open(os.path.join(tmpdir1, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
with open(os.path.join(tmpdir2, '.obj2')) as f:
self.assertEquals('.OBJ2', '\n'.join(f.readlines()))
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', tmpdir1, tmpdir2], return_stderr=True))
_Check1()
# Now rerun the sync with the -c option.
self.RunGsUtil(['rsync', '-d', '-c', tmpdir1, tmpdir2])
listing1 = _TailSet(tmpdir1, self._FlatListDir(tmpdir1))
listing2 = _TailSet(tmpdir2, self._FlatListDir(tmpdir2))
# dir1 should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir1/obj3']))
# dir2 should have content like dir but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir2/obj5']))
# Assert that the src/dest objects that had same length but different
# content were synchronized (dir to dir sync with -c uses checksums).
with open(os.path.join(tmpdir1, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
with open(os.path.join(tmpdir2, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-c', tmpdir1, tmpdir2], return_stderr=True))
_Check2()
# Now add and remove some objects in both dirs and test rsync -r.
self.CreateTempFile(tmpdir=tmpdir1, file_name='obj6', contents='obj6')
self.CreateTempFile(tmpdir=tmpdir2, file_name='obj7', contents='obj7')
os.unlink(os.path.join(tmpdir1, 'obj1'))
os.unlink(os.path.join(tmpdir2, '.obj2'))
self.RunGsUtil(['rsync', '-d', '-r', tmpdir1, tmpdir2])
listing1 = _TailSet(tmpdir1, self._FlatListDir(tmpdir1))
listing2 = _TailSet(tmpdir2, self._FlatListDir(tmpdir2))
# dir1 should have un-altered content.
self.assertEquals(listing1, set(['/.obj2', '/obj6', '/subdir1/obj3']))
# dir2 should have content like dir but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/.obj2', '/obj6', '/subdir1/obj3']))
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-r', tmpdir1, tmpdir2], return_stderr=True))
_Check3()
def test_dir_to_dir_minus_d_more_files_than_bufsize(self):
"""Tests concurrently building listing from multiple tmp file ranges."""
# Create 2 dirs, where each dir has 1000 objects and differing names.
tmpdir1 = self.CreateTempDir()
tmpdir2 = self.CreateTempDir()
for i in range(0, 1000):
self.CreateTempFile(tmpdir=tmpdir1, file_name='d1-%s' %i, contents='x')
self.CreateTempFile(tmpdir=tmpdir2, file_name='d2-%s' %i, contents='y')
# We open a new temp file each time we reach rsync_buffer_lines of
# listing output. On Windows, this will result in a 'too many open file
# handles' error, so choose a larger value so as not to open so many files.
rsync_buffer_config = [('GSUtil', 'rsync_buffer_lines',
'50' if IS_WINDOWS else '2')]
# Run gsutil with config option to make buffer size << # files.
with SetBotoConfigForTest(rsync_buffer_config):
self.RunGsUtil(['rsync', '-d', tmpdir1, tmpdir2])
listing1 = _TailSet(tmpdir1, self._FlatListDir(tmpdir1))
listing2 = _TailSet(tmpdir2, self._FlatListDir(tmpdir2))
self.assertEquals(listing1, listing2)
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', tmpdir1, tmpdir2], return_stderr=True))
_Check()
@unittest.skipUnless(UsingCrcmodExtension(crcmod),
'Test requires fast crcmod.')
def test_bucket_to_dir_minus_d(self):
"""Tests that flat and recursive rsync bucket to dir works correctly."""
# Create bucket and dir with 1 overlapping object, 1 extra object at root
# level in each, and 1 extra object 1 level down in each, where one of the
# objects starts with "." to test that we don't skip those objects. Make the
# overlapping objects named the same but with different content, to test
# that we detect and properly copy in that case.
bucket_uri = self.CreateBucket()
tmpdir = self.CreateTempDir()
subdir = os.path.join(tmpdir, 'subdir')
os.mkdir(subdir)
self.CreateObject(bucket_uri=bucket_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket_uri, object_name='.obj2',
contents='.obj2')
self.CreateObject(bucket_uri=bucket_uri, object_name='subdir/obj3',
contents='subdir/obj3')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.OBJ2')
self.CreateTempFile(tmpdir=tmpdir, file_name='obj4', contents='obj4')
self.CreateTempFile(tmpdir=subdir, file_name='obj5', contents='subdir/obj5')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-d', suri(bucket_uri), tmpdir])
listing1 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
listing2 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Dir should have content like bucket but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were not synchronized (bucket to dir sync doesn't use checksums
# unless you specify -c).
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket_uri, '.obj2')], return_stdout=True))
with open(os.path.join(tmpdir, '.obj2')) as f:
self.assertEquals('.OBJ2', '\n'.join(f.readlines()))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket_uri), tmpdir], return_stderr=True))
_Check2()
# Now rerun the sync with the -c option.
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
"""Tests rsync -c works as expected."""
self.RunGsUtil(['rsync', '-d', '-c', suri(bucket_uri), tmpdir])
listing1 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
listing2 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Dir should have content like bucket but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
# Assert that the src/dest objects that had same length but different
# content were synchronized (bucket to dir sync with -c uses checksums).
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket_uri, '.obj2')], return_stdout=True))
with open(os.path.join(tmpdir, '.obj2')) as f:
self.assertEquals('.obj2', '\n'.join(f.readlines()))
_Check3()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check4():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-c', suri(bucket_uri), tmpdir], return_stderr=True))
_Check4()
# Now add and remove some objects in dir and bucket and test rsync -r.
self.CreateObject(bucket_uri=bucket_uri, object_name='obj6',
contents='obj6')
self.CreateTempFile(tmpdir=tmpdir, file_name='obj7', contents='obj7')
self.RunGsUtil(['rm', suri(bucket_uri, 'obj1')])
os.unlink(os.path.join(tmpdir, '.obj2'))
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check5():
self.RunGsUtil(['rsync', '-d', '-r', suri(bucket_uri), tmpdir])
listing1 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
listing2 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Bucket should have un-altered content.
self.assertEquals(listing1, set(['/.obj2', '/obj6', '/subdir/obj3']))
# Dir should have content like bucket but without the subdir objects
# synchronized.
self.assertEquals(listing2, set(['/.obj2', '/obj6', '/subdir/obj3']))
_Check5()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check6():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-r', suri(bucket_uri), tmpdir], return_stderr=True))
_Check6()
def test_bucket_to_dir_minus_d_with_fname_case_change(self):
"""Tests that name case changes work correctly.
Example:
Windows filenames preserve the case in which you wrote them, but compare
case-insensitively. If you synchronize from FS to cloud and then change
the case of local filenames, you can end up with this situation:
Cloud copy is called .../TiVo/...
FS copy is called .../Tivo/...
Then, if you sync from cloud to FS, if rsync doesn't recognize that on
Windows these names are identical, each rsync run will cause both a copy
and a delete to be executed.
"""
# Create bucket and dir with same objects, but dir copy has different name
# case.
bucket_uri = self.CreateBucket()
tmpdir = self.CreateTempDir()
self.CreateObject(bucket_uri=bucket_uri, object_name='obj1',
contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='Obj1', contents='obj1')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
output = self.RunGsUtil(
['rsync', '-d', '-r', suri(bucket_uri), tmpdir], return_stderr=True)
# Nothing should be copied or removed under Windows.
if IS_WINDOWS:
self.assertEquals(NO_CHANGES, output)
else:
self.assertNotEquals(NO_CHANGES, output)
_Check1()
def test_bucket_to_dir_minus_d_with_leftover_dir_placeholder(self):
"""Tests that we correctly handle leftover dir placeholders.
See comments in gslib.commands.rsync._FieldedListingIterator for details.
"""
bucket_uri = self.CreateBucket()
tmpdir = self.CreateTempDir()
self.CreateObject(bucket_uri=bucket_uri, object_name='obj1',
contents='obj1')
# Create a placeholder like what can be left over by web GUI tools.
key_uri = bucket_uri.clone_replace_name('/')
key_uri.set_contents_from_string('')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
output = self.RunGsUtil(
['rsync', '-d', '-r', suri(bucket_uri), tmpdir], return_stderr=True)
listing1 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
listing2 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '//']))
# Dir should not have the placeholder object.
self.assertEquals(listing2, set(['/obj1']))
_Check1()
@unittest.skipIf(IS_WINDOWS, 'os.symlink() is not available on Windows.')
def test_rsync_minus_d_minus_e(self):
"""Tests that rsync -e ignores symlinks."""
tmpdir = self.CreateTempDir()
subdir = os.path.join(tmpdir, 'subdir')
os.mkdir(subdir)
bucket_uri = self.CreateBucket()
fpath1 = self.CreateTempFile(
tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
self.CreateTempFile(tmpdir=subdir, file_name='obj3', contents='subdir/obj3')
good_symlink_path = os.path.join(tmpdir, 'symlink1')
os.symlink(fpath1, good_symlink_path)
# Make a symlink that points to a non-existent path to test that -e also
# handles that case.
bad_symlink_path = os.path.join(tmpdir, 'symlink2')
os.symlink(os.path.join('/', 'non-existent'), bad_symlink_path)
self.CreateObject(bucket_uri=bucket_uri, object_name='.obj2',
contents='.OBJ2')
self.CreateObject(bucket_uri=bucket_uri, object_name='obj4',
contents='obj4')
self.CreateObject(bucket_uri=bucket_uri, object_name='subdir/obj5',
contents='subdir/obj5')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Ensure listings match the commented expectations."""
self.RunGsUtil(['rsync', '-d', '-e', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(
listing1,
set(['/obj1', '/.obj2', '/subdir/obj3', '/symlink1', '/symlink2']))
# Bucket should have content like dir but without the symlink, and
# without subdir objects synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj5']))
_Check1()
# Now remove invalid symlink and run without -e, and see that symlink gets
# copied (as file to which it points). Use @Retry as hedge against bucket
# listing eventual consistency.
os.unlink(bad_symlink_path)
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-d', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(
listing1, set(['/obj1', '/.obj2', '/subdir/obj3', '/symlink1']))
# Bucket should have content like dir but without the symlink, and
# without subdir objects synchronized.
self.assertEquals(
listing2, set(['/obj1', '/.obj2', '/subdir/obj5', '/symlink1']))
self.assertEquals('obj1', self.RunGsUtil(
['cat', suri(bucket_uri, 'symlink1')], return_stdout=True))
_Check2()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check3():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', tmpdir, suri(bucket_uri)], return_stderr=True))
_Check3()
@SkipForS3('S3 does not support composite objects')
def test_bucket_to_bucket_minus_d_with_composites(self):
"""Tests that rsync works with composite objects (which don't have MD5s)."""
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket1_uri, object_name='.obj2',
contents='.obj2')
self.RunGsUtil(
['compose', suri(bucket1_uri, 'obj1'), suri(bucket1_uri, '.obj2'),
suri(bucket1_uri, 'obj3')])
self.CreateObject(bucket_uri=bucket2_uri, object_name='.obj2',
contents='.OBJ2')
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj4',
contents='obj4')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
self.RunGsUtil(['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/obj3']))
# Second bucket should have content like first bucket but without the
# subdir objects synchronized.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/obj3']))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
def test_bucket_to_bucket_minus_d_empty_dest(self):
"""Tests working with empty dest bucket (iter runs out before src iter)."""
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket1_uri, object_name='.obj2',
contents='.obj2')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
self.RunGsUtil(['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
self.assertEquals(listing1, set(['/obj1', '/.obj2']))
self.assertEquals(listing2, set(['/obj1', '/.obj2']))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
def test_bucket_to_bucket_minus_d_empty_src(self):
"""Tests working with empty src bucket (iter runs out before dst iter)."""
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket2_uri, object_name='obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket2_uri, object_name='.obj2',
contents='.obj2')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
self.RunGsUtil(['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)])
stderr = self.RunGsUtil(['ls', suri(bucket1_uri, '**')],
expected_status=1, return_stderr=True)
self.assertIn('One or more URLs matched no objects', stderr)
stderr = self.RunGsUtil(['ls', suri(bucket2_uri, '**')],
expected_status=1, return_stderr=True)
self.assertIn('One or more URLs matched no objects', stderr)
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
def test_rsync_minus_d_minus_p(self):
"""Tests that rsync -p preserves ACLs."""
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
self.CreateObject(bucket_uri=bucket1_uri, object_name='obj1',
contents='obj1')
# Set public-read (non-default) ACL so we can verify that rsync -p works.
self.RunGsUtil(['acl', 'set', 'public-read', suri(bucket1_uri, 'obj1')])
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync -p works as expected."""
self.RunGsUtil(['rsync', '-d', '-p', suri(bucket1_uri),
suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
self.assertEquals(listing1, set(['/obj1']))
self.assertEquals(listing2, set(['/obj1']))
acl1_json = self.RunGsUtil(['acl', 'get', suri(bucket1_uri, 'obj1')],
return_stdout=True)
acl2_json = self.RunGsUtil(['acl', 'get', suri(bucket2_uri, 'obj1')],
return_stdout=True)
self.assertEquals(acl1_json, acl2_json)
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-p', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
def test_rsync_to_nonexistent_bucket_subdir(self):
"""Tests that rsync to non-existent bucket subdir works."""
# Create dir with some objects and empty bucket.
tmpdir = self.CreateTempDir()
subdir = os.path.join(tmpdir, 'subdir')
os.mkdir(subdir)
bucket_url = self.CreateBucket()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
self.CreateTempFile(tmpdir=subdir, file_name='obj3', contents='subdir/obj3')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-r', tmpdir, suri(bucket_url, 'subdir')])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(
suri(bucket_url, 'subdir'),
self._FlatListBucket(bucket_url.clone_replace_name('subdir')))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/subdir/obj3']))
# Bucket subdir should have content like dir.
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/subdir/obj3']))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-r', tmpdir, suri(bucket_url, 'subdir')],
return_stderr=True))
_Check2()
def test_rsync_from_nonexistent_bucket(self):
"""Tests that rsync from a non-existent bucket subdir fails gracefully."""
tmpdir = self.CreateTempDir()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
bucket_url_str = '%s://%s' % (
self.default_provider, self.nonexistent_bucket_name)
stderr = self.RunGsUtil(['rsync', '-d', bucket_url_str, tmpdir],
expected_status=1, return_stderr=True)
self.assertIn('Caught non-retryable exception', stderr)
listing = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Dir should have un-altered content.
self.assertEquals(listing, set(['/obj1', '/.obj2']))
def test_rsync_to_nonexistent_bucket(self):
"""Tests that rsync from a non-existent bucket subdir fails gracefully."""
tmpdir = self.CreateTempDir()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
bucket_url_str = '%s://%s' % (
self.default_provider, self.nonexistent_bucket_name)
stderr = self.RunGsUtil(['rsync', '-d', tmpdir, bucket_url_str],
expected_status=1, return_stderr=True)
self.assertIn('Caught non-retryable exception', stderr)
listing = _TailSet(tmpdir, self._FlatListDir(tmpdir))
# Dir should have un-altered content.
self.assertEquals(listing, set(['/obj1', '/.obj2']))
def test_bucket_to_bucket_minus_d_with_overwrite_and_punc_chars(self):
"""Tests that punc chars in filenames don't confuse sort order."""
bucket1_uri = self.CreateBucket()
bucket2_uri = self.CreateBucket()
# Create 2 objects in each bucket, with one overwritten with a name that's
# less than the next name in destination bucket when encoded, but not when
# compared without encoding.
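# Concretely: ord('-') == 45 < ord('/') == 47, so 'e-1/.obj2' sorts before
# 'e/obj1' in raw byte order, while in URL-encoded form ('/' -> '%2F', and
# ord('%') == 37 < ord('-')) the two names sort the other way around.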
self.CreateObject(bucket_uri=bucket1_uri, object_name='e/obj1',
contents='obj1')
self.CreateObject(bucket_uri=bucket1_uri, object_name='e-1/.obj2',
contents='.obj2')
self.CreateObject(bucket_uri=bucket2_uri, object_name='e/obj1',
contents='OBJ1')
self.CreateObject(bucket_uri=bucket2_uri, object_name='e-1/.obj2',
contents='.obj2')
# Need to make sure the bucket listings are caught-up, otherwise the
# rsync may not see all objects and fail to synchronize correctly.
self.AssertNObjectsInBucket(bucket1_uri, 2)
self.AssertNObjectsInBucket(bucket2_uri, 2)
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-rd', suri(bucket1_uri), suri(bucket2_uri)])
listing1 = _TailSet(suri(bucket1_uri), self._FlatListBucket(bucket1_uri))
listing2 = _TailSet(suri(bucket2_uri), self._FlatListBucket(bucket2_uri))
# First bucket should have un-altered content.
self.assertEquals(listing1, set(['/e/obj1', '/e-1/.obj2']))
self.assertEquals(listing2, set(['/e/obj1', '/e-1/.obj2']))
# Assert correct contents.
self.assertEquals('obj1', self.RunGsUtil(
['cat', suri(bucket2_uri, 'e/obj1')], return_stdout=True))
self.assertEquals('.obj2', self.RunGsUtil(
['cat', suri(bucket2_uri, 'e-1/.obj2')], return_stdout=True))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', suri(bucket1_uri), suri(bucket2_uri)],
return_stderr=True))
_Check2()
def test_dir_to_bucket_minus_x(self):
"""Tests that rsync -x option works correctly."""
# Create dir and bucket with 1 overlapping and 2 extra objects in each.
tmpdir = self.CreateTempDir()
bucket_uri = self.CreateBucket()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
self.CreateTempFile(tmpdir=tmpdir, file_name='.obj2', contents='.obj2')
self.CreateTempFile(tmpdir=tmpdir, file_name='obj3', contents='obj3')
self.CreateObject(bucket_uri=bucket_uri, object_name='.obj2',
contents='.obj2')
self.CreateObject(bucket_uri=bucket_uri, object_name='obj4',
contents='obj4')
self.CreateObject(bucket_uri=bucket_uri, object_name='obj5',
contents='obj5')
# Need to make sure the bucket listing is caught-up, otherwise the
# first rsync may not see .obj2 and overwrite it.
self.AssertNObjectsInBucket(bucket_uri, 3)
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check1():
"""Tests rsync works as expected."""
self.RunGsUtil(['rsync', '-d', '-x', 'obj[34]', tmpdir, suri(bucket_uri)])
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/.obj2', '/obj3']))
# Bucket should have content like dir but ignoring obj3 from dir and not
# deleting obj4 from bucket (per exclude regex).
self.assertEquals(listing2, set(['/obj1', '/.obj2', '/obj4']))
_Check1()
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check2():
# Check that re-running the same rsync command causes no more changes.
self.assertEquals(NO_CHANGES, self.RunGsUtil(
['rsync', '-d', '-x', 'obj[34]', tmpdir, suri(bucket_uri)],
return_stderr=True))
_Check2()
@unittest.skipIf(IS_WINDOWS,
"os.chmod() won't make file unreadable on Windows.")
def test_dir_to_bucket_minus_C(self):
"""Tests that rsync -C option works correctly."""
# Create dir with 3 objects, the middle of which is unreadable.
tmpdir = self.CreateTempDir()
bucket_uri = self.CreateBucket()
self.CreateTempFile(tmpdir=tmpdir, file_name='obj1', contents='obj1')
path = self.CreateTempFile(tmpdir=tmpdir, file_name='obj2', contents='obj2')
os.chmod(path, 0)
self.CreateTempFile(tmpdir=tmpdir, file_name='obj3', contents='obj3')
# Use @Retry as hedge against bucket listing eventual consistency.
@Retry(AssertionError, tries=3, timeout_secs=1)
def _Check():
"""Tests rsync works as expected."""
stderr = self.RunGsUtil(['rsync', '-C', tmpdir, suri(bucket_uri)],
expected_status=1, return_stderr=True)
self.assertIn('1 files/objects could not be copied/removed.', stderr)
listing1 = _TailSet(tmpdir, self._FlatListDir(tmpdir))
listing2 = _TailSet(suri(bucket_uri), self._FlatListBucket(bucket_uri))
# Dir should have un-altered content.
self.assertEquals(listing1, set(['/obj1', '/obj2', '/obj3']))
# Bucket should have obj1 and obj3 even though obj2 was unreadable.
self.assertEquals(listing2, set(['/obj1', '/obj3']))
_Check()
| 49.133396 | 80 | 0.674527 | 6,638 | 52,671 | 5.224767 | 0.077885 | 0.032697 | 0.029064 | 0.033158 | 0.822242 | 0.794879 | 0.767141 | 0.748198 | 0.728707 | 0.710311 | 0 | 0.022394 | 0.203053 | 52,671 | 1,071 | 81 | 49.179272 | 0.80384 | 0.316911 | 0 | 0.717949 | 0 | 0 | 0.091178 | 0 | 0 | 0 | 0 | 0.000934 | 0.220211 | 1 | 0.101056 | false | 0 | 0.016591 | 0 | 0.12368 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
40134ec668090c21463d4b604554a0719f21830d | 2,387 | py | Python | tests/preproc_data_test.py | rutugandhi/Neuron-Finder | 76d771bb37b7c73f884dc4a018fa19090ec904d6 | [
"MIT"
] | null | null | null | tests/preproc_data_test.py | rutugandhi/Neuron-Finder | 76d771bb37b7c73f884dc4a018fa19090ec904d6 | [
"MIT"
] | null | null | null | tests/preproc_data_test.py | rutugandhi/Neuron-Finder | 76d771bb37b7c73f884dc4a018fa19090ec904d6 | [
"MIT"
] | null | null | null | import src.utils.preproc_data
pd = Preprocessing()
def test_transform_img():
#Reassuring that the transformations didnt change the type and size
folders = ['train','test']
for folder in folders:
#grabbing an origional img for testing
img_dir = os.path.join(pd.data, folder)
sample = os.listdir(img_dir)
sample_path = os.path.join(img_dir,sample[1])
images = os.listdir(sample_path)
img_int = np.random.randint(len(images))
img_path = os.path.join(sample_path,images[img_int])
img = external.tifffile.imread(img_path)
#grabbing a proccessed img for testing
proc_img_dir = os.path.join(pd.data,pd.data_storage,folder)
proc_sample_path = os.path.join(proc_img_dir,sample[1])
proc_images = os.listdir(proc_sample_path)
proc_img_path = os.path.join(proc_img_dir,proc_images[img_int])
proc_img = external.tifffile.imread(proc_img_path)
#checking the size of the folder is the same
assert len(images) == len(proc_images)
#chekcing the type of the img chosen
assert size(img) == size(proc_img)
#checking the type of the images
assert instance(img,proc_img)
def test_filter_img():
#Reassuring that the filtering returned the correct image
folders = ['train','test']
for folder in folders:
#grabbing an origional img for testing
img_dir = os.path.join(pd.data, folder)
sample = os.listdir(img_dir)
sample_path = os.path.join(img_dir,sample[1])
images = os.listdir(sample_path)
img_int = np.random.randint(len(images))
img_path = os.path.join(sample_path,images[img_int])
img = external.tifffile.imread(img_path)
#grabbing a proccessed img for testing
proc_img_dir = os.path.join(pd.data,pd.data_storage,folder)
proc_sample_path = os.path.join(proc_img_dir,sample[1])
proc_images = os.listdir(proc_sample_path)
proc_img_path = os.path.join(proc_img_dir,proc_images[img_int])
proc_img = external.tifffile.imread(proc_img_path)
#checking the size of the folder is the same
assert len(images) == len(proc_images)
#chekcing the type of the img chosen
assert size(img) == size(proc_img)
#checking the type of the images
assert instance(img,proc_img)
| 39.783333 | 71 | 0.672811 | 351 | 2,387 | 4.378917 | 0.176638 | 0.072869 | 0.078074 | 0.072869 | 0.882238 | 0.882238 | 0.882238 | 0.882238 | 0.882238 | 0.882238 | 0 | 0.002185 | 0.232928 | 2,387 | 59 | 72 | 40.457627 | 0.837247 | 0.204441 | 0 | 0.894737 | 0 | 0 | 0.009539 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 1 | 0.052632 | false | 0 | 0.026316 | 0 | 0.078947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
40341e52ca1ab35a3a5a3787d8cd0595b7b8b6ec | 295 | py | Python | modules/tests/staff/__init__.py | andygimma/eden | 716d5e11ec0030493b582fa67d6f1c35de0af50d | [
"MIT"
] | 1 | 2019-08-20T16:32:33.000Z | 2019-08-20T16:32:33.000Z | modules/tests/staff/__init__.py | andygimma/eden | 716d5e11ec0030493b582fa67d6f1c35de0af50d | [
"MIT"
] | null | null | null | modules/tests/staff/__init__.py | andygimma/eden | 716d5e11ec0030493b582fa67d6f1c35de0af50d | [
"MIT"
] | null | null | null | from staff import *
from search_staff import *
from create_staff_job_role import *
from create_staff_certificate import *
from create_staff_training import *
from add_staff_to_organisation import *
from add_staff_to_office import *
from add_staff_to_warehouse import *
from create_staff import * | 32.777778 | 39 | 0.850847 | 45 | 295 | 5.177778 | 0.311111 | 0.343348 | 0.274678 | 0.360515 | 0.257511 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118644 | 295 | 9 | 40 | 32.777778 | 0.896154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
4061aabac6e28659ebe7f72eb2b9838d74488e26 | 7,619 | py | Python | hatchet/tests/dataframe_ops.py | TauferLab/llnl-hatchet | c7d12888d71d2b23058facd3025e7dcfa12cbb39 | [
"MIT"
] | null | null | null | hatchet/tests/dataframe_ops.py | TauferLab/llnl-hatchet | c7d12888d71d2b23058facd3025e7dcfa12cbb39 | [
"MIT"
] | null | null | null | hatchet/tests/dataframe_ops.py | TauferLab/llnl-hatchet | c7d12888d71d2b23058facd3025e7dcfa12cbb39 | [
"MIT"
] | null | null | null | # Copyright 2017-2022 Lawrence Livermore National Security, LLC and other
# Hatchet Project Developers. See the top-level LICENSE file for details.
#
# SPDX-License-Identifier: MIT

from __future__ import division

from hatchet import GraphFrame


def test_filter(mock_graph_literal):
    """Test the filter operation with a foo-bar tree."""
    gf = GraphFrame.from_literal(mock_graph_literal)

    filtered_gf = gf.filter(lambda x: x["time"] > 5.0, squash=False)
    assert len(filtered_gf.dataframe) == 9
    assert all(time > 5.0 for time in filtered_gf.dataframe["time"])

    filtered_gf = gf.filter(lambda x: x["name"].startswith("g"), squash=False)
    assert len(filtered_gf.dataframe) == 7
    assert all(name.startswith("g") for name in filtered_gf.dataframe["name"])


def test_add(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1.add(gf2)
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    assert gf3.dataframe["time"].sum() == 330
    assert gf3.dataframe["time (inc)"].sum() == 1320

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3.add(gf4)
    assert gf5.graph == gf3.graph == gf4.graph
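

# Sanity arithmetic (editor's note, inferred from the sums asserted above and
# not part of the original file): adding two copies of the mock literal doubles
# every metric, so the mock data itself must satisfy
#
#     gf1.dataframe["time"].sum() == 165        # 330 / 2
#     gf1.dataframe["time (inc)"].sum() == 660  # 1320 / 2
#
# which is also why gf1 + gf2 + gf1 later sums to 495 == 3 * 165 and
# gf1 - gf2 - gf1 sums to -165.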
def test_sub(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1.sub(gf2)
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    for metric in gf3.exc_metrics + gf3.inc_metrics:
        assert gf3.dataframe[metric].sum() == 0

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3.sub(gf4)
    assert gf5.graph == gf3.graph == gf4.graph


def test_div(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1.div(gf2)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert gf3.dataframe["time"].sum() == 21
    assert gf3.dataframe["time (inc)"].sum() == 24

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3.div(gf4)
    assert gf5.graph == gf3.graph == gf4.graph


def test_mul(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1.mul(gf2)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert gf3.dataframe["time"].sum() == 1575
    assert gf3.dataframe["time (inc)"].sum() == 37900


def test_add_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1 + gf2
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    assert gf3.dataframe["time"].sum() == 330
    assert gf3.dataframe["time (inc)"].sum() == 1320

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3 + gf4
    assert gf5.graph == gf3.graph == gf4.graph

    gf6 = gf1 + gf2 + gf1
    assert gf6.dataframe["time"].sum() == 495

    gf7 = gf1 + gf2
    gf8 = gf7 + gf1
    assert gf8.graph == gf6.graph
    assert gf8.dataframe["time"].sum() == gf6.dataframe["time"].sum()


def test_sub_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1 - gf2
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    for metric in gf3.exc_metrics + gf3.inc_metrics:
        assert gf3.dataframe[metric].sum() == 0

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3.sub(gf4)
    assert gf5.graph == gf3.graph == gf4.graph

    gf6 = gf1 - gf2 - gf1
    assert gf6.dataframe["time"].sum() == -165

    gf7 = gf1 - gf2
    gf8 = gf7 - gf1
    assert gf8.graph == gf6.graph
    assert gf8.dataframe["time"].sum() == gf6.dataframe["time"].sum()


def test_div_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf3 = gf1 / gf2
    assert gf3.graph == gf1.graph.union(gf2.graph)
    assert len(gf3.graph) == gf3.dataframe.shape[0]
    assert gf3.dataframe["time"].sum() == 21
    assert gf3.dataframe["time (inc)"].sum() == 24

    gf4 = gf3.copy()
    assert gf4.graph is gf3.graph

    gf5 = gf3 / gf4 / gf3
    assert gf5.graph == gf3.graph == gf4.graph
    assert gf5.dataframe["time (inc)"].sum() == 24

    gf6 = gf3 / gf4
    gf7 = gf6 / gf3
    assert gf7.graph == gf5.graph
    assert gf7.dataframe["time"].sum() == gf5.dataframe["time"].sum()


def test_mul_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    gf3 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph is not gf3.graph

    gf4 = gf1 * gf2 * gf3
    assert gf4.graph == gf1.graph.union(gf2.graph.union(gf3.graph))
    assert len(gf4.graph) == gf4.dataframe.shape[0]
    assert gf4.dataframe["time"].sum() == 17625
    assert gf4.dataframe["time (inc)"].sum() == 3397500


def test_iadd_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf1 += gf2
    assert gf1.graph == gf1.graph.union(gf2.graph)
    assert len(gf1.graph) == gf1.dataframe.shape[0]
    assert gf1.dataframe["time"].sum() == 330
    assert gf1.dataframe["time (inc)"].sum() == 1320

    gf3 = gf1.copy()
    assert gf3.graph is gf1.graph

    gf3 += gf1 + gf2 + gf2
    assert gf3.graph == gf1.graph
    assert gf3.dataframe["time"].sum() == 990


def test_isub_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf1 -= gf2
    assert gf1.graph == gf1.graph.union(gf2.graph)
    assert len(gf1.graph) == gf1.dataframe.shape[0]
    for metric in gf1.exc_metrics + gf1.inc_metrics:
        assert gf1.dataframe[metric].sum() == 0

    gf3 = gf1.copy()
    assert gf3.graph is gf1.graph

    gf3 -= gf1
    assert gf3.graph == gf1.graph


def test_idiv_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf1 /= gf2
    assert gf1.graph == gf1.graph.union(gf2.graph)
    assert len(gf1.graph) == gf1.dataframe.shape[0]
    assert gf1.dataframe["time"].sum() == 21
    assert gf1.dataframe["time (inc)"].sum() == 24

    gf3 = gf1.copy()
    assert gf3.graph is gf1.graph

    gf3 /= gf1
    assert gf3.graph == gf1.graph


def test_imul_operator(mock_graph_literal):
    gf1 = GraphFrame.from_literal(mock_graph_literal)
    gf2 = GraphFrame.from_literal(mock_graph_literal)
    assert gf1.graph is not gf2.graph

    gf1 *= gf2
    assert gf1.graph == gf1.graph.union(gf2.graph)
    assert len(gf1.graph) == gf1.dataframe.shape[0]
    assert gf1.dataframe["time"].sum() == 1575
    assert gf1.dataframe["time (inc)"].sum() == 37900
| 26.733333 | 78 | 0.675679 | 1,114 | 7,619 | 4.494614 | 0.087971 | 0.070102 | 0.124626 | 0.129818 | 0.846016 | 0.831436 | 0.784302 | 0.752347 | 0.752347 | 0.752347 | 0 | 0.065689 | 0.194776 | 7,619 | 284 | 79 | 26.827465 | 0.750448 | 0.028875 | 0 | 0.589595 | 0 | 0 | 0.02571 | 0 | 0 | 0 | 0 | 0 | 0.514451 | 1 | 0.075145 | false | 0 | 0.011561 | 0 | 0.086705 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
40b51fd92d98392c41f61bfcd47df479c0f2f428 | 2,066 | py | Python | bspump/declarative/expression/comparison.py | chinese-soup/BitSwanPump | 6ef71577cc1f166cff80876d28be37c791061bd2 | [
"BSD-3-Clause"
] | 1 | 2020-08-20T12:56:58.000Z | 2020-08-20T12:56:58.000Z | bspump/declarative/expression/comparison.py | chinese-soup/BitSwanPump | 6ef71577cc1f166cff80876d28be37c791061bd2 | [
"BSD-3-Clause"
] | null | null | null | bspump/declarative/expression/comparison.py | chinese-soup/BitSwanPump | 6ef71577cc1f166cff80876d28be37c791061bd2 | [
"BSD-3-Clause"
] | null | null | null | import operator

from ..abc import SequenceExpression, evaluate


def _and_reduce(operator, iterable):
    it = iter(iterable)
    a = next(it)
    for b in it:
        if not operator(a, b):
            return False
        a = b
    return True
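

# Illustrative usage (editor's sketch, not part of the original module):
# _and_reduce applies the operator pairwise along the sequence, mirroring
# Python's chained comparisons such as ``a < b < c``:
#
#     _and_reduce(operator.lt, [1, 2, 3])  # True:  1 < 2 and 2 < 3
#     _and_reduce(operator.lt, [1, 3, 2])  # False: 3 < 2 fails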
class LT(SequenceExpression):
    '''
    Operator '<'
    '''

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.lt,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class LE(SequenceExpression):
    '''
    Operator '<='
    '''

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.le,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class EQ(SequenceExpression):
    '''
    Operator '=='
    '''

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.eq,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class NE(SequenceExpression):
    '''
    Operator '!='
    '''

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.ne,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class GE(SequenceExpression):
    """
    Operator '>='
    """

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.ge,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class GT(SequenceExpression):
    """
    Operator '>'
    """

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.gt,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class IS(SequenceExpression):
    """
    Operator 'is'
    """

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.is_,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )


class ISNOT(SequenceExpression):
    """
    Operator 'is not'
    """

    def __call__(self, context, event, *args, **kwargs):
        return _and_reduce(
            operator.is_not,
            [evaluate(item, context, event, *args, **kwargs) for item in self.Items]
        )
| 18.446429 | 75 | 0.656825 | 253 | 2,066 | 5.158103 | 0.15415 | 0.147126 | 0.196169 | 0.269732 | 0.811494 | 0.811494 | 0.811494 | 0.811494 | 0.811494 | 0.811494 | 0 | 0 | 0.185866 | 2,066 | 111 | 76 | 18.612613 | 0.775862 | 0.054695 | 0 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.155172 | false | 0 | 0.034483 | 0.137931 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 |
40ca9349a3d982484b95807ba566de86df1db66b | 67 | py | Python | DatabaseControlWrapper_JE/venv/Lib/site-packages/je_database/__init__.py | JE-Chen/je_old_repo | a8b2f1ac2eec25758bd15b71c64b59b27e0bcda5 | [
"MIT"
] | 5 | 2020-10-12T09:41:33.000Z | 2020-12-30T07:27:56.000Z | DatabaseControlWrapper_JE/venv/Lib/site-packages/je_database/__init__.py | JE-Chen/je_old_repo | a8b2f1ac2eec25758bd15b71c64b59b27e0bcda5 | [
"MIT"
] | null | null | null | DatabaseControlWrapper_JE/venv/Lib/site-packages/je_database/__init__.py | JE-Chen/je_old_repo | a8b2f1ac2eec25758bd15b71c64b59b27e0bcda5 | [
"MIT"
] | null | null | null | from je_database.core import *
from je_database.modules import *
| 22.333333 | 34 | 0.791045 | 10 | 67 | 5.1 | 0.6 | 0.235294 | 0.54902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149254 | 67 | 2 | 35 | 33.5 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
dc06d818293d1468889631ec97ff816af605caf0 | 16,540 | py | Python | EtaPyScripts/getHistory-heuristics-tests.py | kd2eom/shuttletracker | d900055a78fd798f5375bed428cdd68843f5d5c7 | [
"MIT"
] | null | null | null | EtaPyScripts/getHistory-heuristics-tests.py | kd2eom/shuttletracker | d900055a78fd798f5375bed428cdd68843f5d5c7 | [
"MIT"
] | null | null | null | EtaPyScripts/getHistory-heuristics-tests.py | kd2eom/shuttletracker | d900055a78fd798f5375bed428cdd68843f5d5c7 | [
"MIT"
] | null | null | null | import urllib.request
import datetime as dt
import json

MAX_TIME_DIFFERENCE_MIN = 10


# function to open the JSON history file
def response(url):
    return urllib.request.urlopen(url)


# function to load the history data from the JSON
def loadJSON(response):
    return json.loads(response.read())


def time_in_range(start, end, x):
    """Return true if x is in the range [start, end]"""
    if start <= end:
        return start <= x <= end
    else:
        return start <= x or x <= end
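

# Illustrative checks (editor's sketch, not in the original script): the
# start > end branch is what lets a window wrap past midnight, e.g. for a
# 23:50-00:10 window:
#
#     time_in_range(dt.time(23, 50), dt.time(0, 10), dt.time(23, 59))  # True
#     time_in_range(dt.time(23, 50), dt.time(0, 10), dt.time(12, 0))   # False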
def getAvgVelocity(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 10 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (10 minutes before current time) and end time (10 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and dataWeekday == weekday and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count
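

# Caveat (editor's note, not in the original script): if no history entry
# matches the route/day/time filters, `count` stays 0 and the division above
# raises ZeroDivisionError; a production version would want a guard such as
#
#     return totalVelocity / count if count else 0.0
#
# The same caveat applies to every variant below.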
def getAvgVelocity2(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 10 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (10 minutes before current time) and end time (10 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if dataWeekday == weekday and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


def getAvgVelocity3(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 15 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (15 minutes before current time) and end time (15 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=15)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=15)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and dataWeekday == weekday and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


def getAvgVelocity4(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 20 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (20 minutes before current time) and end time (20 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=20)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=20)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and dataWeekday == weekday and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


def getAvgVelocity5(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 30 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (30 minutes before current time) and end time (30 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=30)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=30)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and dataWeekday == weekday and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


def getAvgVelocity6(data, route_id, current_time, weekday):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 10 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (10 minutes before current time) and end time (10 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and inTimeRange:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


def getAvgVelocity8(data, route_id, current_time, weekday, vehicle_id):
    totalVelocity = 0
    count = 0
    for i in data:
        # each entry in i is a data entry about a shuttle, loop through all of these.
        for j in i:
            # extract relevant data from this entry
            dataArrayTime = j["time"].split(":")
            dataHour = int(dataArrayTime[0].split("T")[1])
            dataMin = int(dataArrayTime[1])
            dataTime = dt.time(dataHour, dataMin, 0)
            dataArrayDay = dataArrayTime[0].split('-')
            dataYear = int(dataArrayDay[0])
            dataMonth = int(dataArrayDay[1])
            dataDay = int(dataArrayDay[2].split('T')[0])
            # get current weekday to prepare for ETA calculation
            day = dt.date(dataYear, dataMonth, dataDay)
            dataWeekday = day.weekday()
            # Only data from within 10 minutes of the current time should be considered in the ETA calculation.
            # Calculate start time (10 minutes before current time) and end time (10 minutes after current time).
            start = dt.time(current_time.hour, current_time.minute, current_time.second)
            tmp_startDate = dt.datetime.combine(dt.date(1, 1, 1), start)
            start = tmp_startDate - dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            start = start.time()
            end = tmp_startDate + dt.timedelta(minutes=MAX_TIME_DIFFERENCE_MIN)
            end = end.time()
            # Determine whether the data we are looking at is within this range
            inTimeRange = time_in_range(start, end, dataTime)
            if j["route_id"] == route_id and dataWeekday == weekday and inTimeRange and j["vehicle_id"] == vehicle_id:
                totalVelocity += j["speed"]
                count += 1
            else:
                continue
    # perform final calculation for ETA algorithm and return result.
    return totalVelocity / count


if __name__ == '__main__':
    # Get what day of the week it is today
    targetWeekday = dt.datetime.today().weekday()
    targetWeekday = 2  # manually hard-code the day we want

    # Get what the current time is now
    targetTime = dt.datetime.now().time()
    targetTime = dt.time(22, 45, 50)  # manually hard-code the time we want

    # Specify which route you want to calculate the average velocity for
    targetRoute = 1
    # Specify the shuttle's vehicle_id number
    shuttleID = 9

    # URL of the JSON file that contains the history of the shuttles.
    url = "https://shuttles.rpi.edu/history"

    # open and load the JSON
    response = response(url)
    data = loadJSON(response)

    # Run 7 different versions of the ETA algorithm (each version uses a different heuristic for calculation) and
    # output the results we get under each version.
    # Each version will use the exact same data; thus, any differences in output are solely due to differences
    # in the algorithm.
    print("Version 1 (default): ")
    print(getAvgVelocity(data, 1, targetTime, targetWeekday))
    print("Version 2: Same as 1, but shuttle doesn't have to be on route_id (can be on any route).")
    print(getAvgVelocity2(data, 1, targetTime, targetWeekday))
    print("Version 3: Same as 1, but change time window to 15 minutes instead of 10.")
    print(getAvgVelocity3(data, 1, targetTime, targetWeekday))
    print("Version 4: Same as 1, but change time window to 20 minutes instead of 10")
    print(getAvgVelocity4(data, 1, targetTime, targetWeekday))
    print("Version 5: Same as 1, but change time window to 30 minutes instead of 10")
    print(getAvgVelocity5(data, 1, targetTime, targetWeekday))
    print("Version 6: Same as 1, but look at ANY DAY OF THE WEEK.")
    print(getAvgVelocity6(data, 1, targetTime, targetWeekday))
    print("Version 7: Same as 1, but shuttle must HAVE THE SAME SHUTTLE ID AS THE SHUTTLE WE ARE TARGETING.")
    print(getAvgVelocity8(data, 1, targetTime, targetWeekday, shuttleID))

'''
RESULTS (as of 2/16/19 at 4:08 PM):

Version 1 (default):
12.233058956086635
Version 2: Same as 1, but shuttle doesn't have to be on route_id (can be on any route).
13.024325100523708
Version 3: Same as 1, but change time window to 15 minutes instead of 10.
11.852338474319906
Version 4: Same as 1, but change time window to 20 minutes instead of 10
12.49505608883642
Version 5: Same as 1, but change time window to 30 minutes instead of 10
12.113736460390298
Version 6: Same as 1, but look at ANY DAY OF THE WEEK.
12.554997379680035
Version 7: Same as 1, but shuttle must HAVE THE SAME SHUTTLE ID AS THE SHUTTLE WE ARE TARGETING.
12.183459281921387
'''
| 41.35 | 118 | 0.620738 | 2,090 | 16,540 | 4.852632 | 0.099043 | 0.05423 | 0.028988 | 0.031749 | 0.854368 | 0.850621 | 0.827056 | 0.827056 | 0.827056 | 0.813153 | 0 | 0.031349 | 0.294135 | 16,540 | 399 | 119 | 41.453634 | 0.837345 | 0.253386 | 0 | 0.776824 | 0 | 0.008584 | 0.057212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042918 | false | 0 | 0.012876 | 0.008584 | 0.103004 | 0.060086 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
90cd2d7b82c3be3d97439c50afda4f7990b6b9b1 | 12,032 | py | Python | tests/unit_test/test_assert_pyspark_df_equal.py | debugger24/pyspark-test | df1594bdfd3af560525b33e331be3636ab2fd4e1 | [
"Apache-2.0"
] | 4 | 2020-12-18T21:51:22.000Z | 2021-12-10T04:17:58.000Z | tests/unit_test/test_assert_pyspark_df_equal.py | debugger24/pyspark-df-assert | df1594bdfd3af560525b33e331be3636ab2fd4e1 | [
"Apache-2.0"
] | null | null | null | tests/unit_test/test_assert_pyspark_df_equal.py | debugger24/pyspark-df-assert | df1594bdfd3af560525b33e331be3636ab2fd4e1 | [
"Apache-2.0"
] | 2 | 2021-11-22T07:52:34.000Z | 2022-02-11T21:22:16.000Z | import datetime
import pyspark
import pytest
from pyspark.sql.types import (
DateType,
DoubleType,
LongType,
StringType,
StructField,
StructType,
)
from src.pyspark_test import assert_pyspark_df_equal
class TestAssertPysparkDfEqual:
def test_assert_pyspark_df_equal_success(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_one_is_not_pyspark_df(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = "Demo"
with pytest.raises(
AssertionError,
match="Right expected type <class 'pyspark.sql.dataframe.DataFrame'>, found <class 'str'> instead",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_string_value(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo1", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Data mismatch\n \n Row = 1 : Column = col_b\n \n ACTUAL: demo\n EXPECTED: demo1",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_string_value_where_one_of_the_value_is_Null(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), None, 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Data mismatch\n \n Row = 1 : Column = col_b\n \n ACTUAL: demo\n EXPECTED: None",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_date_value(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 3), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Data mismatch\n \n Row = 1 : Column = col_a\n \n ACTUAL: 2020-01-01\n EXPECTED: 2020-01-03",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_long_value(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 20],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Data mismatch\n \n Row = 1 : Column = col_d\n \n ACTUAL: 10\n EXPECTED: 20",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_double_value(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.1236, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Data mismatch\n \n Row = 1 : Column = col_c\n \n ACTUAL: 1.123\n EXPECTED: 1.1236",
):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_columns(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[[datetime.datetime(2020, 1, 1), "demo", 10], [None, None, None],],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(AssertionError, match="df schema type mismatch"):
assert_pyspark_df_equal(left_df, right_df)
def test_assert_pyspark_df_equal_different_row_count(
self, spark_session: pyspark.sql.SparkSession
):
left_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
right_df = spark_session.createDataFrame(
data=[
[datetime.date(2020, 1, 1), "demo", 1.123, 10],
[None, None, None, None],
[None, None, None, None],
],
schema=StructType(
[
StructField("col_a", DateType(), True),
StructField("col_b", StringType(), True),
StructField("col_c", DoubleType(), True),
StructField("col_d", LongType(), True),
]
),
)
with pytest.raises(
AssertionError,
match="Number of rows are not same.\n \n Actual Rows: 2\n Expected Rows: 3",
):
assert_pyspark_df_equal(left_df, right_df)
| 36.795107 | 116 | 0.476563 | 1,094 | 12,032 | 5.027422 | 0.076782 | 0.170545 | 0.163636 | 0.058182 | 0.901455 | 0.895091 | 0.890182 | 0.890182 | 0.884182 | 0.884182 | 0 | 0.033352 | 0.404422 | 12,032 | 326 | 117 | 36.907975 | 0.734161 | 0 | 0 | 0.719745 | 0 | 0.019108 | 0.084275 | 0.002909 | 0 | 0 | 0 | 0 | 0.089172 | 1 | 0.028662 | false | 0 | 0.015924 | 0 | 0.047771 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
90e4da9bc2792507142e7785f435e0c6a7d5aef2 | 794 | py | Python | geomet/tests/test_cli.py | tomplex/geomet | f57a2302d738ef8af694c8dde09e95d419457d9e | [
"Apache-2.0"
] | 121 | 2015-01-21T23:47:00.000Z | 2022-03-25T00:18:50.000Z | geomet/tests/test_cli.py | achapkowski/geomet | e0408ef6e5860815be995140c019217b5097edef | [
"Apache-2.0"
] | 47 | 2015-06-22T16:57:22.000Z | 2022-01-27T18:30:08.000Z | geomet/tests/test_cli.py | achapkowski/geomet | e0408ef6e5860815be995140c019217b5097edef | [
"Apache-2.0"
] | 27 | 2015-06-17T15:27:04.000Z | 2022-01-25T23:38:49.000Z | import subprocess


def test_arg():
    result = subprocess.check_output(
        'geomet "POINT (0.99999 0.999999)"',
        shell=True)
    expected = '{"coordinates": [0.99999, 0.999999], "type": "Point"}'
    assert result.decode('utf-8').strip() == expected


def test_stdin_implicit():
    result = subprocess.check_output(
        'echo "POINT (0.99999 0.999999)" | geomet',
        shell=True)
    expected = '{"coordinates": [0.99999, 0.999999], "type": "Point"}'
    assert result.decode('utf-8').strip() == expected


def test_stdin_explicit():
    result = subprocess.check_output(
        'echo "POINT (0.99999 0.999999)" | geomet -',
        shell=True)
    expected = '{"coordinates": [0.99999, 0.999999], "type": "Point"}'
    assert result.decode('utf-8').strip() == expected
| 30.538462 | 70 | 0.617128 | 96 | 794 | 5.020833 | 0.28125 | 0.074689 | 0.087137 | 0.161826 | 0.844398 | 0.807054 | 0.807054 | 0.807054 | 0.807054 | 0.807054 | 0 | 0.12776 | 0.201511 | 794 | 25 | 71 | 31.76 | 0.632492 | 0 | 0 | 0.631579 | 0 | 0 | 0.36398 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 1 | 0.157895 | false | 0 | 0.052632 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
46408eed777b7d13918c2a07ee075cee06479365 | 2,636 | py | Python | blog/models.py | skyrred/django_local_library_dev | 4cd1b6591adae931d9b20f02b3c6d1ac3c92e5c2 | [
"Apache-2.0"
] | null | null | null | blog/models.py | skyrred/django_local_library_dev | 4cd1b6591adae931d9b20f02b3c6d1ac3c92e5c2 | [
"Apache-2.0"
] | null | null | null | blog/models.py | skyrred/django_local_library_dev | 4cd1b6591adae931d9b20f02b3c6d1ac3c92e5c2 | [
"Apache-2.0"
] | null | null | null | from django.db import models
from django.core.urlresolvers import reverse
from django.db.models import permalink


class category(models.Model):
    name = models.CharField(max_length=255)
    slug = models.SlugField(unique=True, max_length=255)

    def __str__(self):
        return "%s" % self.name

    @permalink
    def get_absolute_url(self):
        return ("category_view_blog", None, {'slug': self.slug})


class Post(models.Model):
    category = models.ForeignKey(category, on_delete=models.CASCADE, null=True)
    title = models.CharField(max_length=255)
    slug = models.SlugField(unique=True, max_length=255)
    url = models.CharField(max_length=500, default=False)
    description = models.CharField(max_length=200)
    content = models.TextField()
    published = models.BooleanField(default=True)
    created = models.DateTimeField(auto_now_add=True)

    def __unicode__(self):
        return u'%s' % self.title

    # def get_absolute_url(self):
    #     return reverse('blog_pst', args=[self.slug])

    @permalink
    def get_absolute_url(self):
        return ("blog_post", None, {'slug': self.slug})

    # def get_url(self):
    #     return reverse("blog.views.post", args=self.slug)

    # def get_absolute_url(self):
    #     global slg
    #     slg = self.slug
    #     return reverse('blog.views.post', args=slg)

    class Meta:
        ordering = ['-created']


class Post2(models.Model):
    category = models.ForeignKey(category, on_delete=models.CASCADE, null=True)
    title = models.CharField(max_length=255)
    slug = models.SlugField(unique=True, max_length=255)
    url = models.CharField(max_length=500)
    description = models.CharField(max_length=200)
    content = models.TextField()
    published = models.BooleanField(default=True)
    created = models.DateTimeField(auto_now_add=True)

    def __unicode__(self):
        return u'%s' % self.title

    @permalink
    def get_absolute_url(self):
        return ("blog_view_post", None, {'slug': self.slug})

    class Meta:
        ordering = ['-created']


class Sub(models.Model):
    name = models.CharField(max_length=255)
    email = models.CharField(max_length=255)

    def __str__(self):
        return '%s' % self.name


class comment1(models.Model):
    post = models.ForeignKey(Post, on_delete=models.CASCADE, null=True)
    name = models.CharField(max_length=255)
    email = models.CharField(max_length=255)
    desc = models.TextField(max_length=255)

    def __str__(self):
        return '%s' % self.name


class comment2(models.Model):
    post = models.ForeignKey(Post2, on_delete=models.CASCADE, null=True)
    name = models.CharField(max_length=255)
    email = models.CharField(max_length=255)
    desc = models.TextField(max_length=255)

    def __str__(self):
        return '%s' % self.name

# Create your models here.
| 34.684211 | 80 | 0.736343 | 365 | 2,636 | 5.134247 | 0.194521 | 0.086446 | 0.089648 | 0.166489 | 0.866596 | 0.7492 | 0.702775 | 0.683565 | 0.627001 | 0.627001 | 0 | 0.025316 | 0.13088 | 2,636 | 75 | 81 | 35.146667 | 0.792667 | 0.099772 | 0 | 0.693548 | 0 | 0 | 0.034264 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.145161 | false | 0 | 0.048387 | 0.145161 | 0.951613 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
46577be9d8d24fb6ad4f832eaccae937246f2092 | 7,064 | py | Python | rl_6_nimmt/agents/policy.py | johannbrehmer/rl-6-nimmt | 8bc504e0372bb4bc99a3d69e77418991092ffdac | [
"MIT"
] | 3 | 2021-04-21T07:41:45.000Z | 2022-02-12T23:43:44.000Z | rl_6_nimmt/agents/policy.py | johannbrehmer/rl-6-nimmt | 8bc504e0372bb4bc99a3d69e77418991092ffdac | [
"MIT"
] | null | null | null | rl_6_nimmt/agents/policy.py | johannbrehmer/rl-6-nimmt | 8bc504e0372bb4bc99a3d69e77418991092ffdac | [
"MIT"
] | 3 | 2021-04-20T04:28:58.000Z | 2021-12-31T13:06:51.000Z | import torch
from torch import nn
from torch.distributions import Categorical
import numpy as np
import logging

from .base import Agent
from ..utils.nets import MultiHeadedMLP
from ..utils.various import compute_discounted_returns
from ..utils.preprocessing import SechsNimmtStateNormalization

logger = logging.getLogger(__name__)


class MaskedReinforceAgent(Agent):
    def __init__(
        self,
        env=None,
        gamma=0.99,
        optim_kwargs=None,
        history_length=None,
        dtype=torch.float,
        device=torch.device("cpu"),
        hidden_sizes=(100, 100,),
        activation=nn.ReLU(),
        r_factor=1.0,
        actor_weight=1.0,
        entropy_weight=0.0,
        *args,
        **kwargs
    ):
        super().__init__(env, gamma, optim_kwargs, history_length, dtype, device)

        self.r_factor = r_factor
        self.actor_weight = actor_weight
        self.entropy_weight = entropy_weight

        # NN that calculates the policy (actor) and estimates Q (critic)
        self.preprocessor = SechsNimmtStateNormalization(action=False)
        self.actor = MultiHeadedMLP(
            self.state_length, hidden_sizes=hidden_sizes, head_sizes=(self.num_actions,), activation=activation, head_activations=(None,)
        )
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, state, legal_actions, **kwargs):
        # Let the actor pick action probabilities and the critic guess the expected reward V(s_t)
        state = self.preprocessor(state)
        (probs,) = self.actor(state)
        probs = probs[legal_actions]
        probs = self.softmax(probs)
        logger.debug(probs.detach().numpy())

        # Sample action from these probabilities
        cat = Categorical(probs)
        action_id = cat.sample()
        log_prob = cat.log_prob(action_id)
        entropy = cat.entropy()
        action = legal_actions[action_id]

        return int(action.item()), {"log_prob": log_prob, "entropy": entropy}

    def learn(self, state, reward, action, done, next_state, next_reward, episode_end, num_episode, *args, **kwargs):
        # Memorize step
        self.history.store(log_prob=kwargs["log_prob"], reward=reward * self.r_factor, entropy=kwargs["entropy"])

        # No gradient update until a full episode has been collected
        if not episode_end or not self.training:
            return np.zeros(3)

        # Gradient updates
        losses = self._train()

        # Reset memory for next episode
        self.history.clear()

        return losses

    def _train(self):
        # Roll out last episode
        rollout = self.history.rollout()
        n = len(self.history)
        log_probs = torch.stack(rollout["log_prob"], dim=0)
        entropies = torch.stack(rollout["entropy"], dim=0)
        returns = compute_discounted_returns(rollout["reward"], self.gamma).to(self.device, self.dtype)

        # Compute loss
        discounts = torch.exp(np.log(self.gamma) * torch.linspace(0, n - 1, n))
        discounts = discounts.to(self.device, self.dtype)
        actor_loss = -torch.sum(discounts * returns * log_probs)

        # Entropy regularization to incentivize exploration
        if entropies is not None:
            entropy_loss = -torch.sum(entropies)
        else:
            entropy_loss = torch.tensor(0.0)

        # Gradient update
        self._gradient_step(self.actor_weight * actor_loss + self.entropy_weight * entropy_loss)

        return np.array([actor_loss.item(), 0.0, entropy_loss.item()])

    def _gradient_step(self, loss):
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
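

# Worked example (illustrative; assumes compute_discounted_returns implements
# the standard reward-to-go sum G_t = sum_k gamma^k * r_{t+k}): with
# gamma = 0.5 and episode rewards [1, 0, 2],
#
#     G_0 = 1 + 0.5 * 0 + 0.25 * 2 = 1.5
#     G_1 = 0 + 0.5 * 2            = 1.0
#     G_2 = 2.0
#
# and the actor loss in _train() further weights each term by gamma^t via
# `discounts`, matching the episodic REINFORCE objective.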
class BatchedReinforceAgent(Agent):
    def __init__(
        self,
        env=None,
        gamma=0.99,
        optim_kwargs=None,
        history_length=None,
        dtype=torch.float,
        device=torch.device("cpu"),
        hidden_sizes=(100, 100,),
        activation=nn.ReLU(),
        r_factor=1.0,
        actor_weight=1.0,
        entropy_weight=0.0,
        *args,
        **kwargs
    ):
        super().__init__(env, gamma, optim_kwargs, history_length, dtype, device)

        self.r_factor = r_factor
        self.actor_weight = actor_weight
        self.entropy_weight = entropy_weight

        # NN that calculates the policy (actor) and estimates Q (critic)
        self.preprocessor = SechsNimmtStateNormalization(action=True)
        self.actor = MultiHeadedMLP(self.state_length + 1, hidden_sizes=hidden_sizes, head_sizes=(1,), activation=activation, head_activations=(None,))
        self.softmax = nn.Softmax(dim=0)

    def forward(self, state, legal_actions, **kwargs):
        # Let the actor pick action probabilities and the critic guess the expected reward V(s_t)
        batch_states = []
        for action in legal_actions:
            action_ = torch.tensor([action]).to(self.device, self.dtype)
            batch_states.append(torch.cat((action_, state), dim=0).unsqueeze(0))
        batch_states = torch.cat(batch_states, dim=0)
        batch_states = self.preprocessor(batch_states)
        (probs,) = self.actor(batch_states)
        probs = self.softmax(probs).flatten()

        # Sample action from these probabilities
        cat = Categorical(probs)
        action_id = cat.sample()
        log_prob = cat.log_prob(action_id)
        entropy = cat.entropy()
        action = legal_actions[action_id]

        return int(action), {"log_prob": log_prob, "entropy": entropy}

    def learn(self, state, reward, action, done, next_state, next_reward, episode_end, num_episode, *args, **kwargs):
        # Memorize step
        self.history.store(log_prob=kwargs["log_prob"], reward=reward * self.r_factor, entropy=kwargs["entropy"])

        # No gradient update until a full episode has been collected
        if not episode_end or not self.training:
            return np.zeros(3)

        # Gradient updates
        losses = self._train()

        # Reset memory for next episode
        self.history.clear()

        return losses

    def _train(self):
        # Roll out last episode
        rollout = self.history.rollout()
        n = len(self.history)
        log_probs = torch.stack(rollout["log_prob"], dim=0)
        entropies = torch.stack(rollout["entropy"], dim=0)
        returns = compute_discounted_returns(rollout["reward"], self.gamma).to(self.device, self.dtype)

        # Compute loss
        discounts = torch.exp(np.log(self.gamma) * torch.linspace(0, n - 1, n))
        discounts = discounts.to(self.device, self.dtype)
        actor_loss = -torch.sum(discounts * returns * log_probs)

        # Entropy regularization to incentivize exploration
        if entropies is not None:
            entropy_loss = -torch.sum(entropies)
        else:
            entropy_loss = torch.tensor(0.0)

        # Gradient update
        self._gradient_step(self.actor_weight * actor_loss + self.entropy_weight * entropy_loss)

        return np.array([actor_loss.item(), 0.0, entropy_loss.item()])

    def _gradient_step(self, loss):
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
| 34.970297 | 151 | 0.641138 | 862 | 7,064 | 5.080046 | 0.182135 | 0.02238 | 0.013702 | 0.018269 | 0.839233 | 0.834437 | 0.802923 | 0.802923 | 0.802923 | 0.802923 | 0 | 0.010419 | 0.25269 | 7,064 | 201 | 152 | 35.144279 | 0.819095 | 0.108862 | 0 | 0.748201 | 0 | 0 | 0.017219 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071942 | false | 0 | 0.064748 | 0 | 0.208633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
31052df04dd2f8256574ed0e597b1019a6871ac3 | 89,642 | py | Python | src/honeycomb_circuit_test.py | Strilanc/honeycomb_threshold | d71737d3b4fb8878e856f8bd66b9632cc7078159 | [
"Apache-2.0"
] | 5 | 2021-07-23T05:33:18.000Z | 2022-01-27T00:59:40.000Z | src/honeycomb_circuit_test.py | Strilanc/honeycomb_threshold | d71737d3b4fb8878e856f8bd66b9632cc7078159 | [
"Apache-2.0"
] | 1 | 2021-08-03T20:58:26.000Z | 2021-08-08T17:13:11.000Z | src/honeycomb_circuit_test.py | Strilanc/honeycomb_threshold | d71737d3b4fb8878e856f8bd66b9632cc7078159 | [
"Apache-2.0"
] | 1 | 2022-01-30T11:05:19.000Z | 2022-01-30T11:05:19.000Z | import itertools

import pytest

from honeycomb_circuit import generate_honeycomb_circuit
from hack_pycharm_pybind_pytest_workaround import stim
from honeycomb_layout import HoneycombLayout


@pytest.mark.parametrize('tile_width,tile_height_extra,sub_rounds,obs,style', itertools.product(
    range(1, 5),
    [-1, 0, +1],
    range(1, 24),
    ["H", "V"],
    ["PC3", "SD6", "EM3", "SI1000"],
))
def test_circuit_has_decomposing_error_model(
        tile_width: int,
        tile_height_extra: int,
        sub_rounds: int,
        obs: str,
        style: str):
    if style == "SI1000" and sub_rounds % 3 != 0:
        return
    circuit = generate_honeycomb_circuit(HoneycombLayout(
        data_width=2 * tile_width,
        data_height=6 * max(1, tile_width + tile_height_extra),
        sub_rounds=sub_rounds,
        noise=0.001,
        style=style,
        obs=obs,
    ))
    _ = circuit.detector_error_model(decompose_errors=True)
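

# Follow-on sketch (editor's illustration, not part of the original test): the
# noisy circuit built above could also drive a quick sampling smoke test using
# standard stim API:
#
#     sampler = circuit.compile_detector_sampler()
#     shots = sampler.sample(1024, append_observables=True)
#     assert shots.shape[0] == 1024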
def test_circuit_details_SD6():
    actual = generate_honeycomb_circuit(HoneycombLayout(
        data_width=2,
        data_height=6,
        sub_rounds=1003,
        noise=0.001,
        style="SD6",
        obs="V",
    ))
    cleaned = stim.Circuit(str(actual))
    assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
QUBIT_COORDS(0, 1) 12
QUBIT_COORDS(0, 3) 13
QUBIT_COORDS(0, 5) 14
QUBIT_COORDS(1, 0.5) 15
QUBIT_COORDS(1, 1.5) 16
QUBIT_COORDS(1, 2.5) 17
QUBIT_COORDS(1, 3.5) 18
QUBIT_COORDS(1, 4.5) 19
QUBIT_COORDS(1, 5.5) 20
QUBIT_COORDS(2, 0) 21
QUBIT_COORDS(2, 2) 22
QUBIT_COORDS(2, 4) 23
QUBIT_COORDS(3, 0.5) 24
QUBIT_COORDS(3, 1.5) 25
QUBIT_COORDS(3, 2.5) 26
QUBIT_COORDS(3, 3.5) 27
QUBIT_COORDS(3, 4.5) 28
QUBIT_COORDS(3, 5.5) 29
R 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
R 13 16 19 21 25 28
C_ZYX 0 2 4 7 9 11
X_ERROR(0.001) 13 16 19 21 25 28
DEPOLARIZE1(0.001) 0 2 4 7 9 11 1 3 5 6 8 10 12 14 15 17 18 20 22 23 24 26 27 29
TICK
CX 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.001) 1 3 5 6 8 10 12 14 15 17 18 20 22 23 24 26 27 29
TICK
R 12 17 20 23 26 29
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
X_ERROR(0.001) 12 17 20 23 26 29
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 13 14 15 16 18 19 21 22 24 25 27 28
TICK
CX 7 12 2 17 0 20 4 23 9 26 11 29
CX 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.001) 14 15 18 22 24 27
TICK
X_ERROR(0.001) 13 16 19 21 25 28
R 14 15 18 22 24 27
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 14 15 18 22 24 27
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 17 20 23 26 29
TICK
CX 11 14 0 15 4 18 2 22 7 24 9 27
CX 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.001) 13 16 19 21 25 28
TICK
X_ERROR(0.001) 12 17 20 23 26 29
R 13 16 19 21 25 28
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 13 16 19 21 25 28
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 14 15 18 22 24 27
TICK
CX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.001) 12 17 20 23 26 29
TICK
X_ERROR(0.001) 14 15 18 22 24 27
R 12 17 20 23 26 29
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 12 17 20 23 26 29
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 13 16 19 21 25 28
TICK
CX 7 12 2 17 0 20 4 23 9 26 11 29
CX 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.001) 14 15 18 22 24 27
TICK
X_ERROR(0.001) 13 16 19 21 25 28
R 14 15 18 22 24 27
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 14 15 18 22 24 27
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 17 20 23 26 29
TICK
REPEAT 332 {
CX 11 14 0 15 4 18 2 22 7 24 9 27
CX 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.001) 13 16 19 21 25 28
TICK
X_ERROR(0.001) 12 17 20 23 26 29
R 13 16 19 21 25 28
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 13 16 19 21 25 28
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 14 15 18 22 24 27
TICK
CX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.001) 12 17 20 23 26 29
TICK
X_ERROR(0.001) 14 15 18 22 24 27
R 12 17 20 23 26 29
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 12 17 20 23 26 29
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 13 16 19 21 25 28
TICK
CX 7 12 2 17 0 20 4 23 9 26 11 29
CX 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.001) 14 15 18 22 24 27
TICK
X_ERROR(0.001) 13 16 19 21 25 28
R 14 15 18 22 24 27
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 14 15 18 22 24 27
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 17 20 23 26 29
TICK
}
CX 11 14 0 15 4 18 2 22 7 24 9 27
CX 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.001) 13 16 19 21 25 28
TICK
X_ERROR(0.001) 12 17 20 23 26 29
R 13 16 19 21 25 28
C_ZYX 0 1 2 3 4 5 6 7 8 9 10 11
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
X_ERROR(0.001) 13 16 19 21 25 28
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 14 15 18 22 24 27
TICK
CX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.001) 12 17 20 23 26 29
TICK
X_ERROR(0.001) 14 15 18 22 24 27
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 16 17 19 20 21 23 25 26 28 29
TICK
C_ZYX 1 3 5 6 8 10
DEPOLARIZE1(0.001) 1 3 5 6 8 10 0 2 4 7 9 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
CX 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE2(0.001) 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.001) 0 2 4 7 9 11 12 14 15 17 18 20 22 23 24 26 27 29
TICK
X_ERROR(0.001) 13 16 19 21 25 28
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
C_XYZ 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 14 15 17 18 20 22 23 24 26 27 29
TICK
H_YZ 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
DETECTOR(0, 2, 0) rec[-36] rec[-35] rec[-32] rec[-30] rec[-29] rec[-26] rec[-18] rec[-17] rec[-14] rec[-11] rec[-10] rec[-9] rec[-5] rec[-4] rec[-3]
DETECTOR(2, 5, 0) rec[-34] rec[-33] rec[-31] rec[-28] rec[-27] rec[-25] rec[-16] rec[-15] rec[-13] rec[-12] rec[-8] rec[-7] rec[-6] rec[-2] rec[-1]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-18] rec[-16] rec[-13] rec[-9] rec[-8] rec[-7] rec[-3] rec[-2] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-17] rec[-15] rec[-14] rec[-12] rec[-11] rec[-10] rec[-6] rec[-5] rec[-4]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10] rec[-8] rec[-7]
DEPOLARIZE1(0.001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
""")

def test_circuit_details_PC3():
    actual = generate_honeycomb_circuit(HoneycombLayout(
        data_width=2,
        data_height=6,
        sub_rounds=1003,
        noise=0.001,
        style="PC3",
        obs="V",
    ))
    cleaned = stim.Circuit(str(actual))
    assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
QUBIT_COORDS(0, 1) 12
QUBIT_COORDS(0, 3) 13
QUBIT_COORDS(0, 5) 14
QUBIT_COORDS(1, 0.5) 15
QUBIT_COORDS(1, 1.5) 16
QUBIT_COORDS(1, 2.5) 17
QUBIT_COORDS(1, 3.5) 18
QUBIT_COORDS(1, 4.5) 19
QUBIT_COORDS(1, 5.5) 20
QUBIT_COORDS(2, 0) 21
QUBIT_COORDS(2, 2) 22
QUBIT_COORDS(2, 4) 23
QUBIT_COORDS(3, 0.5) 24
QUBIT_COORDS(3, 1.5) 25
QUBIT_COORDS(3, 2.5) 26
QUBIT_COORDS(3, 3.5) 27
QUBIT_COORDS(3, 4.5) 28
QUBIT_COORDS(3, 5.5) 29
R 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
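# PC3 pipelines the sub-rounds: after a brief warm-up, each tick applies
# controlled-Pauli couplings (XCX/YCX/CX) for two sub-rounds at once while
# measuring the ancillae of a third.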
XCX 9 13 2 16 4 19 0 21 7 25 11 28
R 12 17 20 23 26 29
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28
X_ERROR(0.001) 12 17 20 23 26 29
DEPOLARIZE1(0.001) 1 3 5 6 8 10 14 15 18 22 24 27
TICK
YCX 7 12 2 17 0 20 4 23 9 26 11 29
XCX 3 13 1 16 5 19 6 21 8 25 10 28
R 14 15 18 22 24 27
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
X_ERROR(0.001) 14 15 18 22 24 27
TICK
X_ERROR(0.001) 13 16 19 21 25 28
CX 11 14 0 15 4 18 2 22 7 24 9 27
YCX 1 12 3 17 5 20 10 23 8 26 6 29
M 13 16 19 21 25 28
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
TICK
X_ERROR(0.001) 12 17 20 23 26 29
XCX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
M 12 17 20 23 26 29
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
TICK
X_ERROR(0.001) 14 15 18 22 24 27
YCX 7 12 2 17 0 20 4 23 9 26 11 29
XCX 3 13 1 16 5 19 6 21 8 25 10 28
M 14 15 18 22 24 27
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
TICK
X_ERROR(0.001) 13 16 19 21 25 28
CX 11 14 0 15 4 18 2 22 7 24 9 27
YCX 1 12 3 17 5 20 10 23 8 26 6 29
M 13 16 19 21 25 28
DETECTOR(0, 4, 0) rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
TICK
# === stabilizers are now all established, but not all edge flip flops are established ===
X_ERROR(0.001) 12 17 20 23 26 29
XCX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
M 12 17 20 23 26 29
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
TICK
X_ERROR(0.001) 14 15 18 22 24 27
YCX 7 12 2 17 0 20 4 23 9 26 11 29
XCX 3 13 1 16 5 19 6 21 8 25 10 28
M 14 15 18 22 24 27
DETECTOR(0, 0, 0) rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
TICK
X_ERROR(0.001) 13 16 19 21 25 28
CX 11 14 0 15 4 18 2 22 7 24 9 27
YCX 1 12 3 17 5 20 10 23 8 26 6 29
M 13 16 19 21 25 28
DETECTOR(0, 4, 0) rec[-42] rec[-40] rec[-37] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-41] rec[-39] rec[-38] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
TICK
# === stabilizers and edge flip flops now all established ===
REPEAT 331 {
X_ERROR(0.001) 12 17 20 23 26 29
XCX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
M 12 17 20 23 26 29
DETECTOR(0, 2, 0) rec[-48] rec[-47] rec[-44] rec[-42] rec[-41] rec[-38] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-46] rec[-45] rec[-43] rec[-40] rec[-39] rec[-37] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
TICK
X_ERROR(0.001) 14 15 18 22 24 27
YCX 7 12 2 17 0 20 4 23 9 26 11 29
XCX 3 13 1 16 5 19 6 21 8 25 10 28
M 14 15 18 22 24 27
DETECTOR(0, 0, 0) rec[-48] rec[-46] rec[-43] rec[-42] rec[-41] rec[-38] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-47] rec[-45] rec[-44] rec[-40] rec[-39] rec[-37] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29 3 13 1 16 5 19 6 21 8 25 10 28
TICK
X_ERROR(0.001) 13 16 19 21 25 28
CX 11 14 0 15 4 18 2 22 7 24 9 27
YCX 1 12 3 17 5 20 10 23 8 26 6 29
M 13 16 19 21 25 28
DETECTOR(0, 4, 0) rec[-48] rec[-46] rec[-43] rec[-42] rec[-40] rec[-37] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-47] rec[-45] rec[-44] rec[-41] rec[-39] rec[-38] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27 1 12 3 17 5 20 10 23 8 26 6 29
TICK
}
X_ERROR(0.001) 12 17 20 23 26 29
XCX 9 13 2 16 4 19 0 21 7 25 11 28
CX 5 14 1 15 3 18 8 22 6 24 10 27
M 12 17 20 23 26 29
DETECTOR(0, 2, 0) rec[-48] rec[-47] rec[-44] rec[-42] rec[-41] rec[-38] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-46] rec[-45] rec[-43] rec[-40] rec[-39] rec[-37] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28 5 14 1 15 3 18 8 22 6 24 10 27
TICK
X_ERROR(0.001) 14 15 18 22 24 27
XCX 3 13 1 16 5 19 6 21 8 25 10 28
M 14 15 18 22 24 27
DETECTOR(0, 0, 0) rec[-48] rec[-46] rec[-43] rec[-42] rec[-41] rec[-38] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-47] rec[-45] rec[-44] rec[-40] rec[-39] rec[-37] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE2(0.001) 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.001) 0 2 4 7 9 11 12 17 20 23 26 29
TICK
X_ERROR(0.001) 13 16 19 21 25 28
M 13 16 19 21 25 28
DETECTOR(0, 4, 0) rec[-48] rec[-46] rec[-43] rec[-42] rec[-40] rec[-37] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-47] rec[-45] rec[-44] rec[-41] rec[-39] rec[-38] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 14 15 17 18 20 22 23 24 26 27 29
TICK
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
OBSERVABLE_INCLUDE(0) rec[-17] rec[-16]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10]
H_YZ 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
DETECTOR(0, 2, 0) rec[-54] rec[-53] rec[-50] rec[-48] rec[-47] rec[-44] rec[-30] rec[-29] rec[-26] rec[-18] rec[-17] rec[-14] rec[-11] rec[-10] rec[-9] rec[-5] rec[-4] rec[-3]
DETECTOR(2, 5, 0) rec[-52] rec[-51] rec[-49] rec[-46] rec[-45] rec[-43] rec[-28] rec[-27] rec[-25] rec[-16] rec[-15] rec[-13] rec[-12] rec[-8] rec[-7] rec[-6] rec[-2] rec[-1]
DETECTOR(0, 4, 0) rec[-42] rec[-40] rec[-37] rec[-36] rec[-34] rec[-31] rec[-24] rec[-22] rec[-19] rec[-18] rec[-16] rec[-13] rec[-9] rec[-8] rec[-7] rec[-3] rec[-2] rec[-1]
DETECTOR(2, 1, 0) rec[-41] rec[-39] rec[-38] rec[-35] rec[-33] rec[-32] rec[-23] rec[-21] rec[-20] rec[-17] rec[-15] rec[-14] rec[-12] rec[-11] rec[-10] rec[-6] rec[-5] rec[-4]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10] rec[-8] rec[-7]
DEPOLARIZE1(0.001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
""")
def test_circuit_details_EM3():
actual = generate_honeycomb_circuit(HoneycombLayout(
data_width=2,
data_height=6,
sub_rounds=1003,
noise=0.001,
style="EM3",
obs="V",
))
cleaned = stim.Circuit(str(actual))
assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
R 0 1 2 3 4 5 6 7 8 9 10 11
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
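# EM3 measures edge operators directly: each MPP(0.001) measures a two-body Pauli
# product and reports the wrong result with probability 0.001, with two-qubit
# depolarization applied to each measured pair.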
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
TICK
DEPOLARIZE2(0.001) 7 1 2 3 0 5 4 10 9 8 11 6
MPP(0.001) Y7*Y1 Y2*Y3 Y0*Y5 Y4*Y10 Y9*Y8 Y11*Y6
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
DEPOLARIZE2(0.001) 11 5 0 1 4 3 2 8 7 6 9 10
MPP(0.001) Z11*Z5 Z0*Z1 Z4*Z3 Z2*Z8 Z7*Z6 Z9*Z10
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
TICK
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
TICK
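# Steady state: each iteration runs the Y, Z, and X sub-rounds, with detectors
# comparing reconstructed stabilizers against the previous cycle.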
REPEAT 333 {
DEPOLARIZE2(0.001) 7 1 2 3 0 5 4 10 9 8 11 6
MPP(0.001) Y7*Y1 Y2*Y3 Y0*Y5 Y4*Y10 Y9*Y8 Y11*Y6
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
DEPOLARIZE2(0.001) 11 5 0 1 4 3 2 8 7 6 9 10
MPP(0.001) Z11*Z5 Z0*Z1 Z4*Z3 Z2*Z8 Z7*Z6 Z9*Z10
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
TICK
}
H_YZ 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
DETECTOR(0, 2, 0) rec[-36] rec[-35] rec[-32] rec[-30] rec[-29] rec[-26] rec[-18] rec[-17] rec[-14] rec[-11] rec[-10] rec[-9] rec[-5] rec[-4] rec[-3]
DETECTOR(2, 5, 0) rec[-34] rec[-33] rec[-31] rec[-28] rec[-27] rec[-25] rec[-16] rec[-15] rec[-13] rec[-12] rec[-8] rec[-7] rec[-6] rec[-2] rec[-1]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-18] rec[-16] rec[-13] rec[-9] rec[-8] rec[-7] rec[-3] rec[-2] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-17] rec[-15] rec[-14] rec[-12] rec[-11] rec[-10] rec[-6] rec[-5] rec[-4]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10] rec[-8] rec[-7]
""")
def test_circuit_details_EM3_v2():
actual = generate_honeycomb_circuit(HoneycombLayout(
data_width=2,
data_height=6,
sub_rounds=1003,
noise=0.001,
style="EM3_v2",
obs="V",
))
cleaned = stim.Circuit(str(actual))
assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
R 0 1 2 3 4 5 6 7 8 9 10 11
X_ERROR(0.0005) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
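# EM3_v2 decomposes each noisy parity measurement explicitly: reset ancilla 12,
# couple both data qubits to it with controlled-Pauli gates, apply all 31
# correlated E(...) channels (every non-identity combination of a Pauli on each
# data qubit and an X flip of the ancilla), then measure the ancilla. The
# per-channel probability is calibrated by the generator so the combined channel
# has total strength 0.001 (roughly 0.001/32 per channel).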
R 12
XCX 9 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X3
E(3.12647e-05) X9 X3 X12
E(3.12647e-05) X9 Y3
E(3.12647e-05) X9 Y3 X12
E(3.12647e-05) X9 Z3
E(3.12647e-05) X9 Z3 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X3
E(3.12647e-05) Y9 X3 X12
E(3.12647e-05) Y9 Y3
E(3.12647e-05) Y9 Y3 X12
E(3.12647e-05) Y9 Z3
E(3.12647e-05) Y9 Z3 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X3
E(3.12647e-05) Z9 X3 X12
E(3.12647e-05) Z9 Y3
E(3.12647e-05) Z9 Y3 X12
E(3.12647e-05) Z9 Z3
E(3.12647e-05) Z9 Z3 X12
M 12
R 12
XCX 2 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X1
E(3.12647e-05) X2 X1 X12
E(3.12647e-05) X2 Y1
E(3.12647e-05) X2 Y1 X12
E(3.12647e-05) X2 Z1
E(3.12647e-05) X2 Z1 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X1
E(3.12647e-05) Y2 X1 X12
E(3.12647e-05) Y2 Y1
E(3.12647e-05) Y2 Y1 X12
E(3.12647e-05) Y2 Z1
E(3.12647e-05) Y2 Z1 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X1
E(3.12647e-05) Z2 X1 X12
E(3.12647e-05) Z2 Y1
E(3.12647e-05) Z2 Y1 X12
E(3.12647e-05) Z2 Z1
E(3.12647e-05) Z2 Z1 X12
M 12
R 12
XCX 4 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X5
E(3.12647e-05) X4 X5 X12
E(3.12647e-05) X4 Y5
E(3.12647e-05) X4 Y5 X12
E(3.12647e-05) X4 Z5
E(3.12647e-05) X4 Z5 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X5
E(3.12647e-05) Y4 X5 X12
E(3.12647e-05) Y4 Y5
E(3.12647e-05) Y4 Y5 X12
E(3.12647e-05) Y4 Z5
E(3.12647e-05) Y4 Z5 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X5
E(3.12647e-05) Z4 X5 X12
E(3.12647e-05) Z4 Y5
E(3.12647e-05) Z4 Y5 X12
E(3.12647e-05) Z4 Z5
E(3.12647e-05) Z4 Z5 X12
M 12
R 12
XCX 0 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X6
E(3.12647e-05) X0 X6 X12
E(3.12647e-05) X0 Y6
E(3.12647e-05) X0 Y6 X12
E(3.12647e-05) X0 Z6
E(3.12647e-05) X0 Z6 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X6
E(3.12647e-05) Y0 X6 X12
E(3.12647e-05) Y0 Y6
E(3.12647e-05) Y0 Y6 X12
E(3.12647e-05) Y0 Z6
E(3.12647e-05) Y0 Z6 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X6
E(3.12647e-05) Z0 X6 X12
E(3.12647e-05) Z0 Y6
E(3.12647e-05) Z0 Y6 X12
E(3.12647e-05) Z0 Z6
E(3.12647e-05) Z0 Z6 X12
M 12
R 12
XCX 7 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X8
E(3.12647e-05) X7 X8 X12
E(3.12647e-05) X7 Y8
E(3.12647e-05) X7 Y8 X12
E(3.12647e-05) X7 Z8
E(3.12647e-05) X7 Z8 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X8
E(3.12647e-05) Y7 X8 X12
E(3.12647e-05) Y7 Y8
E(3.12647e-05) Y7 Y8 X12
E(3.12647e-05) Y7 Z8
E(3.12647e-05) Y7 Z8 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X8
E(3.12647e-05) Z7 X8 X12
E(3.12647e-05) Z7 Y8
E(3.12647e-05) Z7 Y8 X12
E(3.12647e-05) Z7 Z8
E(3.12647e-05) Z7 Z8 X12
M 12
R 12
XCX 11 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X10
E(3.12647e-05) X11 X10 X12
E(3.12647e-05) X11 Y10
E(3.12647e-05) X11 Y10 X12
E(3.12647e-05) X11 Z10
E(3.12647e-05) X11 Z10 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X10
E(3.12647e-05) Y11 X10 X12
E(3.12647e-05) Y11 Y10
E(3.12647e-05) Y11 Y10 X12
E(3.12647e-05) Y11 Z10
E(3.12647e-05) Y11 Z10 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X10
E(3.12647e-05) Z11 X10 X12
E(3.12647e-05) Z11 Y10
E(3.12647e-05) Z11 Y10 X12
E(3.12647e-05) Z11 Z10
E(3.12647e-05) Z11 Z10 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
TICK
R 12
YCX 7 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X1
E(3.12647e-05) X7 X1 X12
E(3.12647e-05) X7 Y1
E(3.12647e-05) X7 Y1 X12
E(3.12647e-05) X7 Z1
E(3.12647e-05) X7 Z1 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X1
E(3.12647e-05) Y7 X1 X12
E(3.12647e-05) Y7 Y1
E(3.12647e-05) Y7 Y1 X12
E(3.12647e-05) Y7 Z1
E(3.12647e-05) Y7 Z1 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X1
E(3.12647e-05) Z7 X1 X12
E(3.12647e-05) Z7 Y1
E(3.12647e-05) Z7 Y1 X12
E(3.12647e-05) Z7 Z1
E(3.12647e-05) Z7 Z1 X12
M 12
R 12
YCX 2 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X3
E(3.12647e-05) X2 X3 X12
E(3.12647e-05) X2 Y3
E(3.12647e-05) X2 Y3 X12
E(3.12647e-05) X2 Z3
E(3.12647e-05) X2 Z3 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X3
E(3.12647e-05) Y2 X3 X12
E(3.12647e-05) Y2 Y3
E(3.12647e-05) Y2 Y3 X12
E(3.12647e-05) Y2 Z3
E(3.12647e-05) Y2 Z3 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X3
E(3.12647e-05) Z2 X3 X12
E(3.12647e-05) Z2 Y3
E(3.12647e-05) Z2 Y3 X12
E(3.12647e-05) Z2 Z3
E(3.12647e-05) Z2 Z3 X12
M 12
R 12
YCX 0 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X5
E(3.12647e-05) X0 X5 X12
E(3.12647e-05) X0 Y5
E(3.12647e-05) X0 Y5 X12
E(3.12647e-05) X0 Z5
E(3.12647e-05) X0 Z5 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X5
E(3.12647e-05) Y0 X5 X12
E(3.12647e-05) Y0 Y5
E(3.12647e-05) Y0 Y5 X12
E(3.12647e-05) Y0 Z5
E(3.12647e-05) Y0 Z5 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X5
E(3.12647e-05) Z0 X5 X12
E(3.12647e-05) Z0 Y5
E(3.12647e-05) Z0 Y5 X12
E(3.12647e-05) Z0 Z5
E(3.12647e-05) Z0 Z5 X12
M 12
R 12
YCX 4 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X10
E(3.12647e-05) X4 X10 X12
E(3.12647e-05) X4 Y10
E(3.12647e-05) X4 Y10 X12
E(3.12647e-05) X4 Z10
E(3.12647e-05) X4 Z10 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X10
E(3.12647e-05) Y4 X10 X12
E(3.12647e-05) Y4 Y10
E(3.12647e-05) Y4 Y10 X12
E(3.12647e-05) Y4 Z10
E(3.12647e-05) Y4 Z10 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X10
E(3.12647e-05) Z4 X10 X12
E(3.12647e-05) Z4 Y10
E(3.12647e-05) Z4 Y10 X12
E(3.12647e-05) Z4 Z10
E(3.12647e-05) Z4 Z10 X12
M 12
R 12
YCX 9 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X8
E(3.12647e-05) X9 X8 X12
E(3.12647e-05) X9 Y8
E(3.12647e-05) X9 Y8 X12
E(3.12647e-05) X9 Z8
E(3.12647e-05) X9 Z8 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X8
E(3.12647e-05) Y9 X8 X12
E(3.12647e-05) Y9 Y8
E(3.12647e-05) Y9 Y8 X12
E(3.12647e-05) Y9 Z8
E(3.12647e-05) Y9 Z8 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X8
E(3.12647e-05) Z9 X8 X12
E(3.12647e-05) Z9 Y8
E(3.12647e-05) Z9 Y8 X12
E(3.12647e-05) Z9 Z8
E(3.12647e-05) Z9 Z8 X12
M 12
R 12
YCX 11 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X6
E(3.12647e-05) X11 X6 X12
E(3.12647e-05) X11 Y6
E(3.12647e-05) X11 Y6 X12
E(3.12647e-05) X11 Z6
E(3.12647e-05) X11 Z6 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X6
E(3.12647e-05) Y11 X6 X12
E(3.12647e-05) Y11 Y6
E(3.12647e-05) Y11 Y6 X12
E(3.12647e-05) Y11 Z6
E(3.12647e-05) Y11 Z6 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X6
E(3.12647e-05) Z11 X6 X12
E(3.12647e-05) Z11 Y6
E(3.12647e-05) Z11 Y6 X12
E(3.12647e-05) Z11 Z6
E(3.12647e-05) Z11 Z6 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
R 12
CX 11 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X5
E(3.12647e-05) X11 X5 X12
E(3.12647e-05) X11 Y5
E(3.12647e-05) X11 Y5 X12
E(3.12647e-05) X11 Z5
E(3.12647e-05) X11 Z5 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X5
E(3.12647e-05) Y11 X5 X12
E(3.12647e-05) Y11 Y5
E(3.12647e-05) Y11 Y5 X12
E(3.12647e-05) Y11 Z5
E(3.12647e-05) Y11 Z5 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X5
E(3.12647e-05) Z11 X5 X12
E(3.12647e-05) Z11 Y5
E(3.12647e-05) Z11 Y5 X12
E(3.12647e-05) Z11 Z5
E(3.12647e-05) Z11 Z5 X12
M 12
R 12
CX 0 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X1
E(3.12647e-05) X0 X1 X12
E(3.12647e-05) X0 Y1
E(3.12647e-05) X0 Y1 X12
E(3.12647e-05) X0 Z1
E(3.12647e-05) X0 Z1 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X1
E(3.12647e-05) Y0 X1 X12
E(3.12647e-05) Y0 Y1
E(3.12647e-05) Y0 Y1 X12
E(3.12647e-05) Y0 Z1
E(3.12647e-05) Y0 Z1 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X1
E(3.12647e-05) Z0 X1 X12
E(3.12647e-05) Z0 Y1
E(3.12647e-05) Z0 Y1 X12
E(3.12647e-05) Z0 Z1
E(3.12647e-05) Z0 Z1 X12
M 12
R 12
CX 4 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X3
E(3.12647e-05) X4 X3 X12
E(3.12647e-05) X4 Y3
E(3.12647e-05) X4 Y3 X12
E(3.12647e-05) X4 Z3
E(3.12647e-05) X4 Z3 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X3
E(3.12647e-05) Y4 X3 X12
E(3.12647e-05) Y4 Y3
E(3.12647e-05) Y4 Y3 X12
E(3.12647e-05) Y4 Z3
E(3.12647e-05) Y4 Z3 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X3
E(3.12647e-05) Z4 X3 X12
E(3.12647e-05) Z4 Y3
E(3.12647e-05) Z4 Y3 X12
E(3.12647e-05) Z4 Z3
E(3.12647e-05) Z4 Z3 X12
M 12
R 12
CX 2 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X8
E(3.12647e-05) X2 X8 X12
E(3.12647e-05) X2 Y8
E(3.12647e-05) X2 Y8 X12
E(3.12647e-05) X2 Z8
E(3.12647e-05) X2 Z8 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X8
E(3.12647e-05) Y2 X8 X12
E(3.12647e-05) Y2 Y8
E(3.12647e-05) Y2 Y8 X12
E(3.12647e-05) Y2 Z8
E(3.12647e-05) Y2 Z8 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X8
E(3.12647e-05) Z2 X8 X12
E(3.12647e-05) Z2 Y8
E(3.12647e-05) Z2 Y8 X12
E(3.12647e-05) Z2 Z8
E(3.12647e-05) Z2 Z8 X12
M 12
R 12
CX 7 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X6
E(3.12647e-05) X7 X6 X12
E(3.12647e-05) X7 Y6
E(3.12647e-05) X7 Y6 X12
E(3.12647e-05) X7 Z6
E(3.12647e-05) X7 Z6 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X6
E(3.12647e-05) Y7 X6 X12
E(3.12647e-05) Y7 Y6
E(3.12647e-05) Y7 Y6 X12
E(3.12647e-05) Y7 Z6
E(3.12647e-05) Y7 Z6 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X6
E(3.12647e-05) Z7 X6 X12
E(3.12647e-05) Z7 Y6
E(3.12647e-05) Z7 Y6 X12
E(3.12647e-05) Z7 Z6
E(3.12647e-05) Z7 Z6 X12
M 12
R 12
CX 9 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X10
E(3.12647e-05) X9 X10 X12
E(3.12647e-05) X9 Y10
E(3.12647e-05) X9 Y10 X12
E(3.12647e-05) X9 Z10
E(3.12647e-05) X9 Z10 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X10
E(3.12647e-05) Y9 X10 X12
E(3.12647e-05) Y9 Y10
E(3.12647e-05) Y9 Y10 X12
E(3.12647e-05) Y9 Z10
E(3.12647e-05) Y9 Z10 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X10
E(3.12647e-05) Z9 X10 X12
E(3.12647e-05) Z9 Y10
E(3.12647e-05) Z9 Y10 X12
E(3.12647e-05) Z9 Z10
E(3.12647e-05) Z9 Z10 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
TICK
R 12
XCX 9 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X3
E(3.12647e-05) X9 X3 X12
E(3.12647e-05) X9 Y3
E(3.12647e-05) X9 Y3 X12
E(3.12647e-05) X9 Z3
E(3.12647e-05) X9 Z3 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X3
E(3.12647e-05) Y9 X3 X12
E(3.12647e-05) Y9 Y3
E(3.12647e-05) Y9 Y3 X12
E(3.12647e-05) Y9 Z3
E(3.12647e-05) Y9 Z3 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X3
E(3.12647e-05) Z9 X3 X12
E(3.12647e-05) Z9 Y3
E(3.12647e-05) Z9 Y3 X12
E(3.12647e-05) Z9 Z3
E(3.12647e-05) Z9 Z3 X12
M 12
R 12
XCX 2 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X1
E(3.12647e-05) X2 X1 X12
E(3.12647e-05) X2 Y1
E(3.12647e-05) X2 Y1 X12
E(3.12647e-05) X2 Z1
E(3.12647e-05) X2 Z1 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X1
E(3.12647e-05) Y2 X1 X12
E(3.12647e-05) Y2 Y1
E(3.12647e-05) Y2 Y1 X12
E(3.12647e-05) Y2 Z1
E(3.12647e-05) Y2 Z1 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X1
E(3.12647e-05) Z2 X1 X12
E(3.12647e-05) Z2 Y1
E(3.12647e-05) Z2 Y1 X12
E(3.12647e-05) Z2 Z1
E(3.12647e-05) Z2 Z1 X12
M 12
R 12
XCX 4 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X5
E(3.12647e-05) X4 X5 X12
E(3.12647e-05) X4 Y5
E(3.12647e-05) X4 Y5 X12
E(3.12647e-05) X4 Z5
E(3.12647e-05) X4 Z5 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X5
E(3.12647e-05) Y4 X5 X12
E(3.12647e-05) Y4 Y5
E(3.12647e-05) Y4 Y5 X12
E(3.12647e-05) Y4 Z5
E(3.12647e-05) Y4 Z5 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X5
E(3.12647e-05) Z4 X5 X12
E(3.12647e-05) Z4 Y5
E(3.12647e-05) Z4 Y5 X12
E(3.12647e-05) Z4 Z5
E(3.12647e-05) Z4 Z5 X12
M 12
R 12
XCX 0 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X6
E(3.12647e-05) X0 X6 X12
E(3.12647e-05) X0 Y6
E(3.12647e-05) X0 Y6 X12
E(3.12647e-05) X0 Z6
E(3.12647e-05) X0 Z6 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X6
E(3.12647e-05) Y0 X6 X12
E(3.12647e-05) Y0 Y6
E(3.12647e-05) Y0 Y6 X12
E(3.12647e-05) Y0 Z6
E(3.12647e-05) Y0 Z6 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X6
E(3.12647e-05) Z0 X6 X12
E(3.12647e-05) Z0 Y6
E(3.12647e-05) Z0 Y6 X12
E(3.12647e-05) Z0 Z6
E(3.12647e-05) Z0 Z6 X12
M 12
R 12
XCX 7 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X8
E(3.12647e-05) X7 X8 X12
E(3.12647e-05) X7 Y8
E(3.12647e-05) X7 Y8 X12
E(3.12647e-05) X7 Z8
E(3.12647e-05) X7 Z8 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X8
E(3.12647e-05) Y7 X8 X12
E(3.12647e-05) Y7 Y8
E(3.12647e-05) Y7 Y8 X12
E(3.12647e-05) Y7 Z8
E(3.12647e-05) Y7 Z8 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X8
E(3.12647e-05) Z7 X8 X12
E(3.12647e-05) Z7 Y8
E(3.12647e-05) Z7 Y8 X12
E(3.12647e-05) Z7 Z8
E(3.12647e-05) Z7 Z8 X12
M 12
R 12
XCX 11 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X10
E(3.12647e-05) X11 X10 X12
E(3.12647e-05) X11 Y10
E(3.12647e-05) X11 Y10 X12
E(3.12647e-05) X11 Z10
E(3.12647e-05) X11 Z10 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X10
E(3.12647e-05) Y11 X10 X12
E(3.12647e-05) Y11 Y10
E(3.12647e-05) Y11 Y10 X12
E(3.12647e-05) Y11 Z10
E(3.12647e-05) Y11 Z10 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X10
E(3.12647e-05) Z11 X10 X12
E(3.12647e-05) Z11 Y10
E(3.12647e-05) Z11 Y10 X12
E(3.12647e-05) Z11 Z10
E(3.12647e-05) Z11 Z10 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
TICK
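# Steady state: repeat the ancilla-decomposed Y, Z, and X sub-rounds.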
REPEAT 333 {
R 12
YCX 7 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X1
E(3.12647e-05) X7 X1 X12
E(3.12647e-05) X7 Y1
E(3.12647e-05) X7 Y1 X12
E(3.12647e-05) X7 Z1
E(3.12647e-05) X7 Z1 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X1
E(3.12647e-05) Y7 X1 X12
E(3.12647e-05) Y7 Y1
E(3.12647e-05) Y7 Y1 X12
E(3.12647e-05) Y7 Z1
E(3.12647e-05) Y7 Z1 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X1
E(3.12647e-05) Z7 X1 X12
E(3.12647e-05) Z7 Y1
E(3.12647e-05) Z7 Y1 X12
E(3.12647e-05) Z7 Z1
E(3.12647e-05) Z7 Z1 X12
M 12
R 12
YCX 2 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X3
E(3.12647e-05) X2 X3 X12
E(3.12647e-05) X2 Y3
E(3.12647e-05) X2 Y3 X12
E(3.12647e-05) X2 Z3
E(3.12647e-05) X2 Z3 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X3
E(3.12647e-05) Y2 X3 X12
E(3.12647e-05) Y2 Y3
E(3.12647e-05) Y2 Y3 X12
E(3.12647e-05) Y2 Z3
E(3.12647e-05) Y2 Z3 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X3
E(3.12647e-05) Z2 X3 X12
E(3.12647e-05) Z2 Y3
E(3.12647e-05) Z2 Y3 X12
E(3.12647e-05) Z2 Z3
E(3.12647e-05) Z2 Z3 X12
M 12
R 12
YCX 0 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X5
E(3.12647e-05) X0 X5 X12
E(3.12647e-05) X0 Y5
E(3.12647e-05) X0 Y5 X12
E(3.12647e-05) X0 Z5
E(3.12647e-05) X0 Z5 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X5
E(3.12647e-05) Y0 X5 X12
E(3.12647e-05) Y0 Y5
E(3.12647e-05) Y0 Y5 X12
E(3.12647e-05) Y0 Z5
E(3.12647e-05) Y0 Z5 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X5
E(3.12647e-05) Z0 X5 X12
E(3.12647e-05) Z0 Y5
E(3.12647e-05) Z0 Y5 X12
E(3.12647e-05) Z0 Z5
E(3.12647e-05) Z0 Z5 X12
M 12
R 12
YCX 4 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X10
E(3.12647e-05) X4 X10 X12
E(3.12647e-05) X4 Y10
E(3.12647e-05) X4 Y10 X12
E(3.12647e-05) X4 Z10
E(3.12647e-05) X4 Z10 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X10
E(3.12647e-05) Y4 X10 X12
E(3.12647e-05) Y4 Y10
E(3.12647e-05) Y4 Y10 X12
E(3.12647e-05) Y4 Z10
E(3.12647e-05) Y4 Z10 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X10
E(3.12647e-05) Z4 X10 X12
E(3.12647e-05) Z4 Y10
E(3.12647e-05) Z4 Y10 X12
E(3.12647e-05) Z4 Z10
E(3.12647e-05) Z4 Z10 X12
M 12
R 12
YCX 9 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X8
E(3.12647e-05) X9 X8 X12
E(3.12647e-05) X9 Y8
E(3.12647e-05) X9 Y8 X12
E(3.12647e-05) X9 Z8
E(3.12647e-05) X9 Z8 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X8
E(3.12647e-05) Y9 X8 X12
E(3.12647e-05) Y9 Y8
E(3.12647e-05) Y9 Y8 X12
E(3.12647e-05) Y9 Z8
E(3.12647e-05) Y9 Z8 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X8
E(3.12647e-05) Z9 X8 X12
E(3.12647e-05) Z9 Y8
E(3.12647e-05) Z9 Y8 X12
E(3.12647e-05) Z9 Z8
E(3.12647e-05) Z9 Z8 X12
M 12
R 12
YCX 11 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X6
E(3.12647e-05) X11 X6 X12
E(3.12647e-05) X11 Y6
E(3.12647e-05) X11 Y6 X12
E(3.12647e-05) X11 Z6
E(3.12647e-05) X11 Z6 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X6
E(3.12647e-05) Y11 X6 X12
E(3.12647e-05) Y11 Y6
E(3.12647e-05) Y11 Y6 X12
E(3.12647e-05) Y11 Z6
E(3.12647e-05) Y11 Z6 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X6
E(3.12647e-05) Z11 X6 X12
E(3.12647e-05) Z11 Y6
E(3.12647e-05) Z11 Y6 X12
E(3.12647e-05) Z11 Z6
E(3.12647e-05) Z11 Z6 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
R 12
CX 11 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X5
E(3.12647e-05) X11 X5 X12
E(3.12647e-05) X11 Y5
E(3.12647e-05) X11 Y5 X12
E(3.12647e-05) X11 Z5
E(3.12647e-05) X11 Z5 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X5
E(3.12647e-05) Y11 X5 X12
E(3.12647e-05) Y11 Y5
E(3.12647e-05) Y11 Y5 X12
E(3.12647e-05) Y11 Z5
E(3.12647e-05) Y11 Z5 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X5
E(3.12647e-05) Z11 X5 X12
E(3.12647e-05) Z11 Y5
E(3.12647e-05) Z11 Y5 X12
E(3.12647e-05) Z11 Z5
E(3.12647e-05) Z11 Z5 X12
M 12
R 12
CX 0 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X1
E(3.12647e-05) X0 X1 X12
E(3.12647e-05) X0 Y1
E(3.12647e-05) X0 Y1 X12
E(3.12647e-05) X0 Z1
E(3.12647e-05) X0 Z1 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X1
E(3.12647e-05) Y0 X1 X12
E(3.12647e-05) Y0 Y1
E(3.12647e-05) Y0 Y1 X12
E(3.12647e-05) Y0 Z1
E(3.12647e-05) Y0 Z1 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X1
E(3.12647e-05) Z0 X1 X12
E(3.12647e-05) Z0 Y1
E(3.12647e-05) Z0 Y1 X12
E(3.12647e-05) Z0 Z1
E(3.12647e-05) Z0 Z1 X12
M 12
R 12
CX 4 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X3
E(3.12647e-05) X4 X3 X12
E(3.12647e-05) X4 Y3
E(3.12647e-05) X4 Y3 X12
E(3.12647e-05) X4 Z3
E(3.12647e-05) X4 Z3 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X3
E(3.12647e-05) Y4 X3 X12
E(3.12647e-05) Y4 Y3
E(3.12647e-05) Y4 Y3 X12
E(3.12647e-05) Y4 Z3
E(3.12647e-05) Y4 Z3 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X3
E(3.12647e-05) Z4 X3 X12
E(3.12647e-05) Z4 Y3
E(3.12647e-05) Z4 Y3 X12
E(3.12647e-05) Z4 Z3
E(3.12647e-05) Z4 Z3 X12
M 12
R 12
CX 2 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X8
E(3.12647e-05) X2 X8 X12
E(3.12647e-05) X2 Y8
E(3.12647e-05) X2 Y8 X12
E(3.12647e-05) X2 Z8
E(3.12647e-05) X2 Z8 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X8
E(3.12647e-05) Y2 X8 X12
E(3.12647e-05) Y2 Y8
E(3.12647e-05) Y2 Y8 X12
E(3.12647e-05) Y2 Z8
E(3.12647e-05) Y2 Z8 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X8
E(3.12647e-05) Z2 X8 X12
E(3.12647e-05) Z2 Y8
E(3.12647e-05) Z2 Y8 X12
E(3.12647e-05) Z2 Z8
E(3.12647e-05) Z2 Z8 X12
M 12
R 12
CX 7 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X6
E(3.12647e-05) X7 X6 X12
E(3.12647e-05) X7 Y6
E(3.12647e-05) X7 Y6 X12
E(3.12647e-05) X7 Z6
E(3.12647e-05) X7 Z6 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X6
E(3.12647e-05) Y7 X6 X12
E(3.12647e-05) Y7 Y6
E(3.12647e-05) Y7 Y6 X12
E(3.12647e-05) Y7 Z6
E(3.12647e-05) Y7 Z6 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X6
E(3.12647e-05) Z7 X6 X12
E(3.12647e-05) Z7 Y6
E(3.12647e-05) Z7 Y6 X12
E(3.12647e-05) Z7 Z6
E(3.12647e-05) Z7 Z6 X12
M 12
R 12
CX 9 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X10
E(3.12647e-05) X9 X10 X12
E(3.12647e-05) X9 Y10
E(3.12647e-05) X9 Y10 X12
E(3.12647e-05) X9 Z10
E(3.12647e-05) X9 Z10 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X10
E(3.12647e-05) Y9 X10 X12
E(3.12647e-05) Y9 Y10
E(3.12647e-05) Y9 Y10 X12
E(3.12647e-05) Y9 Z10
E(3.12647e-05) Y9 Z10 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X10
E(3.12647e-05) Z9 X10 X12
E(3.12647e-05) Z9 Y10
E(3.12647e-05) Z9 Y10 X12
E(3.12647e-05) Z9 Z10
E(3.12647e-05) Z9 Z10 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
R 12
XCX 9 12 3 12
E(3.12647e-05) X12
E(3.12647e-05) X3
E(3.12647e-05) X3 X12
E(3.12647e-05) Y3
E(3.12647e-05) Y3 X12
E(3.12647e-05) Z3
E(3.12647e-05) Z3 X12
E(3.12647e-05) X9
E(3.12647e-05) X9 X12
E(3.12647e-05) X9 X3
E(3.12647e-05) X9 X3 X12
E(3.12647e-05) X9 Y3
E(3.12647e-05) X9 Y3 X12
E(3.12647e-05) X9 Z3
E(3.12647e-05) X9 Z3 X12
E(3.12647e-05) Y9
E(3.12647e-05) Y9 X12
E(3.12647e-05) Y9 X3
E(3.12647e-05) Y9 X3 X12
E(3.12647e-05) Y9 Y3
E(3.12647e-05) Y9 Y3 X12
E(3.12647e-05) Y9 Z3
E(3.12647e-05) Y9 Z3 X12
E(3.12647e-05) Z9
E(3.12647e-05) Z9 X12
E(3.12647e-05) Z9 X3
E(3.12647e-05) Z9 X3 X12
E(3.12647e-05) Z9 Y3
E(3.12647e-05) Z9 Y3 X12
E(3.12647e-05) Z9 Z3
E(3.12647e-05) Z9 Z3 X12
M 12
R 12
XCX 2 12 1 12
E(3.12647e-05) X12
E(3.12647e-05) X1
E(3.12647e-05) X1 X12
E(3.12647e-05) Y1
E(3.12647e-05) Y1 X12
E(3.12647e-05) Z1
E(3.12647e-05) Z1 X12
E(3.12647e-05) X2
E(3.12647e-05) X2 X12
E(3.12647e-05) X2 X1
E(3.12647e-05) X2 X1 X12
E(3.12647e-05) X2 Y1
E(3.12647e-05) X2 Y1 X12
E(3.12647e-05) X2 Z1
E(3.12647e-05) X2 Z1 X12
E(3.12647e-05) Y2
E(3.12647e-05) Y2 X12
E(3.12647e-05) Y2 X1
E(3.12647e-05) Y2 X1 X12
E(3.12647e-05) Y2 Y1
E(3.12647e-05) Y2 Y1 X12
E(3.12647e-05) Y2 Z1
E(3.12647e-05) Y2 Z1 X12
E(3.12647e-05) Z2
E(3.12647e-05) Z2 X12
E(3.12647e-05) Z2 X1
E(3.12647e-05) Z2 X1 X12
E(3.12647e-05) Z2 Y1
E(3.12647e-05) Z2 Y1 X12
E(3.12647e-05) Z2 Z1
E(3.12647e-05) Z2 Z1 X12
M 12
R 12
XCX 4 12 5 12
E(3.12647e-05) X12
E(3.12647e-05) X5
E(3.12647e-05) X5 X12
E(3.12647e-05) Y5
E(3.12647e-05) Y5 X12
E(3.12647e-05) Z5
E(3.12647e-05) Z5 X12
E(3.12647e-05) X4
E(3.12647e-05) X4 X12
E(3.12647e-05) X4 X5
E(3.12647e-05) X4 X5 X12
E(3.12647e-05) X4 Y5
E(3.12647e-05) X4 Y5 X12
E(3.12647e-05) X4 Z5
E(3.12647e-05) X4 Z5 X12
E(3.12647e-05) Y4
E(3.12647e-05) Y4 X12
E(3.12647e-05) Y4 X5
E(3.12647e-05) Y4 X5 X12
E(3.12647e-05) Y4 Y5
E(3.12647e-05) Y4 Y5 X12
E(3.12647e-05) Y4 Z5
E(3.12647e-05) Y4 Z5 X12
E(3.12647e-05) Z4
E(3.12647e-05) Z4 X12
E(3.12647e-05) Z4 X5
E(3.12647e-05) Z4 X5 X12
E(3.12647e-05) Z4 Y5
E(3.12647e-05) Z4 Y5 X12
E(3.12647e-05) Z4 Z5
E(3.12647e-05) Z4 Z5 X12
M 12
R 12
XCX 0 12 6 12
E(3.12647e-05) X12
E(3.12647e-05) X6
E(3.12647e-05) X6 X12
E(3.12647e-05) Y6
E(3.12647e-05) Y6 X12
E(3.12647e-05) Z6
E(3.12647e-05) Z6 X12
E(3.12647e-05) X0
E(3.12647e-05) X0 X12
E(3.12647e-05) X0 X6
E(3.12647e-05) X0 X6 X12
E(3.12647e-05) X0 Y6
E(3.12647e-05) X0 Y6 X12
E(3.12647e-05) X0 Z6
E(3.12647e-05) X0 Z6 X12
E(3.12647e-05) Y0
E(3.12647e-05) Y0 X12
E(3.12647e-05) Y0 X6
E(3.12647e-05) Y0 X6 X12
E(3.12647e-05) Y0 Y6
E(3.12647e-05) Y0 Y6 X12
E(3.12647e-05) Y0 Z6
E(3.12647e-05) Y0 Z6 X12
E(3.12647e-05) Z0
E(3.12647e-05) Z0 X12
E(3.12647e-05) Z0 X6
E(3.12647e-05) Z0 X6 X12
E(3.12647e-05) Z0 Y6
E(3.12647e-05) Z0 Y6 X12
E(3.12647e-05) Z0 Z6
E(3.12647e-05) Z0 Z6 X12
M 12
R 12
XCX 7 12 8 12
E(3.12647e-05) X12
E(3.12647e-05) X8
E(3.12647e-05) X8 X12
E(3.12647e-05) Y8
E(3.12647e-05) Y8 X12
E(3.12647e-05) Z8
E(3.12647e-05) Z8 X12
E(3.12647e-05) X7
E(3.12647e-05) X7 X12
E(3.12647e-05) X7 X8
E(3.12647e-05) X7 X8 X12
E(3.12647e-05) X7 Y8
E(3.12647e-05) X7 Y8 X12
E(3.12647e-05) X7 Z8
E(3.12647e-05) X7 Z8 X12
E(3.12647e-05) Y7
E(3.12647e-05) Y7 X12
E(3.12647e-05) Y7 X8
E(3.12647e-05) Y7 X8 X12
E(3.12647e-05) Y7 Y8
E(3.12647e-05) Y7 Y8 X12
E(3.12647e-05) Y7 Z8
E(3.12647e-05) Y7 Z8 X12
E(3.12647e-05) Z7
E(3.12647e-05) Z7 X12
E(3.12647e-05) Z7 X8
E(3.12647e-05) Z7 X8 X12
E(3.12647e-05) Z7 Y8
E(3.12647e-05) Z7 Y8 X12
E(3.12647e-05) Z7 Z8
E(3.12647e-05) Z7 Z8 X12
M 12
R 12
XCX 11 12 10 12
E(3.12647e-05) X12
E(3.12647e-05) X10
E(3.12647e-05) X10 X12
E(3.12647e-05) Y10
E(3.12647e-05) Y10 X12
E(3.12647e-05) Z10
E(3.12647e-05) Z10 X12
E(3.12647e-05) X11
E(3.12647e-05) X11 X12
E(3.12647e-05) X11 X10
E(3.12647e-05) X11 X10 X12
E(3.12647e-05) X11 Y10
E(3.12647e-05) X11 Y10 X12
E(3.12647e-05) X11 Z10
E(3.12647e-05) X11 Z10 X12
E(3.12647e-05) Y11
E(3.12647e-05) Y11 X12
E(3.12647e-05) Y11 X10
E(3.12647e-05) Y11 X10 X12
E(3.12647e-05) Y11 Y10
E(3.12647e-05) Y11 Y10 X12
E(3.12647e-05) Y11 Z10
E(3.12647e-05) Y11 Z10 X12
E(3.12647e-05) Z11
E(3.12647e-05) Z11 X12
E(3.12647e-05) Z11 X10
E(3.12647e-05) Z11 X10 X12
E(3.12647e-05) Z11 Y10
E(3.12647e-05) Z11 Y10 X12
E(3.12647e-05) Z11 Z10
E(3.12647e-05) Z11 Z10 X12
M 12
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
TICK
}
H_YZ 0 1 2 3 4 5 6 7 8 9 10 11
TICK
X_ERROR(0.0005) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
DETECTOR(0, 2, 0) rec[-36] rec[-35] rec[-32] rec[-30] rec[-29] rec[-26] rec[-18] rec[-17] rec[-14] rec[-11] rec[-10] rec[-9] rec[-5] rec[-4] rec[-3]
DETECTOR(2, 5, 0) rec[-34] rec[-33] rec[-31] rec[-28] rec[-27] rec[-25] rec[-16] rec[-15] rec[-13] rec[-12] rec[-8] rec[-7] rec[-6] rec[-2] rec[-1]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-18] rec[-16] rec[-13] rec[-9] rec[-8] rec[-7] rec[-3] rec[-2] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-17] rec[-15] rec[-14] rec[-12] rec[-11] rec[-10] rec[-6] rec[-5] rec[-4]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10] rec[-8] rec[-7]
""")
def test_circuit_details_EM3_h_obs():
actual = generate_honeycomb_circuit(HoneycombLayout(
data_width=2,
data_height=6,
sub_rounds=1003,
noise=0.001,
style="EM3",
obs="H",
))
cleaned = stim.Circuit(str(actual))
assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
R 0 1 2 3 4 5 6 7 8 9 10 11
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
H 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
# X subround. Compare X parities to X initializations.
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(1) rec[-3]
DETECTOR(0, 3, 0) rec[-6]
DETECTOR(1, 1.5, 0) rec[-5]
DETECTOR(1, 4.5, 0) rec[-4]
DETECTOR(2, 0, 0) rec[-3]
DETECTOR(3, 1.5, 0) rec[-2]
DETECTOR(3, 4.5, 0) rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
# Y subround. Get X*Y=Z stabilizers for first time.
DEPOLARIZE2(0.001) 7 1 2 3 0 5 4 10 9 8 11 6
MPP(0.001) Y7*Y1 Y2*Y3 Y0*Y5 Y4*Y10 Y9*Y8 Y11*Y6
OBSERVABLE_INCLUDE(1) rec[-6]
SHIFT_COORDS(0, 0, 1)
TICK
# Z subround. Get Y*Z=X stabilizers to compare against initialization.
DEPOLARIZE2(0.001) 11 5 0 1 4 3 2 8 7 6 9 10
MPP(0.001) Z11*Z5 Z0*Z1 Z4*Z3 Z2*Z8 Z7*Z6 Z9*Z10
OBSERVABLE_INCLUDE(1) rec[-5] rec[-2]
DETECTOR(0, 0, 0) rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
# X subround. Get Z*X=Y stabilizers for the first time.
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(1) rec[-3]
SHIFT_COORDS(0, 0, 1)
TICK
REPEAT 333 {
# Y subround. Get X*Y = Z stabilizers to compare against last time.
DEPOLARIZE2(0.001) 7 1 2 3 0 5 4 10 9 8 11 6
MPP(0.001) Y7*Y1 Y2*Y3 Y0*Y5 Y4*Y10 Y9*Y8 Y11*Y6
OBSERVABLE_INCLUDE(1) rec[-6]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
# Z subround. Get Y*Z = X stabilizers to compare against last time.
DEPOLARIZE2(0.001) 11 5 0 1 4 3 2 8 7 6 9 10
MPP(0.001) Z11*Z5 Z0*Z1 Z4*Z3 Z2*Z8 Z7*Z6 Z9*Z10
OBSERVABLE_INCLUDE(1) rec[-5] rec[-2]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
TICK
# X subround. Get Z*X = Y stabilizers to compare against last time.
DEPOLARIZE2(0.001) 9 3 2 1 4 5 0 6 7 8 11 10
MPP(0.001) X9*X3 X2*X1 X4*X5 X0*X6 X7*X8 X11*X10
OBSERVABLE_INCLUDE(1) rec[-3]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
TICK
}
H 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
X_ERROR(0.001) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
# Compare X data measurements to X parity measurements from last subround.
DETECTOR(0, 3, 0) rec[-18] rec[-9] rec[-3]
DETECTOR(1, 1.5, 0) rec[-17] rec[-11] rec[-10]
DETECTOR(1, 4.5, 0) rec[-16] rec[-8] rec[-7]
DETECTOR(2, 0, 0) rec[-15] rec[-12] rec[-6]
DETECTOR(3, 1.5, 0) rec[-14] rec[-5] rec[-4]
DETECTOR(3, 4.5, 0) rec[-13] rec[-2] rec[-1]
# Compare X data measurements to previous X stabilizer reconstruction.
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-7] rec[-6] rec[-5] rec[-1]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-8] rec[-4] rec[-3] rec[-2]
OBSERVABLE_INCLUDE(1) rec[-11] rec[-5]
""")

def test_circuit_details_SI1000():
    actual = generate_honeycomb_circuit(HoneycombLayout(
        data_width=2,
        data_height=6,
        sub_rounds=3 * 300,
        noise=0.001,
        style="SI1000",
        obs="V",
    ))
    cleaned = stim.Circuit(str(actual))
    assert cleaned == stim.Circuit("""
QUBIT_COORDS(1, 0) 0
QUBIT_COORDS(1, 1) 1
QUBIT_COORDS(1, 2) 2
QUBIT_COORDS(1, 3) 3
QUBIT_COORDS(1, 4) 4
QUBIT_COORDS(1, 5) 5
QUBIT_COORDS(3, 0) 6
QUBIT_COORDS(3, 1) 7
QUBIT_COORDS(3, 2) 8
QUBIT_COORDS(3, 3) 9
QUBIT_COORDS(3, 4) 10
QUBIT_COORDS(3, 5) 11
QUBIT_COORDS(0, 1) 12
QUBIT_COORDS(0, 3) 13
QUBIT_COORDS(0, 5) 14
QUBIT_COORDS(1, 0.5) 15
QUBIT_COORDS(1, 1.5) 16
QUBIT_COORDS(1, 2.5) 17
QUBIT_COORDS(1, 3.5) 18
QUBIT_COORDS(1, 4.5) 19
QUBIT_COORDS(1, 5.5) 20
QUBIT_COORDS(2, 0) 21
QUBIT_COORDS(2, 2) 22
QUBIT_COORDS(2, 4) 23
QUBIT_COORDS(3, 0.5) 24
QUBIT_COORDS(3, 1.5) 25
QUBIT_COORDS(3, 2.5) 26
QUBIT_COORDS(3, 3.5) 27
QUBIT_COORDS(3, 4.5) 28
QUBIT_COORDS(3, 5.5) 29
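# SI1000 noise scaling at p = 0.001: resets flip with probability 2p, measurements
# flip with 5p, CZ gates depolarize with p, idling depolarizes with p/10, and data
# qubits take an extra 2p of depolarization while ancillae are measured and reset.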
R 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
X_ERROR(0.002) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
TICK
C_ZYX 0 2 4 7 9 11
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 1 3 5 6 8 10
TICK
# X sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
# X sub-round part 2
C_ZYX 0 2 4 7 9 11
CZ 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
# Y sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
# Y sub-round part 2
C_ZYX 0 2 4 7 9 11
CZ 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
# Z sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 12 13 16 17 19 20 21 23 25 26 28 29
TICK
# Z sub-round part 2
CZ 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 16 17 19 20 21 23 25 26 28 29
TICK
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 0 1 2 3 4 5 6 7 8 9 10 11
TICK
# Finish first round.
X_ERROR(0.005) 13 16 19 21 25 28 12 17 20 23 26 29 14 15 18 22 24 27
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE1(0.0001) 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.002) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
R 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
X_ERROR(0.002) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.002) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
C_ZYX 0 2 4 7 9 11
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 1 3 5 6 8 10
TICK
# X sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
# X sub-round part 2
C_ZYX 0 2 4 7 9 11
CZ 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
# Y sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
# Y sub-round part 2
C_ZYX 0 2 4 7 9 11
CZ 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
# Z sub-round part 1
C_ZYX 1 3 5 6 8 10
CZ 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 12 13 16 17 19 20 21 23 25 26 28 29
TICK
# Z sub-round part 2
CZ 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 16 17 19 20 21 23 25 26 28 29
TICK
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 0 1 2 3 4 5 6 7 8 9 10 11
TICK
# Finish second round.
X_ERROR(0.005) 13 16 19 21 25 28 12 17 20 23 26 29 14 15 18 22 24 27
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE1(0.0001) 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.002) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
# Now in stable state for cross-round comparisons. Use a loop.
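# (Two rounds are unrolled above so the cross-round detectors settle into a
# fixed pattern of measurement-record offsets; the 298 iterations below plus
# those two explicit rounds give 300 measurement rounds in total.)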
REPEAT 298 {
R 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
X_ERROR(0.002) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.002) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
C_ZYX 0 2 4 7 9 11
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 1 3 5 6 8 10
TICK
C_ZYX 1 3 5 6 8 10
CZ 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 9 13 2 16 4 19 0 21 7 25 11 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
C_ZYX 0 2 4 7 9 11
CZ 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 3 13 1 16 5 19 6 21 8 25 10 28
DEPOLARIZE1(0.0001) 12 14 15 17 18 20 22 23 24 26 27 29
TICK
C_ZYX 1 3 5 6 8 10
CZ 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 7 12 2 17 0 20 4 23 9 26 11 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
C_ZYX 0 2 4 7 9 11
CZ 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 0 2 4 7 9 11
DEPOLARIZE2(0.001) 1 12 3 17 5 20 10 23 8 26 6 29
DEPOLARIZE1(0.0001) 13 14 15 16 18 19 21 22 24 25 27 28
TICK
C_ZYX 1 3 5 6 8 10
CZ 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 1 3 5 6 8 10
DEPOLARIZE2(0.001) 11 14 0 15 4 18 2 22 7 24 9 27
DEPOLARIZE1(0.0001) 12 13 16 17 19 20 21 23 25 26 28 29
TICK
CZ 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE2(0.001) 5 14 1 15 3 18 8 22 6 24 10 27
DEPOLARIZE1(0.0001) 0 2 4 7 9 11 12 13 16 17 19 20 21 23 25 26 28 29
TICK
H 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.0001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 0 1 2 3 4 5 6 7 8 9 10 11
TICK
X_ERROR(0.005) 13 16 19 21 25 28 12 17 20 23 26 29 14 15 18 22 24 27
M 13 16 19 21 25 28
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 4, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-22] rec[-19] rec[-12] rec[-10] rec[-7] rec[-6] rec[-4] rec[-1]
DETECTOR(2, 1, 0) rec[-29] rec[-27] rec[-26] rec[-23] rec[-21] rec[-20] rec[-11] rec[-9] rec[-8] rec[-5] rec[-3] rec[-2]
SHIFT_COORDS(0, 0, 1)
M 12 17 20 23 26 29
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-12] rec[-11] rec[-8] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-10] rec[-9] rec[-7] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
M 14 15 18 22 24 27
OBSERVABLE_INCLUDE(0) rec[-5] rec[-4]
DETECTOR(0, 0, 0) rec[-30] rec[-28] rec[-25] rec[-24] rec[-23] rec[-20] rec[-12] rec[-10] rec[-7] rec[-6] rec[-5] rec[-2]
DETECTOR(2, 3, 0) rec[-29] rec[-27] rec[-26] rec[-22] rec[-21] rec[-19] rec[-11] rec[-9] rec[-8] rec[-4] rec[-3] rec[-1]
SHIFT_COORDS(0, 0, 1)
DEPOLARIZE1(0.0001) 0 1 2 3 4 5 6 7 8 9 10 11
DEPOLARIZE1(0.002) 0 1 2 3 4 5 6 7 8 9 10 11
TICK
}
# Data measurement.
X_ERROR(0.005) 0 1 2 3 4 5 6 7 8 9 10 11
M 0 1 2 3 4 5 6 7 8 9 10 11
DETECTOR(0, 5, 0) rec[-18] rec[-7] rec[-1]
DETECTOR(1, 0.5, 0) rec[-17] rec[-12] rec[-11]
DETECTOR(1, 3.5, 0) rec[-16] rec[-9] rec[-8]
DETECTOR(2, 2, 0) rec[-15] rec[-10] rec[-4]
DETECTOR(3, 0.5, 0) rec[-14] rec[-6] rec[-5]
DETECTOR(3, 3.5, 0) rec[-13] rec[-3] rec[-2]
DETECTOR(0, 2, 0) rec[-30] rec[-29] rec[-26] rec[-24] rec[-23] rec[-20] rec[-11] rec[-10] rec[-9] rec[-5] rec[-4] rec[-3]
DETECTOR(2, 5, 0) rec[-28] rec[-27] rec[-25] rec[-22] rec[-21] rec[-19] rec[-12] rec[-8] rec[-7] rec[-6] rec[-2] rec[-1]
OBSERVABLE_INCLUDE(0) rec[-11] rec[-10] rec[-8] rec[-7]
DEPOLARIZE1(0.0001) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
DEPOLARIZE1(0.002) 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29
""")
| 35.8568 | 184 | 0.493786 | 18,341 | 89,642 | 2.391473 | 0.012322 | 0.059368 | 0.237472 | 0.29684 | 0.974032 | 0.969997 | 0.967329 | 0.961105 | 0.958894 | 0.956546 | 0 | 0.463975 | 0.3799 | 89,642 | 2,499 | 185 | 35.871148 | 0.325094 | 0 | 0 | 0.954755 | 1 | 0.077503 | 0.969635 | 0.012026 | 0 | 0 | 0 | 0 | 0.002514 | 1 | 0.002933 | false | 0 | 0.002095 | 0 | 0.005446 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
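The block above is the tail of a Stim circuit (a code cycled through X/Y/Z sub-rounds with per-gate depolarizing noise) embedded in a Python string. A minimal sketch, assuming the `stim` package is installed and the full circuit text is bound to a variable `circuit_text`, of how such a circuit is typically parsed and its detector events sampled:

import stim

circuit = stim.Circuit(circuit_text)          # parse the circuit from its text form
sampler = circuit.compile_detector_sampler()  # compile a fast detector-event sampler
events = sampler.sample(shots=1000)           # boolean array, shape (shots, num_detectors)
print(events.shape)
dem = circuit.detector_error_model()          # error model a decoder can consume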
313ac0eca60b46238a9845c8c893b92601c6b887 | 196 | py | Python | road_reader_project/map/views.py | SaladSaad/Django-RoadReader-Site | 57c0ba582083476861d6aa90cbe74498b02fb536 | [
"bzip2-1.0.6"
] | null | null | null | road_reader_project/map/views.py | SaladSaad/Django-RoadReader-Site | 57c0ba582083476861d6aa90cbe74498b02fb536 | [
"bzip2-1.0.6"
] | null | null | null | road_reader_project/map/views.py | SaladSaad/Django-RoadReader-Site | 57c0ba582083476861d6aa90cbe74498b02fb536 | [
"bzip2-1.0.6"
] | null | null | null | from django.shortcuts import render
def map(request):  # note: this view name shadows Python's built-in map() within the module
    return render(request, 'map.html', {'title': 'Map'})


def extra(request):
    return render(request, 'extra.html', {'title': 'Extra'})
| 19.6 | 60 | 0.668367 | 25 | 196 | 5.24 | 0.48 | 0.198473 | 0.290076 | 0.396947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153061 | 196 | 9 | 61 | 21.777778 | 0.789157 | 0 | 0 | 0 | 0 | 0 | 0.183673 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
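A hypothetical urls.py sketch (route paths and names assumed; they are not in the repo) showing how these two views would typically be wired into a Django project:

from django.urls import path

from . import views

urlpatterns = [
    path('map/', views.map, name='map'),        # serves map.html
    path('extra/', views.extra, name='extra'),  # serves extra.html
]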
31a58b55c4759abe9f7d62e5d0e212c5e898dcfc | 1,661 | py | Python | images/Ch02/02_02 End/02_02.py | mutazag/cv | f5693772bda4e2611808d862756bd9234f02176e | [
"MIT"
] | 1 | 2020-08-06T12:03:40.000Z | 2020-08-06T12:03:40.000Z | images/Ch02/02_02 End/02_02.py | mutazag/cv | f5693772bda4e2611808d862756bd9234f02176e | [
"MIT"
] | null | null | null | images/Ch02/02_02 End/02_02.py | mutazag/cv | f5693772bda4e2611808d862756bd9234f02176e | [
"MIT"
] | 1 | 2020-08-10T07:56:24.000Z | 2020-08-10T07:56:24.000Z | >>> import numpy as np
>>> import cv2
>>> img = cv2.imread("opencv-logo.png", 1)
>>>
>>> img
array([[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]],
[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]],
[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]],
...,
[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]],
[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]],
[[255, 255, 255],
[255, 255, 255],
[255, 255, 255],
...,
[255, 255, 255],
[255, 255, 255],
[255, 255, 255]]], dtype=uint8)
>>> type(img)
<class 'numpy.ndarray'>
>>> len(img)
739
>>> len(img[0])
600
>>> len(img[0][0])
3
>>> img.shape
(739, 600, 3)
>>> img.dtype
dtype('uint8')
>>> 2**8
256
>>> img[10, 5]
array([255, 255, 255], dtype=uint8)
>>> img[:, :, 0]
array([[255, 255, 255, ..., 255, 255, 255],
[255, 255, 255, ..., 255, 255, 255],
[255, 255, 255, ..., 255, 255, 255],
...,
[255, 255, 255, ..., 255, 255, 255],
[255, 255, 255, ..., 255, 255, 255],
[255, 255, 255, ..., 255, 255, 255]], dtype=uint8)
>>> img.size
1330200 | 21.025316 | 57 | 0.393137 | 207 | 1,661 | 3.154589 | 0.149758 | 1.323124 | 1.943338 | 2.535988 | 0.745789 | 0.745789 | 0.707504 | 0.707504 | 0.707504 | 0.707504 | 0 | 0.452919 | 0.360626 | 1,661 | 79 | 58 | 21.025316 | 0.161959 | 0 | 0 | 0.621622 | 0 | 0 | 0.019856 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.027027 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
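The transcript above is an interactive session; a minimal script version, assuming "opencv-logo.png" sits in the working directory, reproduces the same inspection steps:

import cv2

img = cv2.imread("opencv-logo.png", 1)  # 1 = load as a 3-channel BGR image
if img is None:                          # imread returns None instead of raising
    raise FileNotFoundError("opencv-logo.png not found")
print(img.shape)     # (rows, cols, channels), e.g. (739, 600, 3)
print(img.dtype)     # uint8: 2**8 = 256 levels per channel
print(img[10, 5])    # BGR triple at row 10, column 5
print(img[:, :, 0])  # channel 0 is blue in OpenCV's BGR ordering
print(img.size)      # rows * cols * channels, e.g. 739 * 600 * 3 = 1330200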
9ec8011304bc399db9963d297b6400ed156cd6c4 | 5,000 | py | Python | csdl/tests/test_max.py | LSDOlab/csdl | 04c2c5764f6ca9b865ec87ecfeaf6f22ecacc5a3 | [
"MIT"
] | null | null | null | csdl/tests/test_max.py | LSDOlab/csdl | 04c2c5764f6ca9b865ec87ecfeaf6f22ecacc5a3 | [
"MIT"
] | null | null | null | csdl/tests/test_max.py | LSDOlab/csdl | 04c2c5764f6ca9b865ec87ecfeaf6f22ecacc5a3 | [
"MIT"
] | 1 | 2021-10-04T19:40:32.000Z | 2021-10-04T19:40:32.000Z | import numpy as np
import pytest
def test_max_scalar(backend):
from csdl.examples.valid.ex_max_scalar import example
exec('from {} import Simulator'.format(backend))
sim = example(eval('Simulator'))
m = 2
n = 3
o = 4
p = 5
q = 6
tensor_shape = (m, n, o, p, q)
num_of_elements = np.prod(tensor_shape)
tensor = np.arange(num_of_elements).reshape(tensor_shape)
    # SCALAR MAX
desired_output = np.max(tensor)
np.testing.assert_almost_equal(sim['ScalarMax'], desired_output)
assert sim['ScalarMax'].shape == (1, ), sim['ScalarMax'].shape
partials_error = sim.check_partials(includes=['comp_ScalarMax'],
out_stream=None,
compact_print=True,
method='cs')
sim.assert_check_partials(partials_error, atol=1.e-6, rtol=1.e-6)
def test_max_axiswise(backend):
from csdl.examples.valid.ex_max_axiswise import example
exec('from {} import Simulator'.format(backend))
sim = example(eval('Simulator'))
m = 2
n = 3
o = 4
p = 5
q = 6
tensor_shape = (m, n, o, p, q)
num_of_elements = np.prod(tensor_shape)
tensor = np.arange(num_of_elements).reshape(tensor_shape)
    # AXISWISE MAX
desired_output = np.amax(tensor, axis=1)
np.testing.assert_almost_equal(sim['AxiswiseMax'], desired_output)
assert sim['AxiswiseMax'].shape == (m, o, p,
q), sim['AxiswiseMax'].shape
partials_error = sim.check_partials(includes=['comp_AxiswiseMax'],
out_stream=None,
compact_print=True,
method='cs')
sim.assert_check_partials(partials_error, atol=1.e-6, rtol=1.e-6)
def test_max_elementwise(backend):
from csdl.examples.valid.ex_max_elementwise import example
exec('from {} import Simulator'.format(backend))
sim = example(eval('Simulator'))
tensor1 = np.array([[1, 5, -8], [10, -3, -5]])
tensor2 = np.array([[2, 6, 9], [-1, 2, 4]])
desired_output = np.maximum(tensor1, tensor2)
np.testing.assert_almost_equal(sim['ElementwiseMax'],
desired_output)
assert sim['ElementwiseMax'].shape == (
2, 3), sim['ElementwiseMax'].shape
partials_error = sim.check_partials(
includes=['comp_ElementwiseMax'],
out_stream=None,
compact_print=True,
method='cs')
sim.assert_check_partials(partials_error, atol=1.e-6, rtol=1.e-6)
def test_max_multi_inputs_and_axis(backend):
exec('from {} import Simulator'.format(backend))
from csdl.examples.invalid.ex_max_multi_inputs_and_axis import example
with pytest.raises(Exception):
example(eval('Simulator'))
def test_max_inputs_not_same_size(backend):
exec('from {} import Simulator'.format(backend))
from csdl.examples.invalid.ex_max_inputs_not_same_size import example
with pytest.raises(Exception):
example(eval('Simulator'))
def test_max_scalar_random(backend):
from csdl.examples.valid.ex_max_scalar_random import example
exec('from {} import Simulator'.format(backend))
sim = example(eval('Simulator'))
m = 2
n = 3
o = 4
p = 5
q = 6
np.random.seed(0)
tensor_shape = (m, n, o, p, q)
num_of_elements = np.prod(tensor_shape)
tensor = np.random.rand(num_of_elements).reshape(tensor_shape)
    # SCALAR MAX
desired_output = np.max(tensor)
np.testing.assert_almost_equal(sim['ScalarMax'], desired_output)
assert sim['ScalarMax'].shape == (1, ), sim['ScalarMax'].shape
partials_error = sim.check_partials(includes=['comp_ScalarMax'],
out_stream=None,
compact_print=True,
method='cs')
sim.assert_check_partials(partials_error, atol=1.e-6, rtol=1.e-6)
def test_max_axiswise_random(backend):
from csdl.examples.valid.ex_max_axiswise_random import example
exec('from {} import Simulator'.format(backend))
sim = example(eval('Simulator'))
m = 2
n = 3
o = 4
p = 5
q = 6
np.random.seed(0)
tensor_shape = (m, n, o, p, q)
num_of_elements = np.prod(tensor_shape)
tensor = np.random.rand(num_of_elements).reshape(tensor_shape)
    # AXISWISE MAX
desired_output = np.amax(tensor, axis=1)
np.testing.assert_almost_equal(sim['AxiswiseMax'], desired_output)
partials_error = sim.check_partials(includes=['comp_AxiswiseMax'],
out_stream=None,
compact_print=True,
method='cs')
sim.assert_check_partials(partials_error, atol=1.e-6, rtol=1.e-6)
assert sim['AxiswiseMax'].shape == (m, o, p,
q), sim['AxiswiseMax'].shape
| 33.112583 | 74 | 0.606 | 636 | 5,000 | 4.556604 | 0.143082 | 0.045549 | 0.010352 | 0.055556 | 0.918565 | 0.89441 | 0.884403 | 0.873016 | 0.797792 | 0.797792 | 0 | 0.017936 | 0.2752 | 5,000 | 150 | 75 | 33.333333 | 0.781733 | 0.0094 | 0 | 0.778761 | 0 | 0 | 0.097413 | 0 | 0 | 0 | 0 | 0 | 0.132743 | 1 | 0.061947 | false | 0 | 0.141593 | 0 | 0.20354 | 0.044248 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
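The reference values these tests check against come straight from NumPy; a small sketch of the three reduction modes exercised above:

import numpy as np

t = np.arange(24).reshape(2, 3, 4)
print(np.max(t))           # scalar max over all elements -> 23
print(np.amax(t, axis=1))  # axiswise max: axis 1 collapsed, result shape (2, 4)

a = np.array([[1, 5, -8], [10, -3, -5]])
b = np.array([[2, 6, 9], [-1, 2, 4]])
print(np.maximum(a, b))    # elementwise max of two same-shape arrays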
9ecbb5239216253d33fdcbf35dba404de430ba97 | 28 | py | Python | lib/solutions/TST/two.py | DPNT-Sourcecode/CHK-wyuk01 | 17b638162cdb10a0eb764d5c8cec4c088c68dfd4 | [
"Apache-2.0"
] | null | null | null | lib/solutions/TST/two.py | DPNT-Sourcecode/CHK-wyuk01 | 17b638162cdb10a0eb764d5c8cec4c088c68dfd4 | [
"Apache-2.0"
] | null | null | null | lib/solutions/TST/two.py | DPNT-Sourcecode/CHK-wyuk01 | 17b638162cdb10a0eb764d5c8cec4c088c68dfd4 | [
"Apache-2.0"
] | null | null | null |
def get():
return 2
| 7 | 13 | 0.464286 | 4 | 28 | 3.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.428571 | 28 | 3 | 14 | 9.333333 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
9ee6e98f29e801ad531b8aaad9845325ec4face8 | 4,888 | py | Python | tests/test_decode_attribute_types.py | jnothman/liac-arff | 45fc0a87fe31e165fd912ed9973c5de3c345787b | [
"MIT"
] | 1 | 2021-05-04T18:01:51.000Z | 2021-05-04T18:01:51.000Z | tests/test_decode_attribute_types.py | jnothman/liac-arff | 45fc0a87fe31e165fd912ed9973c5de3c345787b | [
"MIT"
] | null | null | null | tests/test_decode_attribute_types.py | jnothman/liac-arff | 45fc0a87fe31e165fd912ed9973c5de3c345787b | [
"MIT"
] | null | null | null | import unittest
import arff
class TestDecodeAttributeTypes(unittest.TestCase):
def get_decoder(self):
decoder = arff.ArffDecoder()
return decoder
def test_numeric(self):
'''Numeric attributes.'''
decoder = self.get_decoder()
# Simple case
fixture = u'@ATTRIBUTE attribute-name NUMERIC'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'NUMERIC')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
# Case insensitive
fixture = u'@ATTRIBUTE attribute-name NuMeriC'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'NUMERIC')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
def test_real(self):
'''Real attributes.'''
decoder = self.get_decoder()
# Simple case
fixture = u'@ATTRIBUTE attribute-name REAL'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'REAL')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
# Case insensitive
fixture = u'@ATTRIBUTE attribute-name ReAl'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'REAL')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
def test_integer(self):
'''Integer attributes.'''
decoder = self.get_decoder()
# Simple case
fixture = u'@ATTRIBUTE attribute-name INTEGER'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'INTEGER')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
# Case insensitive
fixture = u'@ATTRIBUTE attribute-name InteGeR'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'INTEGER')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
def test_string(self):
'''String attributes.'''
decoder = self.get_decoder()
# Simple case
fixture = u'@ATTRIBUTE attribute-name STRING'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'STRING')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
# Case insensitive
fixture = u'@ATTRIBUTE attribute-name stRing'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', u'STRING')
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(result[1], expected[1])
def test_nominal(self):
'''Nominal attributes.'''
decoder = self.get_decoder()
# Simple case
fixture = u'@ATTRIBUTE attribute-name {a, b, c}'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', [u'a', u'b', u'c'])
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(len(result[1]), 3)
self.assertEqual(result[1][0], expected[1][0])
self.assertEqual(result[1][1], expected[1][1])
self.assertEqual(result[1][2], expected[1][2])
# Quoted/Spaced/Number case
fixture = u'@ATTRIBUTE attribute-name {"name with spce", 1, lol,2 }'
result = decoder._decode_attribute(fixture)
expected = (u'attribute-name', [u'name with spce', u'1', u'lol', u'2'])
self.assertEqual(len(result), 2)
self.assertEqual(result[0], expected[0])
self.assertEqual(len(result[1]), 4)
self.assertEqual(result[1][0], expected[1][0])
self.assertEqual(result[1][1], expected[1][1])
self.assertEqual(result[1][2], expected[1][2])
self.assertEqual(result[1][3], expected[1][3])
def test_invalid_type(self):
'''Invalid type name or structure.'''
decoder = self.get_decoder()
# Invalid type name
fixture = u'@ATTRIBUTE attribute-name NON-EXIST'
self.assertRaises(
arff.BadAttributeType,
decoder._decode_attribute,
fixture
)
# Invalid nominal structure
fixture = u'@ATTRIBUTE attribute-name {1, 2] 3'
self.assertRaises(
arff.BadAttributeType,
decoder._decode_attribute,
fixture
)
| 33.251701 | 79 | 0.61027 | 555 | 4,888 | 5.306306 | 0.095496 | 0.188455 | 0.178268 | 0.112054 | 0.855348 | 0.834975 | 0.82343 | 0.82343 | 0.779287 | 0.779287 | 0 | 0.02339 | 0.256547 | 4,888 | 146 | 80 | 33.479452 | 0.787012 | 0.066694 | 0 | 0.680412 | 0 | 0 | 0.138969 | 0 | 0 | 0 | 0 | 0 | 0.402062 | 1 | 0.072165 | false | 0 | 0.020619 | 0 | 0.113402 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73130c701e5d5af45b3d0ded6b8193aa169e2835 | 826 | py | Python | venv/Lib/site-packages/cryptography/x509/oid.py | arnoyu-hub/COMP0016miemie | 59af664dcf190eab4f93cefb8471908717415fea | [
"MIT"
] | null | null | null | venv/Lib/site-packages/cryptography/x509/oid.py | arnoyu-hub/COMP0016miemie | 59af664dcf190eab4f93cefb8471908717415fea | [
"MIT"
] | null | null | null | venv/Lib/site-packages/cryptography/x509/oid.py | arnoyu-hub/COMP0016miemie | 59af664dcf190eab4f93cefb8471908717415fea | [
"MIT"
] | null | null | null | # This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from cryptography.hazmat._oid import (
AttributeOID,
AuthorityInformationAccessOID,
CRLEntryExtensionOID,
CertificatePoliciesOID,
ExtendedKeyUsageOID,
ExtensionOID,
NameOID,
OCSPExtensionOID,
ObjectIdentifier,
SignatureAlgorithmOID,
SubjectInformationAccessOID,
)
__all__ = [
"AttributeOID",
"AuthorityInformationAccessOID",
"CRLEntryExtensionOID",
"CertificatePoliciesOID",
"ExtendedKeyUsageOID",
"ExtensionOID",
"NameOID",
"OCSPExtensionOID",
"ObjectIdentifier",
"SignatureAlgorithmOID",
"SubjectInformationAccessOID",
]
| 25.030303 | 80 | 0.700969 | 60 | 826 | 9.566667 | 0.666667 | 0.142857 | 0.212544 | 0.289199 | 0.700348 | 0.700348 | 0.700348 | 0.700348 | 0.700348 | 0.700348 | 0 | 0.003135 | 0.227603 | 826 | 32 | 81 | 25.8125 | 0.896552 | 0.209443 | 0 | 0 | 0 | 0 | 0.32577 | 0.160454 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.038462 | 0 | 0.038462 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
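A minimal sketch of how these re-exported OID constants are typically consumed, here building an X.509 name with cryptography's x509 API (the attribute values are placeholders):

from cryptography import x509
from cryptography.x509.oid import NameOID

name = x509.Name([
    x509.NameAttribute(NameOID.COUNTRY_NAME, u"US"),
    x509.NameAttribute(NameOID.COMMON_NAME, u"example.com"),
])
print(name.rfc4514_string())  # -> "CN=example.com,C=US"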
73380ff1c8d0d23f4e8c5431f0c39cde3895d640 | 37,979 | py | Python | kubernetes/test/test_io_xk8s_cluster_controlplane_v1beta1_aws_managed_control_plane_list.py | mariusgheorghies/python | 68ac7e168963d8b5a81dc493b1973d29e903a15b | [
"Apache-2.0"
] | null | null | null | kubernetes/test/test_io_xk8s_cluster_controlplane_v1beta1_aws_managed_control_plane_list.py | mariusgheorghies/python | 68ac7e168963d8b5a81dc493b1973d29e903a15b | [
"Apache-2.0"
] | null | null | null | kubernetes/test/test_io_xk8s_cluster_controlplane_v1beta1_aws_managed_control_plane_list.py | mariusgheorghies/python | 68ac7e168963d8b5a81dc493b1973d29e903a15b | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Kubernetes
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
The version of the OpenAPI document: v1.20.7
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import unittest
import datetime
import kubernetes.client
from kubernetes.client.models.io_xk8s_cluster_controlplane_v1beta1_aws_managed_control_plane_list import IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList # noqa: E501
from kubernetes.client.rest import ApiException
class TestIoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList(unittest.TestCase):
"""IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList unit test stubs"""
def setUp(self):
pass
def tearDown(self):
pass
def make_instance(self, include_optional):
"""Test IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList
include_option is a boolean, when False only required
params are included, when True both required and
optional params are included """
# model = kubernetes.client.models.io_xk8s_cluster_controlplane_v1beta1_aws_managed_control_plane_list.IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList() # noqa: E501
if include_optional :
return IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList(
api_version = '0',
items = [
kubernetes.client.models.io/x_k8s/cluster/controlplane/v1beta1/aws_managed_control_plane.io.x-k8s.cluster.controlplane.v1beta1.AWSManagedControlPlane(
api_version = '0',
kind = '0',
metadata = kubernetes.client.models.v1/object_meta_v2.v1.ObjectMeta_v2(
annotations = {
'key' : '0'
},
cluster_name = '0',
creation_timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
deletion_grace_period_seconds = 56,
deletion_timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
finalizers = [
'0'
],
generate_name = '0',
generation = 56,
labels = {
'key' : '0'
},
managed_fields = [
kubernetes.client.models.v1/managed_fields_entry.v1.ManagedFieldsEntry(
api_version = '0',
fields_type = '0',
fields_v1 = kubernetes.client.models.fields_v1.fieldsV1(),
manager = '0',
operation = '0',
time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
],
name = '0',
namespace = '0',
owner_references = [
kubernetes.client.models.v1/owner_reference_v2.v1.OwnerReference_v2(
api_version = '0',
block_owner_deletion = True,
controller = True,
kind = '0',
name = '0',
uid = '0', )
],
resource_version = '0',
self_link = '0',
uid = '0', ),
spec = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec(
additional_tags = {
'key' : '0'
},
addons = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_addons.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_addons(
conflict_resolution = 'overwrite',
name = '01',
service_account_role_arn = '0',
version = '0', )
],
associate_oidc_provider = True,
bastion = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_bastion.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_bastion(
allowed_cidr_blocks = [
'0'
],
ami = '0',
disable_ingress_rules = True,
enabled = True,
instance_type = '0', ),
control_plane_endpoint = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_control_plane_endpoint.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_controlPlaneEndpoint(
host = '0',
port = 56, ),
disable_vpccni = True,
eks_cluster_name = '0',
encryption_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_encryption_config.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_encryptionConfig(
provider = '0',
resources = [
'0'
], ),
endpoint_access = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_endpoint_access.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_endpointAccess(
private = True,
public = True,
public_cid_rs = [
'0'
], ),
iam_authenticator_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig(
map_roles = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config_map_roles.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig_mapRoles(
groups = [
'0'
],
rolearn = '0123456789101112131415161718192021222324252627282930',
username = '0', )
],
map_users = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config_map_users.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig_mapUsers(
groups = [
'0'
],
userarn = '0123456789101112131415161718192021222324252627282930',
username = '0', )
], ),
identity_ref = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_identity_ref.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_identityRef(
kind = 'AWSClusterControllerIdentity',
name = '0', ),
image_lookup_base_os = '0',
image_lookup_format = '0',
image_lookup_org = '0',
logging = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_logging.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_logging(
api_server = True,
audit = True,
authenticator = True,
controller_manager = True,
scheduler = True, ),
network = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec(
cni = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_cni.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_cni(
cni_ingress_rules = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_cni_cni_ingress_rules.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_cni_cniIngressRules(
description = '0',
from_port = 56,
protocol = '0',
to_port = 56, )
], ),
security_group_overrides = {
'key' : '0'
},
subnets = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_subnets.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_subnets(
availability_zone = '0',
cidr_block = '0',
id = '0',
is_public = True,
nat_gateway_id = '0',
route_table_id = '0',
tags = {
'key' : '0'
}, )
],
vpc = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_vpc.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_vpc(
availability_zone_selection = 'Ordered',
availability_zone_usage_limit = 1,
cidr_block = '0',
id = '0',
internet_gateway_id = '0', ), ),
oidc_identity_provider_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha4_aws_managed_control_plane_spec_oidc_identity_provider_config.io_x_k8s_cluster_controlplane_v1alpha4_AWSManagedControlPlane_spec_oidcIdentityProviderConfig(
                                client_id = '0', 
groups_claim = '0',
groups_prefix = '0',
identity_provider_config_name = '0',
issuer_url = '0',
required_claims = {
'key' : '0'
},
username_claim = '0',
username_prefix = '0', ),
region = '0',
role_additional_policies = [
'0'
],
role_name = '01',
secondary_cidr_block = '0',
ssh_key_name = '0',
token_method = 'iam-authenticator',
version = 'a', ),
status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_status.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_status(
conditions = [
kubernetes.client.models.io_x_k8s_cluster_addons_v1beta1_cluster_resource_set_status_conditions.io_x_k8s_cluster_addons_v1beta1_ClusterResourceSet_status_conditions(
last_transition_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
message = '0',
reason = '0',
severity = '0',
status = '0',
type = '0', )
],
external_managed_control_plane = True,
failure_domains = {
'key' : kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_failure_domains.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_failureDomains(
attributes = {
'key' : '0'
},
control_plane = True, )
},
failure_message = '0',
identity_provider_status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha4_aws_managed_control_plane_status_identity_provider_status.io_x_k8s_cluster_controlplane_v1alpha4_AWSManagedControlPlane_status_identityProviderStatus(
arn = '0', ),
initialized = True,
network_status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network(
api_server_elb = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb(
availability_zones = [
'0'
],
dns_name = '0',
health_checks = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb_health_checks.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb_healthChecks(
healthy_threshold = 56,
interval = 56,
target = '0',
timeout = 56,
unhealthy_threshold = 56, ),
listeners = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb_listeners.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb_listeners(
instance_port = 56,
instance_protocol = '0',
port = 56,
protocol = '0', )
],
name = '0',
scheme = '0',
security_group_ids = [
'0'
],
subnet_ids = [
'0'
], ),
security_groups = {
'key' : kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_security_groups.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_securityGroups(
id = '0',
ingress_rule = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_ingress_rule.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_ingressRule(
cidr_blocks = [
'0'
],
description = '0',
from_port = 56,
protocol = '0',
source_security_group_ids = [
'0'
],
to_port = 56, )
],
name = '0', )
}, ),
oidc_provider = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_oidc_provider.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_oidcProvider(
arn = '0',
trust_policy = '0', ),
ready = True, ), )
],
kind = '0',
metadata = kubernetes.client.models.v1/list_meta.v1.ListMeta(
                    _continue = '0', 
remaining_item_count = 56,
resource_version = '0',
self_link = '0', )
)
else :
return IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList(
items = [
kubernetes.client.models.io/x_k8s/cluster/controlplane/v1beta1/aws_managed_control_plane.io.x-k8s.cluster.controlplane.v1beta1.AWSManagedControlPlane(
api_version = '0',
kind = '0',
metadata = kubernetes.client.models.v1/object_meta_v2.v1.ObjectMeta_v2(
annotations = {
'key' : '0'
},
cluster_name = '0',
creation_timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
deletion_grace_period_seconds = 56,
deletion_timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
finalizers = [
'0'
],
generate_name = '0',
generation = 56,
labels = {
'key' : '0'
},
managed_fields = [
kubernetes.client.models.v1/managed_fields_entry.v1.ManagedFieldsEntry(
api_version = '0',
fields_type = '0',
fields_v1 = kubernetes.client.models.fields_v1.fieldsV1(),
manager = '0',
operation = '0',
time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
],
name = '0',
namespace = '0',
owner_references = [
kubernetes.client.models.v1/owner_reference_v2.v1.OwnerReference_v2(
api_version = '0',
block_owner_deletion = True,
controller = True,
kind = '0',
name = '0',
uid = '0', )
],
resource_version = '0',
self_link = '0',
uid = '0', ),
spec = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec(
additional_tags = {
'key' : '0'
},
addons = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_addons.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_addons(
conflict_resolution = 'overwrite',
name = '01',
service_account_role_arn = '0',
version = '0', )
],
associate_oidc_provider = True,
bastion = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_bastion.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_bastion(
allowed_cidr_blocks = [
'0'
],
ami = '0',
disable_ingress_rules = True,
enabled = True,
instance_type = '0', ),
control_plane_endpoint = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_control_plane_endpoint.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_controlPlaneEndpoint(
host = '0',
port = 56, ),
disable_vpccni = True,
eks_cluster_name = '0',
encryption_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_encryption_config.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_encryptionConfig(
provider = '0',
resources = [
'0'
], ),
endpoint_access = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_endpoint_access.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_endpointAccess(
private = True,
public = True,
public_cid_rs = [
'0'
], ),
iam_authenticator_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig(
map_roles = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config_map_roles.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig_mapRoles(
groups = [
'0'
],
rolearn = '0123456789101112131415161718192021222324252627282930',
username = '0', )
],
map_users = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_spec_iam_authenticator_config_map_users.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_spec_iamAuthenticatorConfig_mapUsers(
groups = [
'0'
],
userarn = '0123456789101112131415161718192021222324252627282930',
username = '0', )
], ),
identity_ref = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_identity_ref.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_identityRef(
kind = 'AWSClusterControllerIdentity',
name = '0', ),
image_lookup_base_os = '0',
image_lookup_format = '0',
image_lookup_org = '0',
logging = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_logging.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_logging(
api_server = True,
audit = True,
authenticator = True,
controller_manager = True,
scheduler = True, ),
network = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec(
cni = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_cni.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_cni(
cni_ingress_rules = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_cni_cni_ingress_rules.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_cni_cniIngressRules(
description = '0',
from_port = 56,
protocol = '0',
to_port = 56, )
], ),
security_group_overrides = {
'key' : '0'
},
subnets = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_subnets.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_subnets(
availability_zone = '0',
cidr_block = '0',
id = '0',
is_public = True,
nat_gateway_id = '0',
route_table_id = '0',
tags = {
'key' : '0'
}, )
],
vpc = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_spec_network_spec_vpc.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_spec_networkSpec_vpc(
availability_zone_selection = 'Ordered',
availability_zone_usage_limit = 1,
cidr_block = '0',
id = '0',
internet_gateway_id = '0', ), ),
oidc_identity_provider_config = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha4_aws_managed_control_plane_spec_oidc_identity_provider_config.io_x_k8s_cluster_controlplane_v1alpha4_AWSManagedControlPlane_spec_oidcIdentityProviderConfig(
                                client_id = '0', 
groups_claim = '0',
groups_prefix = '0',
identity_provider_config_name = '0',
issuer_url = '0',
required_claims = {
'key' : '0'
},
username_claim = '0',
username_prefix = '0', ),
region = '0',
role_additional_policies = [
'0'
],
role_name = '01',
secondary_cidr_block = '0',
ssh_key_name = '0',
token_method = 'iam-authenticator',
version = 'a', ),
status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1beta1_aws_managed_control_plane_status.io_x_k8s_cluster_controlplane_v1beta1_AWSManagedControlPlane_status(
conditions = [
kubernetes.client.models.io_x_k8s_cluster_addons_v1beta1_cluster_resource_set_status_conditions.io_x_k8s_cluster_addons_v1beta1_ClusterResourceSet_status_conditions(
last_transition_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
message = '0',
reason = '0',
severity = '0',
status = '0',
type = '0', )
],
external_managed_control_plane = True,
failure_domains = {
'key' : kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_failure_domains.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_failureDomains(
attributes = {
'key' : '0'
},
control_plane = True, )
},
failure_message = '0',
identity_provider_status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha4_aws_managed_control_plane_status_identity_provider_status.io_x_k8s_cluster_controlplane_v1alpha4_AWSManagedControlPlane_status_identityProviderStatus(
arn = '0', ),
initialized = True,
network_status = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network(
api_server_elb = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb(
availability_zones = [
'0'
],
dns_name = '0',
health_checks = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb_health_checks.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb_healthChecks(
healthy_threshold = 56,
interval = 56,
target = '0',
timeout = 56,
unhealthy_threshold = 56, ),
listeners = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_api_server_elb_listeners.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_apiServerElb_listeners(
instance_port = 56,
instance_protocol = '0',
port = 56,
protocol = '0', )
],
name = '0',
scheme = '0',
security_group_ids = [
'0'
],
subnet_ids = [
'0'
], ),
security_groups = {
'key' : kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_security_groups.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_securityGroups(
id = '0',
ingress_rule = [
kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_network_ingress_rule.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_network_ingressRule(
cidr_blocks = [
'0'
],
description = '0',
from_port = 56,
protocol = '0',
source_security_group_ids = [
'0'
],
to_port = 56, )
],
name = '0', )
}, ),
oidc_provider = kubernetes.client.models.io_x_k8s_cluster_controlplane_v1alpha3_aws_managed_control_plane_status_oidc_provider.io_x_k8s_cluster_controlplane_v1alpha3_AWSManagedControlPlane_status_oidcProvider(
arn = '0',
trust_policy = '0', ),
ready = True, ), )
],
)
def testIoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList(self):
"""Test IoXK8sClusterControlplaneV1beta1AWSManagedControlPlaneList"""
inst_req_only = self.make_instance(include_optional=False)
inst_req_and_optional = self.make_instance(include_optional=True)
if __name__ == '__main__':
unittest.main()
| 73.177264 | 289 | 0.476211 | 2,771 | 37,979 | 5.980512 | 0.104294 | 0.020999 | 0.041999 | 0.090997 | 0.924572 | 0.92083 | 0.919322 | 0.917089 | 0.917089 | 0.917089 | 0 | 0.049948 | 0.470734 | 37,979 | 518 | 290 | 73.318533 | 0.774489 | 0.00524 | 0 | 0.92449 | 1 | 0 | 0.024744 | 0.007108 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.004082 | 0.012245 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
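The generated test above builds the model twice: once with every optional field populated and once with only the required fields. A hand-written sketch of the same pattern for a hypothetical model (the Widget class is illustrative, not part of the kubernetes client):

import unittest


class Widget:
    """Toy model: `items` is required, `kind` is optional."""
    def __init__(self, items, kind=None):
        if items is None:
            raise ValueError("items is required")
        self.items, self.kind = items, kind


class TestWidget(unittest.TestCase):
    def make_instance(self, include_optional):
        if include_optional:
            return Widget(items=[1, 2], kind="Widget")  # required + optional
        return Widget(items=[1, 2])                     # required only

    def test_widget(self):
        self.make_instance(include_optional=False)
        self.make_instance(include_optional=True)


if __name__ == '__main__':
    unittest.main()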
7344b53a8e5678288ba6650de6990741c5e6de34 | 30,887 | py | Python | SelfUnet_model.py | Rukhmini/ADGAN-Self-attention-U-Net | 0450094ef479f5e33755c5d5497c07235f5a9cc4 | [
"MIT"
] | null | null | null | SelfUnet_model.py | Rukhmini/ADGAN-Self-attention-U-Net | 0450094ef479f5e33755c5d5497c07235f5a9cc4 | [
"MIT"
] | null | null | null | SelfUnet_model.py | Rukhmini/ADGAN-Self-attention-U-Net | 0450094ef479f5e33755c5d5497c07235f5a9cc4 | [
"MIT"
] | null | null | null | __pyarmor__(__name__, __file__, b'\x50\x59\x41\x52\x4d\x4f\x52\x00\x00\x03\x06\x00\x33\x0d\x0d\x0a\x09\x2e\xa0\x01\x00\x00\x00\x00\x01\x00\x00\x00\x40\x00\x00\x00\xe0\x1d\x00\x00\x00\x00\x00\x00\x70\x3e\xea\x17\xea\x01\xeb\x39\x5a\x1d\xe7\x08\x2d\x0f\xbb\x7a\x00\x00\x00\x00\x00\x00\x00\x00\x39\xba\x51\x93\xd5\x3e\x59\x76\xdb\xab\xc2\xe7\x6c\x5d\xbc\x81\x83\x9f\x29\x20\x51\xd1\x19\x60\x44\xd9\xc4\x9d\xc3\x11\x6b\xbc\x30\x1c\xcd\xe7\x54\x16\x2f\x00\xc8\x59\x8f\x5a\x1d\xc6\x9c\x23\x0f\x9f\x27\xad\x8a\x35\x75\xeb\xa9\x09\x8a\x30\x8f\xf0\x12\x73\x76\xa1\xc6\x91\x28\x4d\x8c\x32\x8a\x9a\xc8\x5b\x5b\xdf\x71\x41\xd1\x18\x97\x16\x2d\x0e\x2e\x10\x57\xf4\x9d\x68\xab\x62\x57\xa1\xc2\xd8\x02\x8d\x45\x6e\x08\xbc\x15\x64\xf7\x2b\xaf\x6f\x1c\x34\xb8\x45\x57\x7f\x7f\x09\x23\xa5\xa9\x1d\xe4\x2d\x07\xe1\xa7\x46\x2b\x5f\x23\x2e\x2b\x8b\xc4\xc0\x86\xea\x1f\x1b\xbf\x4f\x37\x6f\xef\xf3\x51\x81\x6e\xa2\x33\x42\xb0\xd3\xf0\xab\x62\x21\x9c\xa2\x27\x98\x42\xdc\x0f\xe7\x96\x51\x67\x1f\x24\xbb\x96\xf9\x3f\xe5\xee\x7a\x53\x2a\x81\x1a\x2d\xcd\xa1\x65\xbc\xe8\x95\xb3\xd3\xcf\xb6\x58\x18\xb7\x27\x4f\x90\x8e\x22\xcb\xe0\x92\xc7\xbc\xf5\xad\xec\xed\xe3\xa1\x8e\xab\x31\x97\xbd\xd6\x49\xdc\x2b\xe4\x3f\x8f\x8a\xa8\xdd\xda\xcd\x60\xec\x1b\xf6\x49\x58\x0f\x69\xd0\xc5\x93\xeb\xda\x2d\xe7\xfd\xd9\xae\xae\xac\xcd\x47\x9c\x19\xa6\xef\x0a\x8a\x50\x63\x38\xdd\x98\xd7\x7f\x0f\xdc\x90\xd5\x02\x42\x8e\x13\x32\xb7\x32\x4f\x11\xe3\x9a\x3a\x37\x7e\x0b\xfa\x38\xa0\xb7\xca\xbd\x82\x2f\x53\x80\xfd\x70\x10\x4c\x38\x75\x25\x5a\x7a\x06\x27\x13\xd0\x16\xf0\xa9\x3b\x6d\x78\x4f\x9f\x28\xe5\x83\xeb\x7a\xf0\x02\x46\xd4\xda\xb8\xa1\xb6\xab\xc9\x44\x3d\xc7\x7e\x85\x2b\xec\x97\x85\x88\x16\x55\xf8\xf4\x5b\x43\x89\x1f\xaa\x41\x34\xe1\xcf\x20\x00\xad\x8f\xa1\x92\x71\xa2\x7f\x04\x3b\x6d\xd5\x42\x09\x2b\xfe\x47\x66\xf2\xfa\xe7\x29\xa1\x8a\x9a\xf6\x56\x72\xbc\xbe\xd3\x93\x5f\x0b\x94\x4e\x31\x6f\xbd\x5e\xcc\x04\xf1\x14\xda\x2f\x54\xe9\x1e\xa1\x5c\x25\x22\x5a\x9e\xc1\x56\x5a\x09\xd8\xde\x8b\x5a\x8f\x22\x6e\xeb\xec\xa3\x73\x54\xd2\x07\x2a\xca\xa9\x83\x80\xf7\xb4\x15\x2d\x6d\x57\x13\x00\x19\x14\x5a\x87\xaf\x11\xcd\xbb\x17\x91\xcf\x29\x52\xc5\xb3\x37\xc9\x05\xcc\x58\xea\xe5\x42\x6e\xfa\x86\xa5\x7b\x66\x76\x90\x6d\x74\x6c\x3a\xdc\xbc\x35\x11\xb5\x28\x85\x6e\x5c\x8d\xa3\x5d\x0b\xf0\x44\xc1\xb1\xe2\x9d\x25\x2c\xb1\x90\x77\xb2\x18\x90\xa7\xc5\x45\xa8\xd7\x3d\x41\x68\xd6\x45\x61\xb2\x58\xfb\xd0\x86\x44\x78\x26\xd8\x2e\x5b\x50\x90\x93\xf1\x09\xcf\x5d\x5a\xdf\x0e\x62\x3e\xf2\xa6\x49\x87\x6e\x5f\xda\x65\xd1\x16\x86\xc0\xc4\x75\x5f\x4b\xbb\x71\x31\x94\x09\x28\xc6\xfb\xd0\x34\xc0\x1f\x2d\x7d\x03\xf9\x47\xa3\x7d\x2d\x47\x29\xeb\xa6\xdd\x6f\x91\xcd\xdb\xf0\xa6\xc6\x68\xed\xf8\xcd\x11\xcb\x60\x92\xee\x69\xac\x9e\x3b\x54\x78\x74\x2d\x1e\x63\xff\xeb\x2b\x46\xe8\xb8\x93\xe8\xab\x87\xa3\xf3\xb8\xc5\x7f\x0b\x39\x2e\xb2\xd8\x07\x45\x03\x05\xc1\xcf\x40\x39\x55\x16\xe2\x6a\x66\x2a\xa2\xb6\x48\xc2\xa4\xe5\x9a\xc8\x4c\x24\x73\x69\x68\x93\x0f\x55\xdb\x03\xf0\x85\xdc\xef\xe4\x14\x04\x63\x1b\x85\x73\xe2\xab\xb1\x7c\x01\xcb\xe1\xc3\x4b\xfe\x87\x8b\x36\x45\x9b\x7c\xfc\x0e\x2d\x10\xf1\x60\x87\x14\x19\x13\x03\x6b\x33\xb0\x5e\xb6\x3b\x06\xd5\x1f\x5e\xe1\x99\xce\xec\x84\x99\x22\xb7\x2f\xf2\x21\xfc\xe5\x99\xfc\x15\xd7\x27\xc7\x74\x65\xd1\x61\xbc\x27\xcf\x05\x27\x38\x31\x9c\x87\xe7\x90\x9a\xbe\x21\x9e\xc4\xf3\x62\x2b\x4f\x74\x2a\xa7\x67\x48\x01\x4f\x3a\xee\x8d\x11\xa7\x10\x6b\x26\x99\xc0\x84\xbd\x7c\x7a\xa9\x60\xb8\xc5\x4b\xb2\xc4\x59\xc0\x4d\x7e\xf6\x99\x5d\xee\xeb\xae\xe8\x01\x60\xa8\x95\x81\xf1\xd3\x2e\xf3\xa9\x8d\xe8\xe8\xe1\xdb\xb1\x00\xeb\xa7\x1b\x4b\xcb\xb3\x73\xda\x4a
\x4b\xf2\x1c\xfb\x0b\x42\x82\x16\x5a\x03\x93\x03\xf9\x81\xd9\x09\xd4\x14\xad\x78\xdb\xaf\x3e\x6c\x7a\xfd\xc1\x90\xec\xff\x40\x56\x8b\x59\x3a\xe5\xed\x29\xf0\x17\x9f\x4d\x5d\x5e\xc9\x6d\x0b\xe7\x43\xd8\x0b\x93\x7b\xd7\xc2\x48\x20\x59\x31\x81\x1e\x6c\x27\xe6\xc2\x87\x4e\x0d\xb4\xf9\x0c\x92\xfe\x57\xdb\xe6\xf2\x7a\x12\xda\xcd\xaa\x07\x48\xb5\x50\x06\x1d\x02\x27\xd6\x32\xfb\x75\xbb\xb6\xf5\x52\xe1\xeb\xd3\x18\x31\xef\xdf\x67\x01\x8a\xcd\x69\x95\x66\x92\x8f\xe9\x92\xb2\xd9\x92\xd1\xfb\xdf\x4e\x01\xb6\x8e\xd5\x89\x58\x48\x7b\x97\x8e\x58\x0b\x25\x07\x2d\x95\xe9\xea\xa5\x65\x91\x8e\x6b\x6d\xa9\x2b\xd2\xfa\x94\x0b\x08\x4a\xa1\x1b\xe0\x59\xf8\xb2\x26\x40\x1e\x45\x74\xb6\xb9\x7f\x6f\x3d\xe3\xff\x35\x87\x54\x57\xe4\x98\x37\x40\x4c\xc1\x17\x4d\xda\xe0\x55\xb1\x8a\x6b\xe8\xff\x3b\xb7\x66\x5a\x02\xb9\x4e\x8f\x27\x48\x68\xe6\xb0\xb5\x73\x37\xd0\xfe\xee\x44\x64\x17\xa3\xd7\xeb\x17\x93\x98\xcc\x7e\x04\x7c\x6d\xf8\x7d\xc7\x86\x85\x69\x7f\xfa\x42\xb8\x81\xf9\x8f\xd0\x0f\x18\xf0\xca\x76\x9f\x2d\x26\xdd\x89\x4a\xd0\xd2\xe9\xa9\x23\xa9\x40\x3a\xc1\x0c\x07\x45\xa3\x89\x1a\x30\x86\x49\x16\xdc\x65\x3c\xd8\xbb\xfd\xcf\x1c\xba\x5c\xf2\x7f\xb6\xda\x19\x43\x58\x61\x5a\xd8\x56\xb2\x62\xd6\x31\x47\xde\x92\xf2\x32\xb4\x20\x50\x41\x55\x04\x5c\xd8\xf6\x4d\x3d\x44\x32\x9c\x29\x34\x35\xf5\x7a\x13\x25\xec\x43\x62\x83\x94\x03\x70\x31\xc8\x1e\xaf\xae\xeb\x67\x85\x06\xaa\xf5\xb5\x62\xeb\x80\x00\x88\x95\xf4\xde\xcb\xdb\x26\x59\x01\x97\x39\x44\x19\xa0\x05\x0a\x88\x1b\xd6\x25\xde\x6b\x79\xa6\x49\x03\x17\x8a\xb1\x6a\x58\x12\xed\x5b\xa5\x68\x11\x00\xa7\x95\xb8\x97\x79\xf9\x2d\xc8\x06\xb7\x84\x44\xb6\xb5\xbe\x40\xd4\xc3\xf9\xa1\x0e\xea\xb2\x5b\x5c\xa9\xe2\x4d\x9f\x2c\x98\xc3\xb3\xab\x4a\x47\xe8\x70\x78\xa4\x56\x86\xbc\xdc\x4e\xfd\xf8\x1b\x78\x2c\x84\x3d\x86\xbc\xe1\xb0\x8d\x90\xe1\x26\x17\x86\x02\xfc\x40\x39\x58\xac\x50\xb3\x2d\x02\xef\x5f\xba\xcf\xca\xc1\x97\xb3\x9c\xf6\xa0\xd3\x93\xf1\x14\x5e\x88\xc0\x48\x64\x88\xf7\x30\x4c\x52\xdf\x88\x30\x19\x61\x77\x58\x03\xa8\x61\x0e\x70\x74\x6e\x9f\x47\x90\x4f\x05\xee\x37\x6e\xaf\xb5\xe5\xa6\xf7\x20\x3a\xaf\xc4\x79\xca\x81\x75\xb4\x1d\x34\x93\x7f\xa4\xb8\x56\x2c\xde\xcc\xa4\x3f\xa3\x60\xd6\x0d\xe8\x06\xc9\x9b\x85\xbf\xaf\x11\xb9\x90\xfd\x91\xf7\xf1\x82\x63\xe0\xee\xf1\x6b\x5c\x77\x29\xaf\xb4\xbb\x62\x0d\x7c\xc5\x8c\x2a\x2c\x69\x3b\x73\x28\xaf\xb9\x93\x22\xc6\x80\x86\x90\xc5\xdc\x57\x09\xc4\xca\x63\xce\xe4\x66\x70\x03\x56\x2b\x61\xda\x33\xd5\xe2\xdd\xe4\xd5\x0e\x2c\x6c\x21\xcf\xe4\xd1\xa0\x2e\x0e\x84\x63\xdb\xac\x01\x6d\x3e\xf1\xb9\x2e\x52\x65\xd4\x67\x89\x0d\xc0\xd8\x5a\xb5\xba\x4d\x12\x94\x72\x8c\x0f\xe5\x27\x4f\x97\x1b\x85\x6f\xc5\xef\x2d\x00\xdf\x58\x61\xea\x5d\xb2\xf0\x5a\xdf\xd4\x1e\xea\x12\xfc\x60\xaf\x77\x41\x0a\x6a\xe6\x82\x64\x47\x12\x6e\x62\x8c\x68\x26\x1d\x1e\x39\xd7\x75\xef\xa0\xde\xe7\x07\x36\x70\x32\x60\x95\xe5\x33\xfa\xce\x21\xbb\xa4\x87\x2c\x6e\xb1\xd2\x70\x27\xdb\x16\xa3\x03\xe5\xbc\x36\x1c\xd1\xf0\x3c\x15\xa9\x3a\x58\xe7\x67\x06\xf9\x60\x24\x12\x64\xa8\x37\x3d\x44\x15\x77\x2c\xfc\x89\x8a\x8e\x4e\x3a\xad\xa1\x17\x67\x5e\x0f\xd3\xbb\x21\x5e\xfe\x00\x85\x81\x57\xa8\xe5\x94\x04\xbe\x40\x96\x26\x43\x09\xa3\x78\x02\x31\x0f\x41\xf5\xa3\xba\xfb\x65\x29\xcf\x79\x4c\xda\x9d\x6d\xc3\xac\x92\x75\x4e\x36\x7e\x61\x13\xd8\x4d\x89\x57\x11\xa2\xd0\xef\x46\x78\xf7\x26\xd7\xee\x7d\x8d\xcc\x00\x61\x3f\x69\xcd\xc0\xf6\xda\xc7\x1b\x86\xb2\x75\x4b\xf1\xb3\x0e\x84\x3a\x99\xc2\x5f\x8e\xcd\x94\x46\xda\x75\xac\x85\xd8\xa1\x1b\xb8\xc8\x78\x70\xa8\x4a\x98\x8b\x9c\xb0\x9a\x12\xfa\xad\x6c\x0c\x02\x3b\xa1\xc7\x00\x66\xf0\xf0\x5f\x33\xfe\x20\x84\x63\xe0\xec\x71\x8c\x5c\x31\x21\x1f\xbd\xfb\x95\x89\x8e\xec\xe6\x6
0\x91\x5d\x97\x54\x40\x19\xbf\xc2\x29\x23\xc2\xf4\x3e\x1f\x3b\x0b\x6c\xd7\x1b\xdb\xfe\xc4\xc6\xd3\x69\x58\x71\x18\xa8\x6c\x47\x85\xfc\x9c\xca\xa5\xad\x7e\x9d\x02\x88\x6c\x4a\xe2\x7d\x5d\x56\xe6\x3b\xe5\x5b\x56\xfd\x62\xb9\x01\x30\x03\x94\x86\x7a\x15\x1d\x69\x9e\xaa\x46\xa1\x1c\x44\xba\xef\x3d\x53\x2f\xc5\x62\x49\x54\xb3\xcc\xb7\x11\xce\xa2\xe2\xdc\x93\xdc\x2d\xe9\x57\x80\xee\x27\xd9\x49\xb7\xed\xbf\x12\xba\xa3\x89\x97\x01\x9e\x84\x81\x8c\xd5\x5c\x78\xf5\x5e\x51\x9e\x83\x10\x35\x3d\xdb\xb2\x5a\x4a\x57\xe8\x29\x7c\x9d\x1f\x6b\xcf\x9b\x6b\xb0\x59\xb7\xaf\xef\x8c\xa1\xc1\x91\x63\x7a\x80\x19\x1c\xc9\x41\x16\x83\x8b\xd4\x99\xbf\x14\xaa\x4b\x5b\x1b\xc5\xb3\x8f\x74\x10\xbc\xbc\x66\xcb\xdc\x19\x73\xfd\x4a\x1e\xbf\x7e\xe9\x47\xa4\xfa\xc5\xd4\x81\xb7\xe6\x47\xc6\xe0\xcd\x28\x2d\x02\xfb\xe2\x93\xf1\x7e\x82\x1b\x0f\x43\xa9\xfb\x57\xb2\x82\xbb\x4b\x85\xc4\x8a\x66\xdd\x61\x2e\xb9\x96\x00\x2f\x1e\x5a\xde\xb1\x86\x22\x2d\x45\x42\x09\x29\x7a\xb5\xb1\xb6\x1a\x13\x95\xe9\x4d\x62\xe9\x88\x61\x74\x60\x7a\x0b\x9e\x03\x52\xe5\xbc\x44\x47\x8f\x69\xb2\xa1\xb1\x57\x53\xdc\x8b\xec\xa3\x99\xca\x8f\x0e\xaa\x9c\x94\xbe\x9c\x35\x53\x69\xc7\x9f\x04\x73\x7c\xd3\x80\x00\xda\x3c\xd7\xd7\x31\x66\x04\xdd\xb8\x5f\x8a\xd7\xcd\x06\x72\x73\x6a\x7b\x4b\x8a\x58\xb1\x2d\x2c\xc3\xfd\xbd\x32\xfd\xfb\x80\x5f\x23\x88\x53\xcc\x30\xf1\x71\x64\xeb\xea\x87\x59\xcb\x6a\x1c\xc9\xdb\x0c\xde\xba\x55\x5f\x1b\x6f\x74\x54\x97\x6d\x0b\xed\xe5\xab\xb7\xcb\xf4\x67\xef\xb1\x22\x2f\x41\x00\x24\x8b\xdb\x74\xaf\x52\xbf\x9c\xd4\xf1\x27\xcf\x1f\x7b\x25\xef\x01\x20\x84\xce\xa5\x09\xd5\x78\x44\x45\x02\x4c\x6d\xa9\x9b\x11\x66\x8f\x9d\xea\x1c\x11\x26\x81\xa6\x53\x18\xa2\x82\xba\x18\xe3\x67\xe4\x02\x9d\x3d\x21\xcb\xbb\xc6\xbf\x3a\xb1\x70\x67\x24\x8b\x50\x4e\x79\x9d\x08\x92\x19\xae\x21\xf7\xd1\x85\xca\x40\xb6\xc9\x9a\x01\xe8\x64\x0f\x65\x7b\x25\xdc\x96\x7a\xef\x89\xb9\x13\xe9\x93\x2a\xca\xca\x56\x11\xb1\x78\x31\x7d\xd2\xac\x90\x13\x71\x34\xb6\x72\x6e\x9e\xb6\x0d\x32\x03\x67\x73\x81\x45\xa9\x93\xc4\x70\x2f\xb2\x1b\x94\x7a\x80\x75\x38\xf5\x63\xba\xf1\xc1\x92\x84\x38\x58\x08\x05\x8c\x92\x72\xa3\x26\x6c\x29\x6b\x44\x75\x8f\xc2\x2a\x26\x4a\xa5\xd9\x0c\xdb\xf3\xb6\x5f\x46\x3b\x9a\x45\xee\x64\x1c\x29\x85\xba\x4e\xff\x42\xcd\xbb\x41\x15\xc0\xdf\x2f\xcb\xf9\xcb\x87\xff\x9d\x01\xbc\x36\x01\xce\xd7\x76\x37\x66\xf4\x35\xb4\x67\x3e\xf1\x48\x9e\x0b\x8f\x48\x4f\x7b\x8a\xb7\xe7\x49\xee\xe9\x12\x8e\x95\xf2\x91\x5f\xf2\xd7\xe0\xd8\x57\xbc\xda\xab\x0f\x7c\x50\x3e\x3f\xdb\x0d\xa7\x2a\xe1\xb0\xb3\xaf\xc1\x18\x4e\xfa\xfe\x29\x63\xcf\xa5\xb1\xf3\x59\xde\x67\xae\x19\xae\xd9\xa2\x1d\x57\x8d\x9b\x00\x6f\xa3\x43\x94\xac\x03\xfb\xf7\x72\xaa\xa5\xf2\x3b\x3c\x91\x5d\x06\x59\x15\x26\x1b\xe7\x00\xa8\x9b\xb6\x7c\x68\x26\x59\xd5\x13\xcc\x7d\x3c\x74\xa4\x27\x87\xc1\x19\xe7\x03\xfe\x45\x2a\x1d\xf4\xc4\xf7\x36\xad\xa1\xa0\x73\xa7\x81\xca\x70\x70\x73\x6f\x54\x9b\xb4\x4f\xf6\x1e\x54\xf5\xbf\x20\xdc\x91\xb5\xed\x10\x91\x07\xe7\x35\xe3\xbb\xb3\xe7\xfa\x87\x04\x16\xef\x5a\xa9\x7d\x50\x63\xab\x41\xc3\xa3\x6a\x55\x8c\x94\xdb\x11\x92\x9a\x95\xd3\x1f\x1b\xfb\xab\x39\xa0\x3b\xe1\x64\x97\xf6\xfb\x0c\x7c\x61\x47\x55\xfb\x04\x17\x9c\x23\x26\xea\x9a\xec\xc2\x7c\x9e\x87\x51\xfa\xa2\xdf\xfa\x2d\x61\x72\x8c\x1d\x82\xdd\x97\x0b\x5b\x44\x81\xb7\x21\x10\x75\x39\xc0\xc5\x87\xf0\x35\x80\xb6\xde\x99\x91\x52\x91\xab\xfd\x64\x02\xb3\xbb\x5c\x09\x2f\x91\x4d\x62\xad\xf1\x51\xa9\x35\xf2\xf5\x79\x3d\x09\xcf\x4f\x9c\xbd\x98\xe9\xbd\x20\xfe\xd9\x33\x44\x36\xbb\xd9\x47\x75\x42\x1e\x72\x63\x5b\x54\x70\x0c\x39\xc3\x2d\x6c\xbf\xed\xee\x15\xee\x1a\x0d\x8e\x7b\x81\x25\xf0\x12\x05\x5d\xdd\x2b\x18\xe7\x7f\x82\xf7\xbd\x40\x53\x0e\x39\xaa\x
df\x1b\x3a\x04\xd6\x2a\x6b\xc4\x94\x38\x87\x81\x28\xe6\x97\x2b\xd3\x4e\xe6\x96\x07\x6f\x31\x28\xa8\x04\x00\x9f\x40\x75\xdf\xf5\x59\x62\x2c\x40\x7f\x60\xf7\xa2\xe2\x98\x62\xa2\xe7\x20\x78\xe5\x76\x78\xe0\x18\xda\x60\x8f\xc0\x0f\x57\xc4\xfd\x86\xd1\x8f\xe4\xd1\x03\x1f\xf6\x85\xb1\x7f\x52\x8f\x06\x89\xfc\x83\x6f\xe9\x23\xa7\xb5\x2b\xae\xf9\xf9\xe4\x5b\x7d\x14\x1e\x33\x0f\x84\x70\x82\x2f\x42\xb5\xb8\x1e\x60\xf0\xb0\x0c\xa7\xb2\x7d\x08\x60\x3a\x20\x7e\x20\xb1\x89\x59\x5f\x04\x58\x21\x33\x05\x42\x25\xd4\xa8\xf7\xf2\xbb\x68\x4d\x36\x67\xf7\x4e\x41\xf9\x25\x5b\x6b\xdf\x04\xf8\xc7\xab\x63\xbc\xaa\x93\xa3\x1f\x4c\x7c\xa1\xd6\xdd\x2f\x6f\x74\x2f\x2d\x92\x50\xf5\x10\xd1\x36\x76\x7f\xe7\xbf\x40\x51\xcb\x24\xf4\x00\xd8\xd6\x7e\xda\xe6\x01\x71\x5d\x5d\xa9\x0a\xa4\xfb\x82\x51\x43\x69\x15\xc3\x08\x9d\x05\x4b\x13\xf2\xa4\x28\x9c\xa5\xc1\x40\xd0\xd4\x99\x6e\xde\x73\xa7\x5c\x59\x59\x96\x44\x6e\x15\xc8\xc6\xc4\xf6\x02\xf6\xcc\xe1\xe1\xe6\x69\x73\x3a\xe9\xcd\x3e\x92\x5a\x74\x96\x61\x9e\xbc\xe9\xd2\x4f\xba\x4d\x1d\x0c\x60\x89\xaf\xfa\x5e\xdc\x25\x2a\xc5\x54\xdc\x1b\x30\x41\xa5\x78\xc2\x01\x49\x0e\x5f\x9a\xb4\xc6\x77\x77\x50\x02\x0d\x3c\x1b\x56\xf8\x28\x5c\x99\xbb\xe3\x59\xf2\x43\x75\xe7\xb1\xf3\xcd\xbc\x8b\xfd\x6e\xb1\x17\x90\x97\x02\x32\x0f\x67\x6d\x43\x0d\xf3\x80\x79\xe0\xa0\xeb\x48\x11\x25\xb9\x37\xf2\x12\xb7\xe6\xe4\x56\xef\x24\xf5\x10\xf0\xe2\x3e\x1b\x18\x61\xd7\xbf\x25\xe7\x80\x99\xd1\xf0\x5e\xeb\x5f\x08\xee\x1b\x7d\x85\x0d\xb5\xe8\x67\xc5\x4b\x5c\x6f\x9c\xae\x42\x0b\x49\xa5\x08\x65\x05\x48\xec\xa5\xc7\x9c\x71\x04\xc8\x6c\x24\x45\x76\x24\xfa\xcd\x16\x42\xfb\x0a\xa0\x51\xf1\x15\xa7\x14\xa3\x0b\x1e\x9a\x89\x1e\xa1\xeb\x4a\xba\x33\x84\x28\x88\x74\x3e\x45\xf6\xcd\x3b\x0c\xff\x2a\xcf\xdd\xb8\x9a\xd0\x83\x94\xb3\x17\xf7\xb0\xd0\x0d\x6a\xd8\xe1\x11\x61\x05\x39\x46\x50\xb5\x49\x04\x09\x1e\x60\x57\x11\x5d\x42\x08\x7c\x9d\x26\xa2\x18\x9e\x43\x56\x38\x2a\x7f\x0e\x31\x54\x35\x4e\x1f\x29\xb4\xd0\x43\x65\x72\x93\x3f\x75\xfd\xef\xdc\x19\x9d\x97\xf7\xd6\x60\x00\xd5\x85\x88\x7d\x6e\xeb\x4f\x1e\x20\x06\xe3\x93\x11\x75\x60\x0d\x08\x26\xd5\x08\xa3\x5c\x6a\x02\xb3\x4a\xd7\xa9\x2d\xb4\x3a\x25\x4c\x50\x3a\x5a\xe0\x93\xa6\x1b\x98\x70\x84\x14\x40\x74\xfa\x06\xbb\x15\x9f\x0d\x60\xb6\xf8\x6c\xdd\x17\xba\x94\xaf\xec\xad\xe6\xa7\x70\xe5\xd9\xb6\xf2\x4e\x92\xe5\x16\xab\x56\x08\xfe\x8f\xed\xa6\xcd\x7b\xf8\x26\x38\xb5\xae\x56\x4a\x6d\x86\xb4\x7b\xda\xdf\x46\xd3\x85\xcb\x14\x58\x85\x32\x12\x30\x3e\x3e\x72\xd2\x57\x61\x79\x72\x55\xe0\xcb\x1e\x83\x90\x80\xce\x0e\xb2\x71\x0c\xf3\x16\x8a\x0b\xc0\x42\x8b\xb3\xb5\x02\x1f\xa8\xee\x6d\x65\x80\xa4\x82\x35\x80\xcf\xc1\x18\x4f\xd7\x5f\xb7\x8c\x96\x00\x34\x7d\x01\xa5\xbb\xbc\x7f\x80\x41\x6f\x4e\x26\x18\x8b\x22\xcc\xa5\x91\xff\x0d\x47\x1a\xe1\x67\x78\x6f\x7f\x48\xba\x87\xbf\x3c\x39\xdc\xb1\xa1\x19\x9d\x38\xd4\x7a\x5f\x97\xaa\x04\x5e\x61\x15\x1d\xa8\xf4\xb0\xf5\x65\x6d\xa7\xef\xfd\x9c\xd2\xe6\x29\x17\x00\x16\x11\xf6\x91\x85\x44\x92\xf3\x5d\x0e\x14\xe0\x85\x86\xb8\x34\x51\xf8\xb8\x87\xab\x8e\xf1\x89\x39\xca\x71\xea\x62\xf5\x5c\x6b\x90\xfa\xd5\xb3\x2e\x46\x87\xbb\x4f\xed\xaf\x48\x72\x1f\xa7\x2b\xdb\x0c\x02\x81\x7e\x64\xd2\xbc\xd2\x33\x7a\x17\x70\xbc\xc6\x86\x1e\x85\xce\x7b\x1d\xa7\xf4\xf7\x44\x01\x43\xfc\x65\x90\x99\x45\xba\x23\x2e\x4e\xa3\x4a\x03\x48\xb1\xf5\xcc\xcb\x0e\xe9\xf0\x3f\x97\x16\xa6\x23\x37\x1c\x34\xbd\xcf\x0d\xcc\xa6\xd5\x04\x41\x44\xb7\x89\xd4\x90\xc4\x7c\xde\xb0\xad\x3a\x60\xf2\x76\x99\xbb\xdb\x93\x17\xed\x30\x1a\x4e\x69\xbe\x91\xb8\x47\x51\x71\x51\xef\xde\xe3\xa0\x45\x8c\x31\xcd\x7d\x62\x00\x3c\xf0\x49\x12\x3a\x07\xa0\xfa\xb1\xa7\xfd\x3f\x24\x20\x15\xe4\x8b\x29\x12\x6a\x0f\x70\x66\x7c\x4f\
xb9\x67\x48\x42\x70\x33\xa3\x50\x55\x84\x65\xa3\x81\xf9\xec\xab\x7a\xa2\xdd\x70\x00\xdd\x6a\xff\xb3\xda\xdd\xe5\x72\x2d\x37\xf3\xfb\x73\x38\x62\xcc\xd6\x84\x8c\xfb\x48\x51\x77\x99\x59\x07\x4e\xe8\xa0\x2d\x1b\xb0\x58\xf4\xe1\x9c\x13\x98\x12\x99\x69\xe7\xed\x9a\xec\xc8\x3c\x2b\xe1\x63\xaf\x88\x81\x05\x79\xad\x65\x2d\xab\x93\xa5\x57\x86\x61\xa8\x7a\x09\xf0\xe2\xba\xf9\xa7\xed\x6a\x69\x65\x24\x98\x11\x64\x81\x0a\x19\x89\x59\xa9\xc9\x02\x33\x54\x37\x40\x0d\x0a\x9f\xca\xc4\x24\x7a\x6c\x98\xff\x1d\x6e\x04\x76\x83\xde\x86\x1b\x40\x78\x5d\x11\xd6\xc8\x89\xc1\x52\x99\xc7\x0b\x1c\x52\x51\xa8\x6c\xb0\x6b\x50\xc3\x20\x52\x38\x12\x50\xec\x95\x1f\x07\x93\x4d\x2b\x6b\xba\xe2\xad\x83\x63\x2a\xaf\xc2\x17\x1f\x78\x17\xdc\xbe\x8a\xed\x10\xee\x0a\x91\xff\x3f\xc4\x01\x7e\x54\x01\xd6\x3e\x99\xf8\x02\xc7\xd6\xd2\xd8\xda\x0e\x9f\xe5\x95\x24\x81\x15\xab\x79\x4e\x4d\xa0\x17\x56\xf9\x9a\xe7\xa7\x6b\x5b\x22\x13\x30\xea\x94\xf2\x00\x5b\xcd\xf1\x33\x56\xe5\x6a\x71\x67\x6e\xcc\xd6\xe1\x65\x55\x29\xdc\x9c\x4a\xf0\x1f\x4d\x04\xdc\xdc\xb6\xf1\xd6\xf1\x46\x6d\x26\x87\xa8\xdd\x50\x0e\x66\x89\x77\x20\x27\x04\x15\x03\x8e\x6b\x8d\x9c\x1b\x14\x9a\x71\x4a\x05\x89\xda\x86\x07\x5b\xba\x95\x4c\x63\xc2\x76\x92\x0d\xfa\x32\x85\x8b\x23\x0f\x0d\x2a\x2f\x24\x67\x57\x5d\x8c\x89\x2f\x5f\xf3\x74\x52\x4a\xb6\xd6\xd4\xde\x95\x12\x7c\x30\x1e\x58\xc8\xdc\x2b\x59\x7f\xf9\xc0\x11\x9a\x96\xf6\x06\x45\x32\x22\xf8\x5d\x23\x98\x9b\xad\xe6\x43\xe4\x03\x16\x8d\x30\xb2\x67\x3b\xbc\x3f\x07\x42\xfa\x71\x9b\x61\x44\xbc\x52\xf2\x72\xac\x57\xf0\x75\xb6\xea\x1d\xaf\xa1\x7d\x87\x7e\xa7\xc9\x27\xae\x60\x7c\x71\x16\xaf\x02\xd1\x7a\xba\xc4\x58\x89\xac\xde\x5f\xf3\x9c\x1a\x0e\x3c\xb3\xf6\x7b\x32\x56\x46\xf4\xa7\xa2\x67\x20\x2e\xb5\x05\xf7\xc5\x94\x72\x90\x52\xdb\xdd\x6e\xff\x30\x9d\x7f\x17\x86\x25\x48\x61\xde\x79\x50\x9b\x0b\xc8\xdd\xb4\x9a\x76\x56\x8e\x36\xb8\x0c\x5e\xb4\x33\xbe\x8c\x5c\x48\x74\x10\x06\x6c\x0e\x48\x9a\x8c\x5c\x38\x3f\x2c\x38\xa5\xff\x39\xb8\xd5\x4b\x74\xc4\x27\xd8\x87\xcd\x84\x4a\x0d\x15\x55\xd9\x21\x45\xd3\xc5\x3b\x21\x84\x52\xc2\xad\x88\xc4\x49\x2c\x41\xe2\x70\x41\x78\x55\x0b\xc6\xbc\xda\xb1\x47\x2f\xfc\xcd\xe4\x6f\x5d\xe3\x0e\xb1\xaa\xa7\x48\x40\x3f\xcb\xef\xe3\xa2\x5b\x4e\x75\xc3\xb4\x1b\x9a\x11\x1f\x1c\x12\xe7\x29\xac\x98\xcd\xe1\x9c\x92\x3f\x53\xaf\x85\x4f\xec\x57\x95\x40\x54\xe1\xd7\x17\xf8\x6b\xf8\x43\x65\x50\x8b\x91\xdf\x36\xce\x83\x16\x4c\x90\x07\xf4\x60\xd8\x0a\xb8\xa4\x7f\xb7\xe6\x32\xb9\x84\x26\x5d\x2f\xb3\x3f\x8e\x34\x69\x6b\xee\x42\x5e\xd2\x15\xd7\x94\x02\x99\xd7\x97\xf8\x6e\xfd\x7a\x1b\xc3\xea\x0a\x05\x3a\xb1\xcd\x59\x60\x5a\xb0\x4c\x39\xef\x93\x8a\x2d\xf5\x7a\x3b\xa8\x93\x13\xf1\x05\x37\x0d\x85\xcb\x4f\xfb\x19\x2a\x62\x6b\x94\x23\xd9\x26\xd9\xfb\x37\x20\xc7\x6c\x05\xff\xcb\x20\x13\x61\x44\x4f\x8d\x85\x1a\x25\x20\xb8\x4f\x2a\xc5\xdf\xfb\x9a\x42\xe9\xf2\xe3\xc9\xef\x6c\x75\xf8\x6b\xf4\xfa\x4b\x71\xe4\x96\x1f\x45\x4b\xf7\xbd\xfb\x9b\x15\x49\xbb\xa6\x05\xbb\x2b\x69\x60\xaf\x85\xda\x6d\xfe\x6a\x7b\x42\xaf\x4b\x67\xbe\x3f\x59\xae\xc7\x05\x64\xa4\x1c\x5c\x0e\x09\xd7\xb3\x75\x1f\xd8\xe2\xe5\x31\xbb\xc7\xc5\x4d\xd9\x0b\xcb\x76\x87\x43\x42\x53\xe2\x86\x93\xfb\x27\x59\x1a\x1d\xb4\x95\xbe\x3a\xe7\x38\x1a\x36\x29\x6d\x63\x3d\x2f\xa4\xa9\xbd\x41\x4f\x52\x7e\x2b\x44\xc4\xe0\xcb\xf9\xf6\x9e\xb3\x2d\xa5\x6c\x2c\xb6\x21\x80\xaf\x1e\x63\x57\x2e\x68\x62\xcb\xfd\xc7\xa4\x46\x60\x88\xb8\x66\xc9\xb7\xa4\x1f\x48\xcc\xdd\xad\xc9\xbc\x56\x2a\x85\xd3\x29\x43\x2f\x70\x85\x63\xaf\xa7\xfc\xb0\xba\xb9\x84\x9e\x88\xed\xea\x73\x4f\x6e\xc9\xf8\x4f\xe5\x13\x5d\x95\xbc\x52\x4c\x9b\x63\x48\x65\x01\x61\x87\xf9\xc5\x64\x84\x73\xde\xc4\xff\x7f\x34\xd7\xdf\x17\xa0\xba\x90
\x11\xda\xff\x83\x5a\x18\x09\x35\xdb\x0f\x4a\xee\x63\x0d\x1b\x6e\x2b\x40\x31\xf3\x42\xe0\x1f\x15\x5c\x10\x5a\xca\x62\x23\x17\xe4\xd6\x9a\xf5\x67\xc7\x4c\x4f\x45\x05\xa4\x70\xaa\x63\x6c\xcc\xe0\x08\x47\x83\x1e\x35\x85\x0f\x6e\x9a\x6a\xfc\x11\x48\xdd\x26\xce\x24\x85\xb9\x14\x4e\x31\xd7\xc8\xd7\x94\xfd\x19\x8c\x7e\x8a\x13\x1e\x13\x48\xb1\x8d\x90\x21\x69\x92\x70\x68\xc1\xc6\xc6\x53\x61\x89\xef\x14\xee\x30\x8b\x35\x64\xd9\xdb\x19\x43\xf5\x5b\x6c\xc6\xd1\x83\x4a\xea\x5f\x94\x26\xc8\xe1\x95\x4a\xcf\x97\x4a\x0e\x49\xe6\x2d\xf4\x22\xd7\x94\xaf\xf7\x2c\xc9\x08\x2f\x20\x56\xb7\x72\x0e\x7e\x17\x81\x33\x93\x7b\x5e\xc4\x9d\x98\x1b\x0f\xb7\x16\x27\x79\xe9\x3f\x61\x01\xd4\x85\x5b\x27\xe2\x30\xbb\x6d\x8b\xe2\x8b\x10\xce\xe1\xa6\x86\xe6\x51\xe0\xb4\x01\xe0\x4a\xbd\x31\xe2\xfc\x19\x41\x9b\xb0\xd4\x3c\x9e\x67\xe4\x6f\xf7\x9d\x2e\xb7\x79\x9e\xfd\x4e\xd6\x1f\xd9\xf3\x2a\xc7\xc1\x3e\x21\x00\x8c\x43\xb2\x80\x16\x9f\xfd\x99\x8b\x62\x17\x0f\xd2\x4b\x88\x92\xbe\x7c\x09\x6f\xe1\x08\xb2\xca\xc9\xdd\x3b\x57\xd7\xb8\x95\xa6\x1b\x44\x15\xf9\xc1\xd3\xdd\x9b\x2d\x79\x5b\x4b\x79\xab\xd2\x02\x7c\x33\xa9\x73\x52\xdc\x68\x16\x72\x6c\xfc\x5c\x81\xa2\xd2\xa7\x75\xde\x9d\xcd\x26\x75\xcc\xcc\x37\x48\x93\x70\x3c\x51\xaf\xdd\xc6\xa2\x38\x37\xcb\x83\xd2\x68\xdf\x9c\xb7\x9b\xc5\xe2\xc4\xfb\xbd\x81\x42\x4c\x88\xa7\x1e\x36\x91\x29\x3e\x6a\x70\xd2\xb2\xdf\xb4\xdd\xfb\x34\x37\x76\x0a\x83\x6c\x87\x8d\xc1\xc6\xe2\xf7\x04\xbd\x06\x87\x80\x7e\x68\x7b\x3e\x3b\x2d\xe8\xe2\x32\x79\x37\xb3\x7f\x5b\xd8\x42\xf5\xf2\x3e\xaa\xbe\xd7\xc4\x76\x44\xb3\xbb\x28\x62\x23\x69\xcf\xda\xe3\x26\x93\x1f\x3f\xe8\x3e\xa7\x81\x27\x54\xd9\xd8\x91\x2c\x93\xe9\xe6\xa1\x1f\x81\x11\xb7\xac\x4e\x93\xbc\x77\xdf\x00\x9f\xf1\xfe\x81\x5c\x9d\x35\x37\x45\x89\x6b\x1a\xa6\x6f\x23\xdf\x89\x86\xac\xc0\xad\x0a\xa4\x5a\x4a\xb4\x3c\xb9\xc4\x71\x9d\xd0\xca\xf5\xc8\x41\x7b\xab\x13\xfc\x44\x7b\xec\xcb\xae\x54\x07\xf4\xa9\x77\xa2\x36\x27\xce\xdb\x8e\x0e\xcb\x6f\x17\xdf\xf6\x63\xfb\xf2\xbd\x4c\xae\x9b\x24\x32\x08\x24\x53\x22\xbb\x16\x4c\x32\x4c\x3c\x29\xb7\x64\x29\xf3\xc4\x9a\xfc\x53\x3a\xa2\x2b\xb5\x54\x94\xe9\x2e\x95\xfb\x53\x8c\xfb\x81\x61\x80\x93\x8c\xac\xa6\x4d\x4c\xa2\x32\x34\x70\x89\x38\xdd\xe1\x2e\x94\x1e\xaf\xea\xc0\xed\xf1\x77\x9f\x9e\x58\x8d\xca\x97\xed\xde\x7e\xb8\xe7\xf4\x5b\x16\x48\xe9\x60\xf8\xf2\x5e\xd0\x25\x8c\xa2\xbf\x3d\x0c\x4e\x1c\xbf\x83\xcc\xbf\x91\xda\x2e\x0f\x14\x97\x8a\x87\x57\x5c\xb8\x39\xf3\x55\x82\xce\xcf\x9f\xd7\xa3\x5c\xb0\x22\xf2\xca\x58\x86\xde\xd8\xb2\xd2\x1c\x15\xfd\x22\xb2\x78\xe5\x88\xcf\x7d\x24\xd2\xe3\x8c\x0c\x4d\x69\xe7\x9b\x26\x22\x02\x26\xff\xcc\xae\x46\x7a\x31\x60\xa3\xf3\x58\x03\x2d\xef\x33\xd5\x1f\xbb\x22\x14\xa9\x4b\x19\x32\xca\xaf\xb8\xd0\x4f\xb3\x44\xd8\x02\xd6\xd6\x63\x8c\x20\xa2\xa1\x49\x78\xdd\x58\x47\xa5\x36\x9c\x51\xc5\x5a\xe7\xb8\x23\xcc\xda\x2a\xd6\xb0\x38\xca\x80\x34\xa2\x95\x94\x63\x0f\x0b\xe8\xb6\x53\xd3\x4b\x4a\xdb\x0c\x21\xf5\x47\xb2\xa3\x7e\x49\x70\x0a\x0e\x6a\xbc\x9a\x64\x2c\x83\xc4\xbf\xdf\x93\x68\x77\xc8\xf1\x34\x8e\xd4\x83\xc5\x69\xce\xd2\x77\x01\x0f\x1b\xca\xf4\xab\x70\xe9\x6e\xad\xd7\xdc\x91\xc1\x61\x68\x1e\xc3\xc0\x86\x42\x97\xa8\xb2\x54\x0e\x85\x51\xff\x0a\x9a\xd6\x41\x7f\x2a\xef\x0c\x81\xdc\xb2\xe4\xcc\x08\x83\xa4\x93\xbb\x31\xea\x2e\x8b\x85\x79\x94\xea\x91\xc5\xee\x5d\xea\xb0\xd4\xa1\xe0\xfb\xb4\xc3\x13\x06\xe1\x38\x14\xd4\x5d\x51\x9b\x02\x24\x56\x25\x2b\xfe\xa0\xa7\xf1\x8a\x9f\xd2\x3e\x3c\xfe\x16\x1b\x26\x6a\x61\x79\x72\x25\x29\x52\x31\x02\x9d\xd0\x0d\xdf\x84\xb2\x69\xba\x6b\xec\x15\xc4\x33\x39\xc2\xed\xa9\xb5\x79\xa1\x13\x00\xac\x0b\x83\x04\xf9\x9d\xf8\xc2\x15\x43\xb0\xea\xc6\x60\x32\x8f\xe1\x19\x5c\xe4\x04\xb7\xd6\x8
7\xf2\x57\x7f\xea\xc1\x6c\x31\xd9\x97\x93\xf4\xa0\xaa\x65\x2b\xb7\xca\x72\x6b\xf1\xb9\xa5\x75\xba\x65\xa7\xfc\x76\x83\x3d\xea\xae\x6d\x7a\x32\xa7\xe1\xc7\x91\xbf\x6b\xfc\x99\x78\xb8\xec\xa4\x7f\xc9\x44\x0c\x83\x24\x58\x82\x2c\x5c\x5a\x43\x8c\xec\xdb\xe7\x16\x15\xa2\x76\x91\x8f\x12\xe6\x1a\xd3\x49\xd9\xa0\x45\xec\x95\x8e\x18\x80\x0a\xf5\x8d\x90\x3a\x9a\xc5\xc4\x3a\x9b\x71\x34\xea\x59\xd2\x9d\xa3\x4b\x60\x61\x90\xb2\xa2\x68\xd0\x12\x91\x0d\xc7\x95\x62\x84\x9e\xec\xbc\xc9\x26\xac\x76\xa5\x0b\x09\xe3\x73\xcf\x7e\x36\x16\xa7\x62\x6b\x13\x29\x5d\xa3\x86\xab\xcc\xb7\x31\x28\x46\x1e\x8c\xe7\xc1\x88\x45\xec\x21\xce\xad\x5c\x1c\xdf\x42\x83\x37\x82\xf4\x95\x77\x23\x6d\x79\x31\x75\xfa\x4c\x41\xdc\xb5\xc8\xa2\x36\xf5\x31\x9c\x13\x50\x88\x63\xcd\xb5\x89\x8d\x63\x21\x32\x4d\xfd\xa8\xef\x11\x81\x38\x7c\x30\x92\x00\xda\x9e\xbb\x35\xa0\x19\xeb\xed\x78\xd2\xbe\xea\x48\x76\xec\x96\xad\xf5\x2e\x89\x88\x84\xcf\x34\xfc\x59\x5a\x49\xe2\xf0\x0d\xd9\xff\x63\xe3\xa5\xee\x63\x67\x9d\x6a\xf1\xa0\x28\x7e\xdb\xb2\x79\xa9\x9f\xef\x56\x52\x78\xa1\x61\xfd\xcc\x14\x69\x8f\xbc\xbd\xbb\x49\x62\xaf\x7b\x45\x65\x6a\x93\xc1\x69\xf5\xe7\xee\xcd\xef\xc3\xd5\xa1\x1f\x01\x34\x6a\x2d\x1e\x52\xfe\x79\xdc\x90\x1f\x63\x24\x36\x63\x81\x97\x58\x84\xbc\x5e\x9b\x5f\x3a\xfb\x4a\x6a\xb9\x02\x7b\x6a\xda\x19\x58\x29\x0d\xa3\x46\xc3\xd9\x68\xb2\x37\x8b\x5c\xdf\x9c\x01\xad\x8a\xd0\x8e\xa6\xdd\x63\x6f\x5b\xf5\x9e\xe2\xfd\xf2\x8c\xa6\x49\x47\x13\x91\x4a\x9d\xbf\x0d\xf1\xfc\xf4\x1e\x40\x94\x82\x98\x69\x1f\xea\x68\x56\x4e\x06\xc7\xed\x5f\x87\xb6\x2e\x6d\xa8\xe2\xb8\x68\x6f\xe5\xb4\xf1\xbc\x5e\xe6\x9c\x5e\xdf\x32\xd7\x63\x9a\x07\x57\x2a\x4f\x95\x16\xbb\xc3\x1f\x02\xc3\x8d\x02\xd6\x85\xf1\xc0\xda\xdf\xa3\xb8\x95\x06\x60\x42\x95\xd8\xee\x70\x6c\xa8\x37\xa1\x5c\x71\xd2\xa7\xcc\x1f\xdb\x6d\xe4\x2d\xdc\x44\xd4\x7f\x7c\xa1\xa9\x5d\x54\x29\xe2\xa6\x27\xde\x63\x57\x96\x02\x89\xfc\xef\x29\x54\x99\xa3\x1b\x12\x35\x71\xec\x7a\x36\x33\xf1\x9e\x6f\x5d\x08\x76\x70\x91\x1b\x2f\x84\x2e\xc0\xe7\xf3\x6b\xa4\xaa\x82\x91\x1c\x8b\x21\xfc\x5d\xaa\x4d\xea\xa6\x88\xcc\xba\x20\x40\x8b\x95\x8e\xa4\x5e\x5e\x35\x4e\x11\x33\x5f\x20\xa5\x17\x31\xfe\x6a\x73\xad\xd0\xfb\x3c\xec\x4b\xb1\x4a\x4b\x58\x53\xc3\x88\x8f\xd3\x30\x79\x6e\x04\xd9\x22\x64\x0a\x60\x52\x5e\xf0\x95\x33\x5c\x26\x5f\x17\xe4\x60\x7c\x75\xa1\x5e\x9d\x54\xcc\x0d\xa5\x45\x27\x09\x1b\x78\x21\xbc\xb0\x77\xac\xfd\x6d\xb4\x80\x1c\x4c\xb8\x4a\x61\x01\x01\xf1\x86\x23\x49\xbe\x0d\x04\xd3\x05\x73\xff\x92\x1d\x2d\xf8\xd9\x07\x29\x6c\xa2\x53\xac\x70\xad\xbd\x89\x69\x64\x06\xae\xc2\xac\xf4\xc2\x5e\xb1\x6b\x6d\x1c\xfc\x0b\xc0\xd0\x36\x1e\xb9\xbe\x24\xa1\x6d\xaf\x8f\xb3\xdd\x8b\xf5\x11\x4d\x7e\x01\x2e\x36\xcd\xf8\x2c\x9c\x26\x11\x08\xa2\x44\x50\xee\x2b\xdb\xab\xa9\x52\x40\x87\x63\xd1\x98\x9b\x0d\x9b\x47\x1e\x82\x51\xcb\x34\xeb\x02\x57\x2b\xc3\x27\x58\xd6\x7e\xa7\x36\xda\xf6\xdf\x25\x73\x72\xc8\x84\xdc\xea\xf5\x68\xd6\x4e\x33\x06\x9b\xe2\x59\x8b\x9a\x7e\x87\x45\x06\x1c\xdc\x3e\x13\x5c\xfd\x79\xc9\x85\xbd\xf1\x91\x20\x23\xd2\x9b\x9e\xa6\xc2\x13\x09\x81\xe4\xce\x62\x8a\xb2\xe6\xbe\x78\xb3\x6f\xe5\x9b\x17\x22\x27\x66\x60\x78\xaf\xbd\xe9\x78\x7f\xf7\x8b\x02\x14\x96\x6e\xbb\xd0\x27\x9e\x73\xea\x08\x65\xc1\xa1\x72\xb1\x10\xda\xd5\xeb\x92\x09\x59\xea\x67\x20\xd0\x66\x8d\xac\x63\xd5\x93\x4d\xe9\x84\x31\x78\xda\x69\xaa\x43\xb8\x20\xbf\xb6\xfb\xf6\x3f\x78\x97\x09\x21\xd0\xb0\xb7\x64\xa7\x1f\x34\xe5\x6b\xcb\xad\x9c\x12\x5b\x00\xe9\xfd\x08\x86\x33\xe7\xd5\x42\xf0\x08\xf5\xb1\x64\xac\xd2\xdc\xdd\x9f\xf2\x39\xda\xbc\x3a\xb7\xaf\x9d\x0a\x43\x15\xf3\x6c\x0a\x73\x9d\x36\xaf\x31\x02\x1b\xd1\xd0\x7a\x02\xc8\xc6\xe5\xc4\x44\x23\x30\xd9\xf5\x9e\x84\x2a\x
5a\x81\x66\x8b\xff\x56\xf3\x9b\xd3\xa2\xbf\xf5\xe3\x86\xa9\x81\x26\xbd\x90\xa1\x6f\xb1\xef\x9e\x6b\x33\xa1\xe4\xbf\x2a\xd0\x79\x23\x15\x05\xab\x09\x8e\x12\x0b\x42\xa6\x03\x55\x6b\x94\x9c\x7f\xa0\x15\xfb\xa9\xd7\xa0\xb9\x22\x0c\x45\x8d\x7e\x37\x85\xb0\x81\x02\x18\x7d\x5b\xcc\xb6\x2e\x46\x9a\xe8\xf9\xc6\x7d\x01\x3b\x66\xd0\xe6\x3c\x9e\x54\x57\x26\x62\x91\x36\xa7\x13\x3b\x4b\x16\xab\x90\x02\xd7\x1c\xa9\x43\xb2\xe0\xff\x42\x78\x5d\xab\x94\xf4\x72\xe6\x77\xc1\xb7\xaa\xc1\xac\x19\x07\xb4\x2f\x50\x71\x33\xc3\x16\x44\x80\x3c\x40\xf7\x8c\x90\x5c\x15\x36\x76\x90\x20\xd2\xc1\xe4\xd6\x1c\x3a\x5e\x72\xb3\x26\x9e\x3f\x92\x64\xc8\xcd\xa7\x46\xcb\xe3\x92\x9f\x25\xe6\x01\x2a\xea\x0f\xf6\x89\x1c\x3e\x2d\xda\x20\x44\x30\xc0\x3f\x49\xee\xd7\x33\x96\x5b\x99\x75\x25\x87\xf0\xde\xbb\xbb\xdd\x86\x5a\x80\x3d\x25\x97\x29\x2c\xce\x9b\x18\xcf\x9c\x14\x34\xd0\x65\xb4\x0b\xfa\x01\xd4\xae\x5d\x58\x64\x89\x66\xc0\xcd\xda\x1a\x60\x71\x94\x07\xf5\xdd\x6a\x28\xd7\x79\xe5\xa0\xf0\xeb\x2c\x0b\x8f\x5c\x76\x44\x97\x46\xc7\xa4\x66\xf2\x5e\xd7\xbc\x35\x9b\x58\xa9\x74\x7d\x2b\x12\xe5\x3d\x53\x73\xab\xba\x04\x5f\x4a\xe6\xfd\x51\xf7\x3b\x84\xc9\x4d\x5a\x14\xbc\x9b\xa5\x8d\x44\xe8\x7b\x00\xb8\x16\xe6\x26\x4e\x3e\x32\x58\x43\x8f\x8a\x54\xff\xc5\x94\xae\xcf\x4b\x68\xc1\xb2\xd3\x86\x08\x4c\xc5\xaa\x6b\x45\xa3\xd0\x66\x9b\xfc\x0e\x84\x80\x41\xd3\xc1\x3c\x8e\x6e\x42\xb0\xb5\xe5\x5c\x99\x23\x1f\xce\xe4\x1f\x02\xc4\x3d\xa9\xc9\xf2\x87\xc8\x89\xa1\xcf\xd1\x06\xba\xe0\x20\x40\x8f\x06\xe2\xc8\xf3\x17\xc2\x7e\xe2\x7f\xa9\x91\xd8\x7d\x02\xea\x30\x6d\x0a\xaa\x27\x59\xcb\x7c\xfc\xdb\xd9\x4c\x89\x43\xa3\x1e\xbc\x60\xb2\x67\xdc\x54\x90\x57\x0b\x49\x99\x8e\xe5\x5c\xf6\x0a\xa5\x90\xde\x49\x9d\x6c\x87\x4a\xd6\xd3\x9a\xd7\x48\x08\x3a\x5a\x52\x03\xa2\x2e\xfe\x24\xf9\x7b\x50\xcc\x01\xf9\xcd\xbf\x7d\x07\x7b\xeb\x74\x9a\xdd\x5a\x42\x01\xd0\xb5\x2b\x95\x89\xb1\xc5\x49\x4d\xe6\xc2\x18\x01\xe2\x0b\x3f\xe1\x75\x9b\x5f\x00\xb6\xb7\x14\x1a\xf7\x6d\x97\xd0\x0c\x2d\xfd\xb3\xe0\xb9\x84\x0d\x4f\xe1\x12\x5e\x09\x97\x47\x28\xe1\xe2\x5a\x5d\x45\xda\x92\x5b\x57\x76\xd1\x3f\x4a\x62\xa3\xc1\xcf\x46\x9e\x8c\x55\xd2\xe6\x33\xce\xf5\xe6\x37\x72\xa6\xb4\xb8\xab\xe6\xd7\x59\xee\x09\x80\x29\x1b\x73\xa8\x66\x61\x21\x73\xae\x3e\x87\x7c\x25\x1d\x33\x71\x1d\xe6\x42\x55\x3b\x13\xa8\xe5\x65\x8a\x99\x43\x0f\x8e\x75\xe9\xf1\x50\x9c\x63\x96\x90\x96\x55\xee\xd4\x1d\xc6\x2a\x4d\xe5\xfc\x8a\x3e\x00\xca\xb8\x7d\xc5\x32\x3c\x4f\xba\x9f\xfe\xfa\x6e\xe4\xa6\x55\x19\x9c\xa8\x11\x93\x7c\x6c\x0e\x77\xc8\xd4\x5a\x8c\xc4\xdc\xae\xe4\x5e\x88\xc9\x93\xa3\x50\xbe\x39\x94\xd3\x5c\xa9\xbb\xff\xed\x42\xc2\x83\x72\x41\x84\x80\x45\xa3\x3a\x8e\x67\x4d\x99\xe3\x40\xe8\x2e\x40\x68\x62\x85\x20\x77\x5d\x6e\xe6\x82\xbb\x5f\xca\x04\x65\x30\x92\x38\xe7\xfa\xc6\x2a\x72\xd1\x5f\x3b\x39\x85\xe5\x86\x05\x99\xd5\x72\x3b\x2d\x43\xdc\x74\x8f\xae\x26\x15\x8c\xf2\x68\xf3\xc7\x0d\x82\xcc\xde\x26\xb7\x06\xb9\x86\x28\x53\x2d\xcc\x7d\x4b\xac\x8e\x4f\xb1\x37\x34\x92\x69\x38\x13\x02\xef\xc3\x56\x66\x06\xb5\x2b\x67\xd4\x45\x7c\xb4\xe3\x15\x6b\xe4\x3c\x1a\x8c\x9d\xa5\x1a\x90\xc5\xce\x44\x81\xe9\xb6\xad\xd4\x50\x30\x76\x16\xd7\x3b\x2f\x70\x10\xe6\xb6\x25\xf7\x08\x3d\x21\xb7\x6a\x9a\x50\x0f\xdd\x14\xf0\x87\xd2\x30\xa4\x60\xdd\xf9\xb3\xd3\xaf\x32\xac\x83\x47\x21\x14\xbe\x1d\x67\x0e\x4b\xe7\x33\xe2\x21\x60\xea\x0a\x71\xbf\x84\x4d\x79\xec\x02\x1c\x02\xc1\xde\x16\xbb\xc7\x45\xad\xbf\x85\xfe\x68\x7f\x8b\x67\x78\x2b\x47\x19\x10\xdb\xaa\x7e\xb3\xf2\xc8\x0a\x5a\xca\x87\xd3\xc6\xd8\x2f\x84\x63\x98\x95\x50\x9c\x13\x21\xb0\x2c\x5e\x4d\x7d\x35\x43\xce\x43\xdd\xee\x5c\x18\x38\x41\x52\x77\x55\x0d\x20\x78\x24\x9d\xcb\xc6\x82\xb0\x36\x71\x76\x11\x2f\
x92\xd5\xc0\x6c\x09\xf0\x82\x99\x80\x90\x88\x74\x59\x50\x1f\xb2\x42\xe3\xc3\x3a\x94\x1d\x09\xd1\xdc\x71\x8f\xf8\x96\xa6\x9c\x34\xe5\xd7\xbf\x6f\x15\x15\x94\x8b\x68\xf6\xfb\x1c\x72\x3c\x01\x42\x52\xc1\x17\x55\x4e\x40\x0a\xd4\xd1\x79\x59\xdf\x39\xff\xa9\xfc\x91\x1b\x8f\xd5\x06\x41\xa6\x3a\x73\xd8\xea\x0c\x47\xf8\xe2\x1f\x1e\xdb\x45\xc6\x57\x26\x01\x6c\x79\xba\x8a\x52\xd2\x83\x8b\x4d\x71\xe6\xc2\x64\xa7\x94\xdf\x57\xc1\x45\x8b\xe2\x3c\x79\xb0\xf4\xed\x61\x99\x71\xea\xaa\x38\xb1\x3d\xea\xd4\x35\xd3\x8b\x51\xb7\xef\xf9\xd5\x4d\xa1\x52\xb5\xcf\x15\x24\x75\xd3\xb9\x4d\xd8\xb5\xfc\xd0\xf2\x64\x82\x51\x72\x63\x36\x84\xd0\x1c\x1a\x18\x6e\x76\xe4\x6d\x63\x7b\x4d\x54\x7a\xc1\x8f\xb3\xfa\xef\x01\xca\xb5\xae\xbd\xb8\xea\x0f\x27\xcf\x87\xc5\xcd\x5e\x3f\x0f\xea\x84\x4f\xc2\x41\x00\x1e\x26\x7f\xf7\x3a\x62\xb7\x65\x92\xc0\xda\xe3\x25\xc8\xf7\xad\xfa\xf2\xf2\x62\x0b\x0c\x43\x00\xfd\xfc\x28\x68\x04\x5d\xa2\x9d\xca\x6a\xf5\xdd\x8c\xdd\xed\xc0\xac\x0f\x91\x32\xc5\x78\x3e\xbc\xae\xa2\x85\x4a\xd5\x8f\x65\x72\xe3\x22\x1a\xc9\xa0\xe8\xa3\x0a\xb8\xa5\x6e\x63\xf8\x1f\x84\xbc\x8f\x06\x41\x8b\x49\x7d\x50\x21\x32\xd3\x52\x7c\x3d\x56\x42\xb1\x43\x48\x66\x6a\x0b\x05\x6d\x4e\x31\x06\x27\xdb\x1e\xf9\x4f\x96\x31\x2e\x7a\x6b\x09\xe4\xb0\x32\x65\x9d\x8a\x94\x4f\xf7\x50\x9b\x2c\x06\x54\xf4\x7e\xff\x13\x3c\xee\x7e\xa0\xd7\xf5\x57\xed\x80\xf4\xd8\x8f\x54\x3d\x7b\x6d\x52\x56\xe8\x39\x15\x07\xd7\xb6\x4d\xb5\xe4\x70\xad\x19\x1d\x35\x1d\x2d\x22\x9f\x5b\x24\x80\xab\xfd\xa2\x99\xd9\xae\x07\x51\x31\xda\x5a\x8a\x64\x14\x79\x25\xcc\x48\x8f\xf2\x45\xe8\x46\x44\x56\x74\x22\xbd\xbd\xf1\x5a\x14\x52\xfe\x28\x2c\xdf\x2e\x96\xdf\xf5\xa3\xed\x4f\x9b\x63\x1c\x70\x7f\xf5\x85\xc8\x40\x8b\x51\x2a\x38\x28\x17\x25\xc5\x85\x33\x4a\xa5\xc9\xb6\xe0\x63\xf0\x7b\x2c\xf9\xb1\xb2\x5a\x68\x68\x1c\x9a\xc0\xc1\x41\xea\x35\x65\x64\xbc\x02\x7d\xae\xa7\x19\xcf\x97\x26\x44\x77\x17\x7f\x9c\x1a\x2f\xf8\x5c\xdc\xa9\x26\xe2\x8e\x89\x69\xab\x12\xd1\xc0\xc4\xcc\x71\xfd\xf9\xad\xc2\x20\x89\x73\x70\x17\xff\x96\x71\xba\x9d\xde\x1c\x7c\xf8\xed\x9f\x70\xbd\xf6\xda\x34\xa7\xeb\x59\xf5\xd5\x8c\xb3\x11\xde\x31\x6d\xd7\xda\x9b\x82\xb3\x72\x16\xbf\xa0\xbf\xd2\xc3\x85\xc1\xfb\x2e\x88\xf0\xf9\x9e\x62\xfa\x91\x4a\x34\x3f\xa6\x85\x1e\xaa\x27\x0a\xac\x2c\x63\x74\xbd\x63\x4f\xa2\xe1\x8f\x27\x8f\x06\xac\xdd\x09\x43\x34\xfb\xe8\xa8\x6c\x99\xcb\xe0\x79\xde\xa4\x7f\x6b\x01\xd1\xa6\x1b\xcf\x9b\x3f\x25\x50\x8d\x5d\xaf\x26\xa0\x45\xd6\x7d\x0f\x40\xb7\x44\x85\x9d\x60\x47\x7b\xbd\x46\xf1\xae\xef\x74\xb1\x8e\xec\x72\x1c\x51\x4a\xd1', 2) | 30,887 | 30,887 | 0.749992 | 7,717 | 30,887 | 3.000259 | 0.033821 | 0.005183 | 0.005442 | 0.004665 | 0.001814 | 0.001814 | 0.001037 | 0 | 0 | 0 | 0 | 0.313334 | 0.000097 | 30,887 | 1 | 30,887 | 30,887 | 0.436342 | 0 | 0 | 0 | 0 | 1 | 0.998705 | 0.998705 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
734fc215a53c4e5d9bcf6e5211e70716382624ea | 147 | py | Python | python_developer_tools/cv/bases/attentions/SimAM-master/networks/attentions/__init__.py | carlsummer/python_developer_tools | a8c4365b7cc601cda55648cdfd8c0cb1faae132f | [
"Apache-2.0"
] | 32 | 2021-06-21T04:49:48.000Z | 2022-03-29T05:46:59.000Z | python_developer_tools/cv/bases/attentions/SimAM-master/networks/attentions/__init__.py | HonestyBrave/python_developer_tools | fc0dcf5c4ef088e2e535206dc82f09bbfd01f280 | [
"Apache-2.0"
] | 1 | 2021-11-12T03:45:55.000Z | 2021-11-12T03:45:55.000Z | python_developer_tools/cv/bases/attentions/SimAM-master/networks/attentions/__init__.py | HonestyBrave/python_developer_tools | fc0dcf5c4ef088e2e535206dc82f09bbfd01f280 | [
"Apache-2.0"
] | 10 | 2021-06-03T08:05:05.000Z | 2021-12-13T03:10:42.000Z | from .. import find_module_using_name
def get_attention_module(attention_type="none"):
return find_module_using_name(attention_type.lower()) | 21 | 57 | 0.816327 | 21 | 147 | 5.238095 | 0.619048 | 0.181818 | 0.272727 | 0.345455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 147 | 7 | 57 | 21 | 0.827068 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 7 |
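A minimal usage sketch for the attention factory above, assuming the package layout suggested by the repo path (networks/attentions) and that find_module_using_name returns a constructor (or None when nothing matches); the name "SimAM" is an assumption taken from the repo name, not confirmed by this row:

# Hypothetical usage of get_attention_module (module path and "SimAM" are assumed).
from networks.attentions import get_attention_module

attention_cls = get_attention_module("SimAM")  # lookup is case-insensitive via .lower()
if attention_cls is not None:
    block = attention_cls()  # instantiate the resolved attention block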
735cf2da167a39b9fda6366da31e71c98c6af35f | 2,581 | py | Python | udder/conf/build_led_array.py | edcolmar/udder | 946fdb7abc6b808f0f54cf4ad9bb2210861256ab | [
"Apache-2.0"
] | null | null | null | udder/conf/build_led_array.py | edcolmar/udder | 946fdb7abc6b808f0f54cf4ad9bb2210861256ab | [
"Apache-2.0"
] | null | null | null | udder/conf/build_led_array.py | edcolmar/udder | 946fdb7abc6b808f0f54cf4ad9bb2210861256ab | [
"Apache-2.0"
] | null | null | null | # build a json file with the studio config
pixel_spacing = 0.5
strip_gap = 5.0
initial_y_location = 0.0
initial_x_location = 0.0
initial_z_location = 0.0
current_y_location = initial_y_location
current_x_location = initial_x_location
current_z_location = initial_z_location
current_address = 0
build_list = []
# build something like this
# --
# | |
# |
# |
# 64
# 30 64
# 64
# 30
# 125-189
# 95-125 189-253
# 31-95
# 1-30
for i in range(30):
    print(i)
build_list.append({
'address': current_address,
'group': 0,
'point': [
current_x_location,
current_y_location,
current_z_location
]
})
current_y_location = current_y_location - pixel_spacing
current_address = current_address + 1
current_y_location = current_y_location - strip_gap
for i in range(64):
    print(i)
build_list.append({
'address': current_address,
'group': 0,
'point': [
current_x_location,
current_y_location,
current_z_location
]
})
current_y_location = current_y_location - pixel_spacing
current_address = current_address + 1
current_y_location = current_y_location - strip_gap
for i in range(30):
    print(i)
build_list.append({
'address': current_address,
'group': 0,
'point': [
current_x_location,
current_y_location,
current_z_location
]
})
current_y_location = current_y_location - pixel_spacing
current_address = current_address + 1
current_y_location = current_y_location - strip_gap
current_x_location = current_x_location + strip_gap
for i in range(64):
    print(i)
build_list.append({
'address': current_address,
'group': 0,
'point': [
current_x_location,
current_y_location,
current_z_location
]
})
current_x_location = current_x_location + pixel_spacing
current_address = current_address + 1
current_y_location = current_y_location + strip_gap
current_x_location = current_x_location + strip_gap
for i in range(64):
    print(i)
build_list.append({
'address': current_address,
'group': 0,
'point': [
current_x_location,
current_y_location,
current_z_location
]
})
current_y_location = current_y_location + pixel_spacing
current_address = current_address + 1
print(build_list)
| 20.164063 | 59 | 0.621465 | 314 | 2,581 | 4.700637 | 0.130573 | 0.294715 | 0.238482 | 0.276423 | 0.79607 | 0.79607 | 0.784553 | 0.784553 | 0.784553 | 0.784553 | 0 | 0.036212 | 0.304533 | 2,581 | 128 | 60 | 20.164063 | 0.786072 | 0.058504 | 0 | 0.768293 | 0 | 0 | 0.035182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.073171 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
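The script above repeats the same append loop five times with different pixel counts and step directions; a condensed sketch of the first two strips using a helper (write_strip is hypothetical, not from the repo):

# Condensed sketch of the same build logic; write_strip is a hypothetical helper.
def write_strip(build_list, address, x, y, z, count, dx=0.0, dy=0.0):
    for _ in range(count):
        build_list.append({'address': address, 'group': 0, 'point': [x, y, z]})
        x, y, address = x + dx, y + dy, address + 1
    return address, x, y

pixel_spacing, strip_gap = 0.5, 5.0
build_list, addr, x, y, z = [], 0, 0.0, 0.0, 0.0
addr, x, y = write_strip(build_list, addr, x, y, z, 30, dy=-pixel_spacing)  # first 30-LED strip
y -= strip_gap
addr, x, y = write_strip(build_list, addr, x, y, z, 64, dy=-pixel_spacing)  # next 64-LED strip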
b41fd200f2b41d0e3620dfed72691eea397f4286 | 12,730 | py | Python | cell_annotator/cell_annotator/my_classes.py | slimaneaymen/Malaria-Detection | 4b94ed005a660dc89794d6544810a3f4fb68e2b5 | [
"MIT"
] | null | null | null | cell_annotator/cell_annotator/my_classes.py | slimaneaymen/Malaria-Detection | 4b94ed005a660dc89794d6544810a3f4fb68e2b5 | [
"MIT"
] | null | null | null | cell_annotator/cell_annotator/my_classes.py | slimaneaymen/Malaria-Detection | 4b94ed005a660dc89794d6544810a3f4fb68e2b5 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Sep 9 15:49:49 2019
@author: gourgue
Adapt the code for compressed images.
"""
#%%
from tensorflow.keras.utils import Sequence, to_categorical
from tensorflow.keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img, save_img
from skimage.io import imread
import numpy as np
import os, time, zlib
import matplotlib.pyplot as plt
from PIL import Image
class DataGenerator(Sequence):
def __init__(self, list_IDs, batch_size=64, dim=[84,84], n_channels=3, n_classes=2,
shuffle=True, LEDS=False):
'Initialisation'
self.batch_size = batch_size
self.dim = dim
self.list_IDs = list_IDs
self.n_channels = n_channels
self.n_classes = n_classes
self.shuffle = shuffle
self.LEDS = LEDS
self.on_epoch_end()
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.list_IDs))
if self.shuffle is True:
np.random.seed(1)
np.random.shuffle(self.indexes)
def __data_generation(self, list_IDs_temp):
'Generates data containing batch_size samples'
#Initialisation
X = np.empty((self.batch_size, *self.dim, self.n_channels))
y = np.empty((self.batch_size), dtype=int)
#Generate data
#after led
X=get_images(list_IDs_temp, self.n_channels, self.LEDS)
y=get_labels(list_IDs_temp)
return X, to_categorical(y, num_classes=self.n_classes)
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.list_IDs) / self.batch_size))
def __getitem__ (self, index):
'Generate one batch of data'
#Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
#Find list of IDs
list_IDs_temp = [self.list_IDs[k] for k in indexes]
#Generate data
X, y = self.__data_generation(list_IDs_temp)
return X, y
def get_labels(liste):
y=np.zeros([len(liste),1])
for i, path in enumerate(liste):
        travel, name = os.path.split(path)
if 'healthy' in name:
y[i]=0
elif 'infected' in name:
y[i]=1
return y
def get_images(liste, nb_channels, LED=False):
for i, path in enumerate(liste):
#image=imread(path,pilmode="RGB",as_gray=True)
image = np.array(Image.open(path))
if i==0:
if LED is False:
if(image.shape[0])==35:
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
X=np.zeros([len(liste), image.shape[0], image.shape[1],nb_channels])
elif LED == "multi_led":
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
if LED is False:
if(image.shape[0])==35:
X[i,:,:,0]=image[0,:,:]
X[i,:,:,1]=image[0,:,:]
X[i,:,:,2]=image[0,:,:]
elif len(image.shape)==3:
X[i]=image
elif len(image.shape)==2:
X[i,:,:,0]=image
X[i,:,:,1]=image
X[i,:,:,2]=image
elif LED =='multi_led':
image=np.moveaxis(image,0,2)
X[i]=image[:,:,:nb_channels]
else:
image=np.moveaxis(image,0,2)
X[i]=image[:,:,LED]
    return X/255  # scale to [0, 1]; the commented alternative X/127 - 1 maps to roughly [-1, 1]
class DataGeneratorPhase(Sequence):
def __init__(self, list_IDs, batch_size=64, dim=[84,84], n_channels=3, n_classes=2,
shuffle=True, LEDS=False):
'Initialisation'
self.batch_size = batch_size
self.dim = dim
self.list_IDs = list_IDs
self.n_channels = n_channels
self.n_classes = n_classes
self.shuffle = shuffle
self.LEDS = LEDS
self.on_epoch_end()
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.list_IDs))
if self.shuffle is True:
np.random.seed(1)
np.random.shuffle(self.indexes)
def __data_generation(self, list_IDs_temp):
'Generates data containing batch_size samples'
#Initialisation
X = np.empty((self.batch_size, *self.dim, self.n_channels))
y = np.empty((self.batch_size), dtype=int)
#Generate data
#after led
X=get_imagesPhase(list_IDs_temp, self.n_channels, self.LEDS)
y=get_labels(list_IDs_temp)
return X, to_categorical(y, num_classes=self.n_classes)
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.list_IDs) / self.batch_size))
def __getitem__ (self, index):
'Generate one batch of data'
#Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
#Find list of IDs
list_IDs_temp = [self.list_IDs[k] for k in indexes]
#Generate data
X, y = self.__data_generation(list_IDs_temp)
return X, y
def get_imagesPhase(liste, nb_channels, LED=False):
from imageio import imread
for i, path in enumerate(liste):
path = path.split('.')[0] + '.' + path.split('.')[1][:4]
path_phase = path.replace("inten","phase")
image= imread(path)
image_phase = imread(path_phase)
nule = np.zeros([image.shape[0], image.shape[1]])
if i==0:
if LED is False:
if(image.shape[0])==35:
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
X=np.zeros([len(liste), image.shape[0], image.shape[1],nb_channels])
elif LED == "multi_led":
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
if LED is False:
if(image.shape[0])==35:
X[i,:,:,0]=image[0,:,:]
X[i,:,:,1]=image[0,:,:]
X[i,:,:,2]=image[0,:,:]
elif len(image.shape)==3:
X[i]=image
elif len(image.shape)==2:
X[i,:,:,0]=image
X[i,:,:,1]=image_phase
X[i,:,:,2]=nule
elif LED =='multi_led':
image=np.moveaxis(image,0,2)
X[i]=image[:,:,:nb_channels]
else:
image=np.moveaxis(image,0,2)
X[i]=image[:,:,LED]
    return X/255  # scale to [0, 1]; the commented alternative X/127 - 1 maps to roughly [-1, 1]
class DataGeneratorTopHat(Sequence):
def __init__(self, list_IDs, batch_size=64, dim=[84,84], n_channels=3, n_classes=2,
shuffle=True, LEDS=False):
'Initialisation'
self.batch_size = batch_size
self.dim = dim
self.list_IDs = list_IDs
self.n_channels = n_channels
self.n_classes = n_classes
self.shuffle = shuffle
self.LEDS = LEDS
self.on_epoch_end()
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.list_IDs))
if self.shuffle is True:
np.random.seed(1)
np.random.shuffle(self.indexes)
def __data_generation(self, list_IDs_temp):
'Generates data containing batch_size samples'
#Initialisation
X = np.empty((self.batch_size, *self.dim, self.n_channels))
y = np.empty((self.batch_size), dtype=int)
#Generate data
#after led
X=get_images_tophat(list_IDs_temp, self.n_channels, self.LEDS)
y=get_labels(list_IDs_temp)
return X, to_categorical(y, num_classes=self.n_classes)
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.list_IDs) / self.batch_size))
def __getitem__ (self, index):
'Generate one batch of data'
#Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
#Find list of IDs
list_IDs_temp = [self.list_IDs[k] for k in indexes]
#Generate data
X, y = self.__data_generation(list_IDs_temp)
return X, y
def get_images_tophat(liste, nb_channels, LED=False):
for i, path in enumerate(liste):
image = imread(path)
if i==0:
            # first pass: initialization
X = np.zeros([len(liste), image.shape[0], image.shape[1],2])
X[i,:,:,0]=image
traveling, name = os.path.split(path)
traveling+='_tophat/'
tophat = imread(traveling+name)
X[i,:,:,1]=tophat
return X/255
class DataGeneratorCentral(Sequence):
def __init__(self, list_IDs, batch_size=64, dim=[84,84], n_channels=3, n_classes=2,
shuffle=True, LEDS=False):
'Initialisation'
self.batch_size = batch_size
self.dim = dim
self.list_IDs = list_IDs
self.n_channels = n_channels
self.n_classes = n_classes
self.shuffle = shuffle
self.LEDS = LEDS
self.on_epoch_end()
def on_epoch_end(self):
'Updates indexes after each epoch'
self.indexes = np.arange(len(self.list_IDs))
if self.shuffle is True:
np.random.seed(1)
np.random.shuffle(self.indexes)
def __data_generation(self, list_IDs_temp):
'Generates data containing batch_size samples'
#Initialisation
X = np.empty((self.batch_size, *self.dim, self.n_channels))
y = np.empty((self.batch_size), dtype=int)
#Generate data
#after led
X=get_imagesCentral(list_IDs_temp, self.n_channels, self.LEDS)
y=get_labels(list_IDs_temp)
return X, to_categorical(y, num_classes=self.n_classes)
def __len__(self):
'Denotes the number of batches per epoch'
return int(np.floor(len(self.list_IDs) / self.batch_size))
def __getitem__ (self, index):
'Generate one batch of data'
#Generate indexes of the batch
indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]
#Find list of IDs
list_IDs_temp = [self.list_IDs[k] for k in indexes]
#Generate data
X, y = self.__data_generation(list_IDs_temp)
return X, y
def get_imagesCentral(liste, nb_channels, LED=False):
for i, path in enumerate(liste):
image=imread(path)
#image = np.array(Image.open(path))
if i==0:
if LED is False:
if(image.shape[0])==35:
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
X=np.zeros([len(liste), image.shape[0], image.shape[1],nb_channels])
elif LED == "multi_led":
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
else:
# image=np.moveaxis(image,0,2)
X=np.zeros([len(liste), image.shape[1], image.shape[2],nb_channels])
if LED is False:
if(image.shape[0])==35:
X[i,:,:,0]=image[0,:,:]
X[i,:,:,1]=image[0,:,:]
X[i,:,:,2]=image[0,:,:]
elif len(image.shape)==3:
X[i]=image
elif len(image.shape)==2:
X[i,:,:,0]=image
X[i,:,:,1]=image
X[i,:,:,2]=image
elif LED =='multi_led':
image=np.moveaxis(image,0,2)
X[i]=image[:,:,:nb_channels]
else:
image=np.moveaxis(image,0,2)
X[i]=image[:,:,LED]
return X/255#X/127-1 | 32.979275 | 114 | 0.543441 | 1,672 | 12,730 | 3.967105 | 0.091507 | 0.046434 | 0.039801 | 0.03166 | 0.875019 | 0.864466 | 0.853912 | 0.853912 | 0.853912 | 0.853912 | 0 | 0.022594 | 0.332443 | 12,730 | 386 | 115 | 32.979275 | 0.757943 | 0.120189 | 0 | 0.846743 | 0 | 0 | 0.06019 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095785 | false | 0 | 0.030651 | 0 | 0.206897 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
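A usage sketch for the DataGenerator above; the file paths and the model are placeholders, and the labels are inferred from the 'healthy'/'infected' substrings in the file names, as get_labels shows:

# Hypothetical training setup; paths and model are placeholders.
train_paths = ['cells/healthy_0001.png', 'cells/infected_0001.png']  # assumed layout
train_gen = DataGenerator(train_paths, batch_size=2, dim=[84, 84],
                          n_channels=3, n_classes=2, shuffle=True, LEDS=False)
# model.fit(train_gen, epochs=10)  # any tf.keras model taking (84, 84, 3) inputs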
b44a6b9313a7940a2ad8982b3fd66de835beb3e7 | 144 | py | Python | cechmate/filtrations/__init__.py | amish-mishra/cechmate-DR | e92e8b455eb2315ee691418aee8a91937bc827cb | [
"MIT"
] | null | null | null | cechmate/filtrations/__init__.py | amish-mishra/cechmate-DR | e92e8b455eb2315ee691418aee8a91937bc827cb | [
"MIT"
] | null | null | null | cechmate/filtrations/__init__.py | amish-mishra/cechmate-DR | e92e8b455eb2315ee691418aee8a91937bc827cb | [
"MIT"
] | null | null | null | from .alpha import *
from .rips import *
from .cech import *
from .del_rips import *
from .extended import *
from .miniball import get_boundary
| 20.571429 | 34 | 0.756944 | 21 | 144 | 5.095238 | 0.47619 | 0.46729 | 0.261682 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 144 | 6 | 35 | 24 | 0.891667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
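This __init__ aggregates the filtration submodules through wildcard imports plus one selective import; a sketch of what callers see, assuming the upstream cechmate class names (e.g. Rips) survive the re-export, which this row does not confirm:

# Assumed caller-side view of the aggregated package (names hypothetical
# beyond get_boundary, which the __init__ above exports explicitly).
from cechmate.filtrations import get_boundary
from cechmate.filtrations import Rips  # assumed to come in via `from .rips import *`

rips = Rips(maxdim=1)          # assumed upstream constructor
# simplices = rips.build(X)    # assumed upstream API; see the cechmate docs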
b44f3a8953abd63bd78bb74ae85a68bf9926000c | 296 | py | Python | cursopython/pythonteste/aula15c.py | AtilaCosta87/Python | b4eea7885d16df80feecc4c699a8348ca13a80c2 | [
"MIT"
] | null | null | null | cursopython/pythonteste/aula15c.py | AtilaCosta87/Python | b4eea7885d16df80feecc4c699a8348ca13a80c2 | [
"MIT"
] | null | null | null | cursopython/pythonteste/aula15c.py | AtilaCosta87/Python | b4eea7885d16df80feecc4c699a8348ca13a80c2 | [
"MIT"
] | null | null | null | nome = 'José'
idade = 33
salário = 987.3
print(f'O {nome} tem {idade} anos e ganha R${salário:.2f}')
print(f'O {nome:-^20} tem {idade} anos e ganha R${salário:.2f}')
print(f'O {nome:->20} tem {idade} anos e ganha R${salário:.2f}')
print(f'O {nome:-<20} tem {idade} anos e ganha R${salário:.2f}')
| 37 | 64 | 0.635135 | 58 | 296 | 3.241379 | 0.293103 | 0.12766 | 0.148936 | 0.234043 | 0.803191 | 0.803191 | 0.803191 | 0.803191 | 0.803191 | 0.803191 | 0 | 0.062745 | 0.138514 | 296 | 7 | 65 | 42.285714 | 0.67451 | 0 | 0 | 0 | 0 | 0 | 0.726351 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
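The four prints above exercise Python's format-spec mini-language: .2f prints two decimals, while -^20, ->20, and -<20 produce a width-20 field padded with '-' and centered, right-aligned, or left-aligned; an equivalent str.format sketch:

# Same output via str.format, spelling out the fill/align/width specs.
nome, idade, salário = 'José', 33, 987.3
print('O {:-^20} tem {} anos e ganha R${:.2f}'.format(nome, idade, salário))  # '-' fill, centered
print('O {:->20} tem {} anos e ganha R${:.2f}'.format(nome, idade, salário))  # right-aligned
print('O {:-<20} tem {} anos e ganha R${:.2f}'.format(nome, idade, salário))  # left-aligned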
b45d0d2794578a1fe84e0bf6576946cb84f71ae0 | 6,008 | py | Python | Scripts/simulation/traits/trait_day_night_tracking.py | velocist/TS4CheatsInfo | b59ea7e5f4bd01d3b3bd7603843d525a9c179867 | [
"Apache-2.0"
] | null | null | null | Scripts/simulation/traits/trait_day_night_tracking.py | velocist/TS4CheatsInfo | b59ea7e5f4bd01d3b3bd7603843d525a9c179867 | [
"Apache-2.0"
] | null | null | null | Scripts/simulation/traits/trait_day_night_tracking.py | velocist/TS4CheatsInfo | b59ea7e5f4bd01d3b3bd7603843d525a9c179867 | [
"Apache-2.0"
] | null | null | null | # uncompyle6 version 3.7.4
# Python bytecode 3.7 (3394)
# Decompiled from: Python 3.7.9 (tags/v3.7.9:13c94747c7, Aug 17 2020, 18:58:18) [MSC v.1900 64 bit (AMD64)]
# Embedded file name: T:\InGame\Gameplay\Scripts\Server\traits\trait_day_night_tracking.py
# Compiled at: 2016-10-06 01:09:15
# Size of source mod 2**32: 6466 bytes
from buffs.tunable import TunableBuffReference
from sims4.tuning.tunable import HasTunableSingletonFactory, AutoFactoryInit, TunableSet
class DayNightTracking(HasTunableSingletonFactory, AutoFactoryInit):
FACTORY_TUNABLES = {'sunlight_buffs':TunableSet(description="\n Allows a list of buffs to be added to the owning Sim when they're in\n the sunlight.\n \n These buffs are also guaranteed to be removed from the Sim when\n they're no longer in sunlight, regardless of where the buff was\n applied. For instance, if an interaction has a basic extra that also\n applied a buff in this list, but the Sim is given this trait and\n they're not in the sunlight. That buff will be removed.\n \n Do not rely on Sunlight Buffs and Shade Buffs to be perfectly\n mutually exclusive. It's possible, due to timing issues, that both\n buffs in Sunlight Buffs and buffs in Shade buffs can be on the sim\n at the same time, or neither on the sim, for a brief amount of time.\n If you need buff exclusivity, use the tuning on buffs.\n ",
tunable=TunableBuffReference(description="\n The buff to be added to the owning Sim when they're in the\n sunlight.\n ",
pack_safe=True)),
'shade_buffs':TunableSet(description="\n Allows a list of buffs to be added to the owning Sim when they're\n not in the sunlight.\n \n These buffs are also guaranteed to be removed from the Sim when\n they're no longer in the shade, regardless of where the buff was\n applied. For instance, if an interaction has a basic extra that also\n applied a buff in this list, but the Sim is given this trait and\n they're not in the shade. That buff will be removed.\n \n Do not rely on Sunlight Buffs and Shade Buffs to be perfectly\n mutually exclusive. It's possible, due to timing issues, that both\n Sunlight Buffs and Shade Buffs can be on the Sim at the same time,\n or neither on the Sim, for a brief amount of time. If you need buff\n exclusivity, use the tuning on buffs.\n ",
tunable=TunableBuffReference(description="\n The buff to be added to the owning Sim when they're not in the\n sunlight.\n ",
pack_safe=True)),
'day_buffs':TunableSet(description="\n Allows a list of buffs to be added to the owning Sim when it's\n currently day time in the region (based on Sunrise and Sunset time\n tuning for the Region).\n \n These buffs are also guaranteed to be removed from the Sim when it's\n no longer day time, regardless of where the buff was applied. For\n instance, if an interaction has a basic extra that also applied a\n buff in this list, but the Sim is given this trait and it's not day\n time. That buff will be removed.\n \n Do not rely on Day Buffs and Night Buffs to be perfectly\n mutually exclusive. It's possible, due to timing issues, that both\n Day Buffs and Night Buffs can be on the Sim at the same time,\n or neither on the Sim, for a brief amount of time. If you need buff\n exclusivity, use the tuning on buffs.\n ",
tunable=TunableBuffReference(description="\n The buff to be added to the owning Sim when it's day time.\n ",
pack_safe=True)),
'night_buffs':TunableSet(description="\n Allows a list of buffs to be added to the owning Sim when it's\n currently night time in the region (based on Sunrise and Sunset time\n tuning for the Region).\n \n These buffs are also guaranteed to be removed from the Sim when it's\n no longer night time, regardless of where the buff was applied. For\n instance, if an interaction has a basic extra that also applied a\n buff in this list, but the Sim is given this trait and it's not\n night time. That buff will be removed.\n \n Do not rely on Day Buffs and Night Buffs to be perfectly\n mutually exclusive. It's possible, due to timing issues, that both\n Day Buffs and Night Buffs can be on the Sim at the same time,\n or neither on the Sim, for a brief amount of time. If you need buff\n exclusivity, use the tuning on buffs.\n ",
tunable=TunableBuffReference(description="\n The buff to be added to the owning Sim when it's night time.\n ",
pack_safe=True)),
'force_refresh_buffs':TunableSet(description='\n This is the list of buffs, which upon removal, refreshes the status \n of day-night-sunlight buffs. This is needed because when the vampire \n resistance cocktail buff expires, we have no good way of adding the \n burnt-by-sun buff automatically. Any buff which should refresh the \n day-night-sunlight buff should be added to this list.\n ',
tunable=TunableBuffReference(description='\n The buff that upon removal will force a refresh on the \n ',
pack_safe=True))}
class DayNightTrackingState:
def __init__(self, is_day, in_sunlight):
self.is_day = is_day
self.in_sunlight = in_sunlight | 187.75 | 1,010 | 0.638482 | 937 | 6,008 | 4.067236 | 0.184632 | 0.016793 | 0.021254 | 0.023091 | 0.729467 | 0.71556 | 0.71556 | 0.69588 | 0.682236 | 0.682236 | 0 | 0.015816 | 0.305426 | 6,008 | 32 | 1,011 | 187.75 | 0.897436 | 0.052597 | 0 | 0.181818 | 0 | 0.318182 | 0.834682 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.090909 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
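DayNightTrackingState is a plain two-field record; a minimal sketch of how it might be carried around (the surrounding buff logic is game-engine code and only assumed here):

# Minimal sketch: the state just pairs the day/night and sunlight flags.
state = DayNightTrackingState(is_day=True, in_sunlight=False)  # e.g. a Sim indoors at noon
if state.is_day and not state.in_sunlight:
    pass  # the shade_buffs tuned above would be applied here (engine logic assumed)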
81f3d112ac555e63feadcc78bee3b03ed0434189 | 6,545 | py | Python | loldib/getratings/models/NA/na_syndra/na_syndra_mid.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_syndra/na_syndra_mid.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_syndra/na_syndra_mid.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Syndra_Mid_Aatrox(Ratings):
pass
class NA_Syndra_Mid_Ahri(Ratings):
pass
class NA_Syndra_Mid_Akali(Ratings):
pass
class NA_Syndra_Mid_Alistar(Ratings):
pass
class NA_Syndra_Mid_Amumu(Ratings):
pass
class NA_Syndra_Mid_Anivia(Ratings):
pass
class NA_Syndra_Mid_Annie(Ratings):
pass
class NA_Syndra_Mid_Ashe(Ratings):
pass
class NA_Syndra_Mid_AurelionSol(Ratings):
pass
class NA_Syndra_Mid_Azir(Ratings):
pass
class NA_Syndra_Mid_Bard(Ratings):
pass
class NA_Syndra_Mid_Blitzcrank(Ratings):
pass
class NA_Syndra_Mid_Brand(Ratings):
pass
class NA_Syndra_Mid_Braum(Ratings):
pass
class NA_Syndra_Mid_Caitlyn(Ratings):
pass
class NA_Syndra_Mid_Camille(Ratings):
pass
class NA_Syndra_Mid_Cassiopeia(Ratings):
pass
class NA_Syndra_Mid_Chogath(Ratings):
pass
class NA_Syndra_Mid_Corki(Ratings):
pass
class NA_Syndra_Mid_Darius(Ratings):
pass
class NA_Syndra_Mid_Diana(Ratings):
pass
class NA_Syndra_Mid_Draven(Ratings):
pass
class NA_Syndra_Mid_DrMundo(Ratings):
pass
class NA_Syndra_Mid_Ekko(Ratings):
pass
class NA_Syndra_Mid_Elise(Ratings):
pass
class NA_Syndra_Mid_Evelynn(Ratings):
pass
class NA_Syndra_Mid_Ezreal(Ratings):
pass
class NA_Syndra_Mid_Fiddlesticks(Ratings):
pass
class NA_Syndra_Mid_Fiora(Ratings):
pass
class NA_Syndra_Mid_Fizz(Ratings):
pass
class NA_Syndra_Mid_Galio(Ratings):
pass
class NA_Syndra_Mid_Gangplank(Ratings):
pass
class NA_Syndra_Mid_Garen(Ratings):
pass
class NA_Syndra_Mid_Gnar(Ratings):
pass
class NA_Syndra_Mid_Gragas(Ratings):
pass
class NA_Syndra_Mid_Graves(Ratings):
pass
class NA_Syndra_Mid_Hecarim(Ratings):
pass
class NA_Syndra_Mid_Heimerdinger(Ratings):
pass
class NA_Syndra_Mid_Illaoi(Ratings):
pass
class NA_Syndra_Mid_Irelia(Ratings):
pass
class NA_Syndra_Mid_Ivern(Ratings):
pass
class NA_Syndra_Mid_Janna(Ratings):
pass
class NA_Syndra_Mid_JarvanIV(Ratings):
pass
class NA_Syndra_Mid_Jax(Ratings):
pass
class NA_Syndra_Mid_Jayce(Ratings):
pass
class NA_Syndra_Mid_Jhin(Ratings):
pass
class NA_Syndra_Mid_Jinx(Ratings):
pass
class NA_Syndra_Mid_Kalista(Ratings):
pass
class NA_Syndra_Mid_Karma(Ratings):
pass
class NA_Syndra_Mid_Karthus(Ratings):
pass
class NA_Syndra_Mid_Kassadin(Ratings):
pass
class NA_Syndra_Mid_Katarina(Ratings):
pass
class NA_Syndra_Mid_Kayle(Ratings):
pass
class NA_Syndra_Mid_Kayn(Ratings):
pass
class NA_Syndra_Mid_Kennen(Ratings):
pass
class NA_Syndra_Mid_Khazix(Ratings):
pass
class NA_Syndra_Mid_Kindred(Ratings):
pass
class NA_Syndra_Mid_Kled(Ratings):
pass
class NA_Syndra_Mid_KogMaw(Ratings):
pass
class NA_Syndra_Mid_Leblanc(Ratings):
pass
class NA_Syndra_Mid_LeeSin(Ratings):
pass
class NA_Syndra_Mid_Leona(Ratings):
pass
class NA_Syndra_Mid_Lissandra(Ratings):
pass
class NA_Syndra_Mid_Lucian(Ratings):
pass
class NA_Syndra_Mid_Lulu(Ratings):
pass
class NA_Syndra_Mid_Lux(Ratings):
pass
class NA_Syndra_Mid_Malphite(Ratings):
pass
class NA_Syndra_Mid_Malzahar(Ratings):
pass
class NA_Syndra_Mid_Maokai(Ratings):
pass
class NA_Syndra_Mid_MasterYi(Ratings):
pass
class NA_Syndra_Mid_MissFortune(Ratings):
pass
class NA_Syndra_Mid_MonkeyKing(Ratings):
pass
class NA_Syndra_Mid_Mordekaiser(Ratings):
pass
class NA_Syndra_Mid_Morgana(Ratings):
pass
class NA_Syndra_Mid_Nami(Ratings):
pass
class NA_Syndra_Mid_Nasus(Ratings):
pass
class NA_Syndra_Mid_Nautilus(Ratings):
pass
class NA_Syndra_Mid_Nidalee(Ratings):
pass
class NA_Syndra_Mid_Nocturne(Ratings):
pass
class NA_Syndra_Mid_Nunu(Ratings):
pass
class NA_Syndra_Mid_Olaf(Ratings):
pass
class NA_Syndra_Mid_Orianna(Ratings):
pass
class NA_Syndra_Mid_Ornn(Ratings):
pass
class NA_Syndra_Mid_Pantheon(Ratings):
pass
class NA_Syndra_Mid_Poppy(Ratings):
pass
class NA_Syndra_Mid_Quinn(Ratings):
pass
class NA_Syndra_Mid_Rakan(Ratings):
pass
class NA_Syndra_Mid_Rammus(Ratings):
pass
class NA_Syndra_Mid_RekSai(Ratings):
pass
class NA_Syndra_Mid_Renekton(Ratings):
pass
class NA_Syndra_Mid_Rengar(Ratings):
pass
class NA_Syndra_Mid_Riven(Ratings):
pass
class NA_Syndra_Mid_Rumble(Ratings):
pass
class NA_Syndra_Mid_Ryze(Ratings):
pass
class NA_Syndra_Mid_Sejuani(Ratings):
pass
class NA_Syndra_Mid_Shaco(Ratings):
pass
class NA_Syndra_Mid_Shen(Ratings):
pass
class NA_Syndra_Mid_Shyvana(Ratings):
pass
class NA_Syndra_Mid_Singed(Ratings):
pass
class NA_Syndra_Mid_Sion(Ratings):
pass
class NA_Syndra_Mid_Sivir(Ratings):
pass
class NA_Syndra_Mid_Skarner(Ratings):
pass
class NA_Syndra_Mid_Sona(Ratings):
pass
class NA_Syndra_Mid_Soraka(Ratings):
pass
class NA_Syndra_Mid_Swain(Ratings):
pass
class NA_Syndra_Mid_Syndra(Ratings):
pass
class NA_Syndra_Mid_TahmKench(Ratings):
pass
class NA_Syndra_Mid_Taliyah(Ratings):
pass
class NA_Syndra_Mid_Talon(Ratings):
pass
class NA_Syndra_Mid_Taric(Ratings):
pass
class NA_Syndra_Mid_Teemo(Ratings):
pass
class NA_Syndra_Mid_Thresh(Ratings):
pass
class NA_Syndra_Mid_Tristana(Ratings):
pass
class NA_Syndra_Mid_Trundle(Ratings):
pass
class NA_Syndra_Mid_Tryndamere(Ratings):
pass
class NA_Syndra_Mid_TwistedFate(Ratings):
pass
class NA_Syndra_Mid_Twitch(Ratings):
pass
class NA_Syndra_Mid_Udyr(Ratings):
pass
class NA_Syndra_Mid_Urgot(Ratings):
pass
class NA_Syndra_Mid_Varus(Ratings):
pass
class NA_Syndra_Mid_Vayne(Ratings):
pass
class NA_Syndra_Mid_Veigar(Ratings):
pass
class NA_Syndra_Mid_Velkoz(Ratings):
pass
class NA_Syndra_Mid_Vi(Ratings):
pass
class NA_Syndra_Mid_Viktor(Ratings):
pass
class NA_Syndra_Mid_Vladimir(Ratings):
pass
class NA_Syndra_Mid_Volibear(Ratings):
pass
class NA_Syndra_Mid_Warwick(Ratings):
pass
class NA_Syndra_Mid_Xayah(Ratings):
pass
class NA_Syndra_Mid_Xerath(Ratings):
pass
class NA_Syndra_Mid_XinZhao(Ratings):
pass
class NA_Syndra_Mid_Yasuo(Ratings):
pass
class NA_Syndra_Mid_Yorick(Ratings):
pass
class NA_Syndra_Mid_Zac(Ratings):
pass
class NA_Syndra_Mid_Zed(Ratings):
pass
class NA_Syndra_Mid_Ziggs(Ratings):
pass
class NA_Syndra_Mid_Zilean(Ratings):
pass
class NA_Syndra_Mid_Zyra(Ratings):
pass
| 15.695444 | 46 | 0.766692 | 972 | 6,545 | 4.736626 | 0.151235 | 0.209818 | 0.389661 | 0.479583 | 0.803432 | 0.803432 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169748 | 6,545 | 416 | 47 | 15.733173 | 0.847258 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
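The row above declares well over a hundred identical empty subclasses of Ratings, one per opposing champion; an equivalent programmatic sketch using type() (the champion list is elided to three placeholders):

# Equivalent code generation for the empty per-matchup classes.
CHAMPIONS = ['Aatrox', 'Ahri', 'Zyra']  # placeholder; the file lists the full roster
for champ in CHAMPIONS:
    globals()[f'NA_Syndra_Mid_{champ}'] = type(f'NA_Syndra_Mid_{champ}', (Ratings,), {})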
c33fcf9f0d8cf0a41eea7dfa59491c00aefeb65a | 21,276 | py | Python | siri/__init__.py | grutts/siri | c388c209d1c803be166e767d5aa53a93a34c002d | [
"MIT"
] | null | null | null | siri/__init__.py | grutts/siri | c388c209d1c803be166e767d5aa53a93a34c002d | [
"MIT"
] | null | null | null | siri/__init__.py | grutts/siri | c388c209d1c803be166e767d5aa53a93a34c002d | [
"MIT"
] | null | null | null | """siri - A library for dealing with Service Interface for Real-time Information (SIRI) data"""
__version__ = '0.1.0'
__author__ = 'Adrian Gruetter <git@adriang.org>'
from .main import (
CapabilitiesRequest,
CapabilitiesRequestStructure,
CapabilitiesResponse,
CapabilitiesResponseStructure,
ServiceDelivery,
ServiceDeliveryBodyStructure,
ServiceDeliveryStructure,
ServiceRequest,
ServiceRequestStructure,
Siri,
SiriServiceDeliveryStructure,
SiriSubscriptionRequestStructure,
SubscriptionRequest,
SubscriptionRequestStructure,
parse,
serialize,
)
from siri.siri.siri_common_services_v2_0 import (
CheckStatusRequest,
CheckStatusResponse,
ContextualisedRequestStructure,
DataReadyAcknowledgement,
DataReadyNotification,
DataReceivedAcknowledgement,
DataSupplyRequest,
HeartbeatNotification,
SubscriptionResponse,
TerminateSubscriptionRequest,
TerminateSubscriptionResponse,
)
from siri.siri_connection_monitoring_service import (
AbstractDistributorItemStructure,
ConnectingJourneyFilterStructure,
ConnectingTimeFilterStructure,
ConnectionMonitoringCapabilitiesRequest,
ConnectionMonitoringCapabilitiesResponse,
ConnectionMonitoringCapabilitiesResponseStructure,
ConnectionMonitoringCapabilityRequestPolicyStructure,
ConnectionMonitoringDeliveriesStructure,
ConnectionMonitoringDetailEnumeration,
ConnectionMonitoringDistributorDelivery,
ConnectionMonitoringDistributorDeliveryStructure,
ConnectionMonitoringFeederDelivery,
ConnectionMonitoringFeederDeliveryStructure,
ConnectionMonitoringPermissions,
ConnectionMonitoringRequest,
ConnectionMonitoringRequestStructure,
ConnectionMonitoringServiceCapabilities,
ConnectionMonitoringServiceCapabilitiesStructure,
ConnectionMonitoringSubscriptionRequest,
ConnectionMonitoringSubscriptionRequestStructure,
DistributorDepartureCancellationStructure,
MonitoredFeederArrival,
MonitoredFeederArrivalCancellation,
MonitoredFeederArrivalCancellationStructure,
MonitoredFeederArrivalStructure,
StoppingPositionChangedDepartureStructure,
WaitProlongedDepartureStructure,
)
from siri.siri_connection_timetable_service import (
AbstractFeederItemStructure,
ConnectionTimetableCapabilitiesRequest,
ConnectionTimetableCapabilitiesResponse,
ConnectionTimetableCapabilitiesResponseStructure,
ConnectionTimetableCapabilityRequestPolicyStructure,
ConnectionTimetableDeliveriesStructure,
ConnectionTimetableDelivery,
ConnectionTimetableDeliveryStructure,
ConnectionTimetableRequest,
ConnectionTimetableRequestStructure,
ConnectionTimetableServiceCapabilities,
ConnectionTimetableServiceCapabilitiesStructure,
ConnectionTimetableSubscriptionRequest,
ConnectionTimetableSubscriptionStructure,
TimetabledFeederArrival,
TimetabledFeederArrivalCancellation,
TimetabledFeederArrivalCancellationStructure,
TimetabledFeederArrivalStructure,
)
from siri.siri_discovery import (
ConnectionLinksDelivery,
ConnectionLinksDeliveryStructure,
ConnectionLinksDetailEnumeration,
ConnectionLinksDiscoveryRequestStructure,
ConnectionLinksRequest,
FacilityDelivery,
FacilityDeliveryStructure,
FacilityDetailEnumeration,
FacilityRequest,
FacilityRequestStructure,
InfoChannelDelivery,
InfoChannelDeliveryStructure,
InfoChannelDiscoveryRequestStructure,
InfoChannelRequest,
LinesDelivery,
LinesDeliveryStructure,
LinesDetailEnumeration,
LinesDiscoveryRequestStructure,
LinesRequest,
ProductCategoriesDelivery,
ProductCategoriesDeliveryStructure,
ProductCategoriesDiscoveryRequestStructure,
ProductCategoriesRequest,
ServiceFeaturesDelivery,
ServiceFeaturesDeliveryStructure,
ServiceFeaturesDiscoveryRequestStructure,
ServiceFeaturesRequest,
StopPointsDelivery,
StopPointsDeliveryStructure,
StopPointsDetailEnumeration,
StopPointsDiscoveryRequestStructure,
StopPointsRequest,
VehicleFeaturesDelivery,
VehicleFeaturesDeliveryStructure,
VehicleFeaturesRequest,
VehicleFeaturesRequestStructure,
)
from siri.siri_estimated_timetable_service import (
EstimatedTimetableCapabilitiesRequest,
EstimatedTimetableCapabilitiesResponse,
EstimatedTimetableCapabilitiesResponseStructure,
EstimatedTimetableCapabilityRequestPolicyStructure,
EstimatedTimetableDeliveriesStructure,
EstimatedTimetableDelivery,
EstimatedTimetableDeliveryStructure,
EstimatedTimetableDetailEnumeration,
EstimatedTimetablePermissions,
EstimatedTimetableRequest,
EstimatedTimetableRequestStructure,
EstimatedTimetableServiceCapabilities,
EstimatedTimetableServiceCapabilitiesStructure,
EstimatedTimetableSubscriptionRequest,
EstimatedTimetableSubscriptionStructure,
EstimatedVersionFrameStructure,
)
from siri.siri_facility_monitoring_service import (
AccessibilityNeedsFilterStructure,
FacilityCondition,
FacilityMonitoringCapabilitiesRequest,
FacilityMonitoringCapabilitiesResponse,
FacilityMonitoringCapabilitiesResponseStructure,
FacilityMonitoringDeliveriesStructure,
FacilityMonitoringDelivery,
FacilityMonitoringDeliveryStructure,
FacilityMonitoringPermissions,
FacilityMonitoringRequest,
FacilityMonitoringRequestStructure,
FacilityMonitoringServiceCapabilities,
FacilityMonitoringServiceCapabilitiesStructure,
FacilityMonitoringServicePermissionStructure,
FacilityMonitoringSubscriptionRequest,
FacilityMonitoringSubscriptionStructure,
)
from siri.siri_general_message_service import (
GeneralMessage,
GeneralMessageCancellation,
GeneralMessageCapabilitiesRequest,
GeneralMessageCapabilitiesResponse,
GeneralMessageCapabilitiesResponseStructure,
GeneralMessageCapabilityAccessControlStructure,
GeneralMessageDeliveriesStructure,
GeneralMessageDelivery,
GeneralMessageDeliveryStructure,
GeneralMessagePermissions,
GeneralMessageRequest,
GeneralMessageRequestStructure,
GeneralMessageServiceCapabilities,
GeneralMessageServiceCapabilitiesStructure,
GeneralMessageServicePermissionStructure,
GeneralMessageSubscriptionRequest,
GeneralMessageSubscriptionStructure,
InfoChannelPermissionStructure,
InfoMessageCancellationStructure,
InfoMessageStructure,
)
from siri.siri_production_timetable_service import (
DatedTimetableVersionFrame,
DatedTimetableVersionFrameStructure,
ProductionTimetableCapabilitiesRequest,
ProductionTimetableCapabilitiesResponse,
ProductionTimetableCapabilitiesResponseStructure,
ProductionTimetableCapabilityRequestPolicyStructure,
ProductionTimetableDeliveriesStructure,
ProductionTimetableDelivery,
ProductionTimetableDeliveryStructure,
ProductionTimetablePermissions,
ProductionTimetableRequest,
ProductionTimetableRequestStructure,
ProductionTimetableServiceCapabilities,
ProductionTimetableServiceCapabilitiesStructure,
ProductionTimetableSubscriptionRequest,
ProductionTimetableSubscriptionStructure,
)
from siri.siri_situation_exchange_service import (
ContextStructure,
NetworkContextStructure,
RoadFilterStructure,
SituationExchangeCapabilitiesRequest,
SituationExchangeCapabilitiesResponse,
SituationExchangeCapabilitiesResponseStructure,
SituationExchangeDeliveriesStructure,
SituationExchangeDelivery,
SituationExchangeDeliveryStructure,
SituationExchangePermissions,
SituationExchangeRequest,
SituationExchangeRequestStructure,
SituationExchangeServiceCapabilities,
SituationExchangeServiceCapabilitiesStructure,
SituationExchangeServicePermissionStructure,
SituationExchangeSubscriptionRequest,
SituationExchangeSubscriptionStructure,
)
from siri.siri_stop_monitoring_service import (
DeliveryVariantStructure,
MonitoredStopVisit,
MonitoredStopVisitCancellation,
MonitoredStopVisitCancellationStructure,
MonitoredStopVisitStructure,
ServiceException,
ServiceExceptionEnumeration,
ServiceExceptionStructure,
StopLineNotice,
StopLineNoticeCancellation,
StopLineNoticeCancellationStructure,
StopLineNoticeStructure,
StopMonitoringCapabilitiesRequest,
StopMonitoringCapabilitiesResponse,
StopMonitoringCapabilitiesResponseStructure,
StopMonitoringCapabilityRequestPolicyStructure,
StopMonitoringDeliveriesStructure,
StopMonitoringDelivery,
StopMonitoringDeliveryStructure,
StopMonitoringDetailEnumeration,
StopMonitoringFilterStructure,
StopMonitoringMultipleRequest,
StopMonitoringMultipleRequestStructure,
StopMonitoringPermissions,
StopMonitoringRequest,
StopMonitoringRequestStructure,
StopMonitoringServiceCapabilities,
StopMonitoringServiceCapabilitiesStructure,
StopMonitoringServicePermissionStructure,
StopMonitoringSubscriptionRequest,
StopMonitoringSubscriptionStructure,
StopNotice,
StopNoticeCancellation,
StopNoticeCancellationStructure,
StopNoticeStructure,
StopVisitTypeEnumeration,
)
from siri.siri_stop_timetable_service import (
StopTimetableCapabilitiesRequest,
StopTimetableCapabilitiesResponse,
StopTimetableCapabilitiesResponseStructure,
StopTimetableCapabilityRequestPolicyStructure,
StopTimetableDeliveriesStructure,
StopTimetableDelivery,
StopTimetableDeliveryStructure,
StopTimetablePermissions,
StopTimetableRequest,
StopTimetableRequestStructure,
StopTimetableServiceCapabilities,
StopTimetableServiceCapabilitiesStructure,
StopTimetableServicePermissionStructure,
StopTimetableSubscriptionRequest,
StopTimetableSubscriptionStructure,
TimetabledStopVisitCancellationStructure,
TimetabledStopVisitStructure,
)
from siri.siri_vehicle_monitoring_service import (
VehicleActivityCancellationStructure,
VehicleActivityStructure,
VehicleMonitorPermissionStructure,
VehicleMonitoringCapabilitiesRequest,
VehicleMonitoringCapabilitiesResponse,
VehicleMonitoringCapabilitiesResponseStructure,
VehicleMonitoringCapabilityRequestPolicyStructure,
VehicleMonitoringDeliveriesStructure,
VehicleMonitoringDelivery,
VehicleMonitoringDeliveryStructure,
VehicleMonitoringDetailEnumeration,
VehicleMonitoringPermissions,
VehicleMonitoringRequest,
VehicleMonitoringRequestStructure,
VehicleMonitoringServiceCapabilities,
VehicleMonitoringServiceCapabilitiesStructure,
VehicleMonitoringServicePermissionStructure,
VehicleMonitoringSubscriptionRequest,
VehicleMonitoringSubscriptionStructure,
)
__all__ = [
"CapabilitiesRequest",
"CapabilitiesRequestStructure",
"CapabilitiesResponse",
"CapabilitiesResponseStructure",
"ServiceDelivery",
"ServiceDeliveryBodyStructure",
"ServiceDeliveryStructure",
"ServiceRequest",
"ServiceRequestStructure",
"Siri",
"SiriServiceDeliveryStructure",
"SiriSubscriptionRequestStructure",
"SubscriptionRequest",
"SubscriptionRequestStructure",
"AbstractDistributorItemStructure",
"ConnectingJourneyFilterStructure",
"ConnectingTimeFilterStructure",
"ConnectionMonitoringCapabilitiesRequest",
"ConnectionMonitoringCapabilitiesResponse",
"ConnectionMonitoringCapabilitiesResponseStructure",
"ConnectionMonitoringCapabilityRequestPolicyStructure",
"ConnectionMonitoringDeliveriesStructure",
"ConnectionMonitoringDetailEnumeration",
"ConnectionMonitoringDistributorDelivery",
"ConnectionMonitoringDistributorDeliveryStructure",
"ConnectionMonitoringFeederDelivery",
"ConnectionMonitoringFeederDeliveryStructure",
"ConnectionMonitoringPermissions",
"ConnectionMonitoringRequest",
"ConnectionMonitoringRequestStructure",
"ConnectionMonitoringServiceCapabilities",
"ConnectionMonitoringServiceCapabilitiesStructure",
"ConnectionMonitoringSubscriptionRequest",
"ConnectionMonitoringSubscriptionRequestStructure",
"DistributorDepartureCancellationStructure",
"MonitoredFeederArrival",
"MonitoredFeederArrivalCancellation",
"MonitoredFeederArrivalCancellationStructure",
"MonitoredFeederArrivalStructure",
"StoppingPositionChangedDepartureStructure",
"WaitProlongedDepartureStructure",
"AbstractFeederItemStructure",
"ConnectionTimetableCapabilitiesRequest",
"ConnectionTimetableCapabilitiesResponse",
"ConnectionTimetableCapabilitiesResponseStructure",
"ConnectionTimetableCapabilityRequestPolicyStructure",
"ConnectionTimetableDeliveriesStructure",
"ConnectionTimetableDelivery",
"ConnectionTimetableDeliveryStructure",
"ConnectionTimetableRequest",
"ConnectionTimetableRequestStructure",
"ConnectionTimetableServiceCapabilities",
"ConnectionTimetableServiceCapabilitiesStructure",
"ConnectionTimetableSubscriptionRequest",
"ConnectionTimetableSubscriptionStructure",
"TimetabledFeederArrival",
"TimetabledFeederArrivalCancellation",
"TimetabledFeederArrivalCancellationStructure",
"TimetabledFeederArrivalStructure",
"ConnectionLinksDelivery",
"ConnectionLinksDeliveryStructure",
"ConnectionLinksDetailEnumeration",
"ConnectionLinksDiscoveryRequestStructure",
"ConnectionLinksRequest",
"FacilityDelivery",
"FacilityDeliveryStructure",
"FacilityDetailEnumeration",
"FacilityRequest",
"FacilityRequestStructure",
"InfoChannelDelivery",
"InfoChannelDeliveryStructure",
"InfoChannelDiscoveryRequestStructure",
"InfoChannelRequest",
"LinesDelivery",
"LinesDeliveryStructure",
"LinesDetailEnumeration",
"LinesDiscoveryRequestStructure",
"LinesRequest",
"ProductCategoriesDelivery",
"ProductCategoriesDeliveryStructure",
"ProductCategoriesDiscoveryRequestStructure",
"ProductCategoriesRequest",
"ServiceFeaturesDelivery",
"ServiceFeaturesDeliveryStructure",
"ServiceFeaturesDiscoveryRequestStructure",
"ServiceFeaturesRequest",
"StopPointsDelivery",
"StopPointsDeliveryStructure",
"StopPointsDetailEnumeration",
"StopPointsDiscoveryRequestStructure",
"StopPointsRequest",
"VehicleFeaturesDelivery",
"VehicleFeaturesDeliveryStructure",
"VehicleFeaturesRequest",
"VehicleFeaturesRequestStructure",
"EstimatedTimetableCapabilitiesRequest",
"EstimatedTimetableCapabilitiesResponse",
"EstimatedTimetableCapabilitiesResponseStructure",
"EstimatedTimetableCapabilityRequestPolicyStructure",
"EstimatedTimetableDeliveriesStructure",
"EstimatedTimetableDelivery",
"EstimatedTimetableDeliveryStructure",
"EstimatedTimetableDetailEnumeration",
"EstimatedTimetablePermissions",
"EstimatedTimetableRequest",
"EstimatedTimetableRequestStructure",
"EstimatedTimetableServiceCapabilities",
"EstimatedTimetableServiceCapabilitiesStructure",
"EstimatedTimetableSubscriptionRequest",
"EstimatedTimetableSubscriptionStructure",
"EstimatedVersionFrameStructure",
"AccessibilityNeedsFilterStructure",
"FacilityCondition",
"FacilityMonitoringCapabilitiesRequest",
"FacilityMonitoringCapabilitiesResponse",
"FacilityMonitoringCapabilitiesResponseStructure",
"FacilityMonitoringDeliveriesStructure",
"FacilityMonitoringDelivery",
"FacilityMonitoringDeliveryStructure",
"FacilityMonitoringPermissions",
"FacilityMonitoringRequest",
"FacilityMonitoringRequestStructure",
"FacilityMonitoringServiceCapabilities",
"FacilityMonitoringServiceCapabilitiesStructure",
"FacilityMonitoringServicePermissionStructure",
"FacilityMonitoringSubscriptionRequest",
"FacilityMonitoringSubscriptionStructure",
"GeneralMessage",
"GeneralMessageCancellation",
"GeneralMessageCapabilitiesRequest",
"GeneralMessageCapabilitiesResponse",
"GeneralMessageCapabilitiesResponseStructure",
"GeneralMessageCapabilityAccessControlStructure",
"GeneralMessageDeliveriesStructure",
"GeneralMessageDelivery",
"GeneralMessageDeliveryStructure",
"GeneralMessagePermissions",
"GeneralMessageRequest",
"GeneralMessageRequestStructure",
"GeneralMessageServiceCapabilities",
"GeneralMessageServiceCapabilitiesStructure",
"GeneralMessageServicePermissionStructure",
"GeneralMessageSubscriptionRequest",
"GeneralMessageSubscriptionStructure",
"InfoChannelPermissionStructure",
"InfoMessageCancellationStructure",
"InfoMessageStructure",
"DatedTimetableVersionFrame",
"DatedTimetableVersionFrameStructure",
"ProductionTimetableCapabilitiesRequest",
"ProductionTimetableCapabilitiesResponse",
"ProductionTimetableCapabilitiesResponseStructure",
"ProductionTimetableCapabilityRequestPolicyStructure",
"ProductionTimetableDeliveriesStructure",
"ProductionTimetableDelivery",
"ProductionTimetableDeliveryStructure",
"ProductionTimetablePermissions",
"ProductionTimetableRequest",
"ProductionTimetableRequestStructure",
"ProductionTimetableServiceCapabilities",
"ProductionTimetableServiceCapabilitiesStructure",
"ProductionTimetableSubscriptionRequest",
"ProductionTimetableSubscriptionStructure",
"ContextStructure",
"NetworkContextStructure",
"RoadFilterStructure",
"SituationExchangeCapabilitiesRequest",
"SituationExchangeCapabilitiesResponse",
"SituationExchangeCapabilitiesResponseStructure",
"SituationExchangeDeliveriesStructure",
"SituationExchangeDelivery",
"SituationExchangeDeliveryStructure",
"SituationExchangePermissions",
"SituationExchangeRequest",
"SituationExchangeRequestStructure",
"SituationExchangeServiceCapabilities",
"SituationExchangeServiceCapabilitiesStructure",
"SituationExchangeServicePermissionStructure",
"SituationExchangeSubscriptionRequest",
"SituationExchangeSubscriptionStructure",
"DeliveryVariantStructure",
"MonitoredStopVisit",
"MonitoredStopVisitCancellation",
"MonitoredStopVisitCancellationStructure",
"MonitoredStopVisitStructure",
"ServiceException",
"ServiceExceptionEnumeration",
"ServiceExceptionStructure",
"StopLineNotice",
"StopLineNoticeCancellation",
"StopLineNoticeCancellationStructure",
"StopLineNoticeStructure",
"StopMonitoringCapabilitiesRequest",
"StopMonitoringCapabilitiesResponse",
"StopMonitoringCapabilitiesResponseStructure",
"StopMonitoringCapabilityRequestPolicyStructure",
"StopMonitoringDeliveriesStructure",
"StopMonitoringDelivery",
"StopMonitoringDeliveryStructure",
"StopMonitoringDetailEnumeration",
"StopMonitoringFilterStructure",
"StopMonitoringMultipleRequest",
"StopMonitoringMultipleRequestStructure",
"StopMonitoringPermissions",
"StopMonitoringRequest",
"StopMonitoringRequestStructure",
"StopMonitoringServiceCapabilities",
"StopMonitoringServiceCapabilitiesStructure",
"StopMonitoringServicePermissionStructure",
"StopMonitoringSubscriptionRequest",
"StopMonitoringSubscriptionStructure",
"StopNotice",
"StopNoticeCancellation",
"StopNoticeCancellationStructure",
"StopNoticeStructure",
"StopVisitTypeEnumeration",
"StopTimetableCapabilitiesRequest",
"StopTimetableCapabilitiesResponse",
"StopTimetableCapabilitiesResponseStructure",
"StopTimetableCapabilityRequestPolicyStructure",
"StopTimetableDeliveriesStructure",
"StopTimetableDelivery",
"StopTimetableDeliveryStructure",
"StopTimetablePermissions",
"StopTimetableRequest",
"StopTimetableRequestStructure",
"StopTimetableServiceCapabilities",
"StopTimetableServiceCapabilitiesStructure",
"StopTimetableServicePermissionStructure",
"StopTimetableSubscriptionRequest",
"StopTimetableSubscriptionStructure",
"TimetabledStopVisitCancellationStructure",
"TimetabledStopVisitStructure",
"VehicleActivityCancellationStructure",
"VehicleActivityStructure",
"VehicleMonitorPermissionStructure",
"VehicleMonitoringCapabilitiesRequest",
"VehicleMonitoringCapabilitiesResponse",
"VehicleMonitoringCapabilitiesResponseStructure",
"VehicleMonitoringCapabilityRequestPolicyStructure",
"VehicleMonitoringDeliveriesStructure",
"VehicleMonitoringDelivery",
"VehicleMonitoringDeliveryStructure",
"VehicleMonitoringDetailEnumeration",
"VehicleMonitoringPermissions",
"VehicleMonitoringRequest",
"VehicleMonitoringRequestStructure",
"VehicleMonitoringServiceCapabilities",
"VehicleMonitoringServiceCapabilitiesStructure",
"VehicleMonitoringServicePermissionStructure",
"VehicleMonitoringSubscriptionRequest",
"VehicleMonitoringSubscriptionStructure",
"CheckStatusRequest",
"CheckStatusResponse",
"ContextualisedRequestStructure",
"DataReadyAcknowledgement",
"DataReadyNotification",
"DataReceivedAcknowledgement",
"DataSupplyRequest",
"HeartbeatNotification",
"SubscriptionResponse",
"TerminateSubscriptionRequest",
"TerminateSubscriptionResponse",
"parse",
"serialize",
]
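# A hedged consistency check one might run during development (it assumes that
# parse, serialize, and every class named above are imported earlier in this
# module): each entry in __all__ should resolve to a module-level attribute.
for _exported in __all__:
    assert _exported in globals(), f"__all__ lists {_exported!r} but it is missing"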
| 37.590106 | 95 | 0.818105 | 642 | 21,276 | 27.038941 | 0.475078 | 0.005991 | 0.008295 | 0.011061 | 0.962843 | 0.962843 | 0.962843 | 0.962843 | 0.962843 | 0.962843 | 0 | 0.00027 | 0.129019 | 21,276 | 565 | 96 | 37.656637 | 0.936485 | 0.004183 | 0 | 0 | 0 | 0 | 0.397007 | 0.372079 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.023214 | 0 | 0.023214 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c3440cec930073ba6a6d50edba4f5eacae45c7c2 | 7,323 | py | Python | usaspending_api/search/tests/integration/hierarchical_filters/test_tas_filter_heirarchical_cases.py | bminahankcfrb/usaspending-api | 1fb5c4c261edf91ab2930aea7928ca24dfa49123 | [
"CC0-1.0"
] | null | null | null | usaspending_api/search/tests/integration/hierarchical_filters/test_tas_filter_heirarchical_cases.py | bminahankcfrb/usaspending-api | 1fb5c4c261edf91ab2930aea7928ca24dfa49123 | [
"CC0-1.0"
] | null | null | null | usaspending_api/search/tests/integration/hierarchical_filters/test_tas_filter_heirarchical_cases.py | bminahankcfrb/usaspending-api | 1fb5c4c261edf91ab2930aea7928ca24dfa49123 | [
"CC0-1.0"
] | null | null | null | import pytest
from usaspending_api.search.tests.integration.hierarchical_filters.tas_fixtures import (
BASIC_TAS,
ATA_TAS,
SISTER_TAS,
TAS_DICTIONARIES,
TAS_STRINGS,
)
from usaspending_api.search.tests.integration.hierarchical_filters.es_search_test_helpers import (
_setup_es,
query_by_tas,
)
@pytest.mark.django_db
def test_agency_level_require_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_agency_path(BASIC_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_fa_level_require_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_fa_path(BASIC_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_tas_level_require_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_tas_path(BASIC_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_agency_level_exclude_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_agency_path(ATA_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_fa_level_exclude_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_fa_path(ATA_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_tas_level_exclude_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_tas_path(ATA_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_agency_level_require_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_agency_path(ATA_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_fa_level_require_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_fa_path(ATA_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_tas_level_require_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_tas_path(ATA_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_agency_level_exclude_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_agency_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_fa_level_exclude_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_fa_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_tas_level_exclude_non_match(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_tas_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_double_require(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_fa_path(BASIC_TAS), _tas_path(BASIC_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_double_exclude(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"exclude": [_fa_path(BASIC_TAS), _tas_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_exclude_overrides_require(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_tas_path(BASIC_TAS)], "exclude": [_tas_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_exclude_eclipsing_require(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_agency_path(BASIC_TAS)], "exclude": [_fa_path(BASIC_TAS)]})
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_require_eclipsing_exclude(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_fa_path(BASIC_TAS)], "exclude": [_agency_path(BASIC_TAS)]})
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_double_eclipsing_filters(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(
client, {"require": [_agency_path(BASIC_TAS), _tas_path(BASIC_TAS)], "exclude": [_fa_path(BASIC_TAS)]}
)
assert resp.json()["results"] == [_award1()]
@pytest.mark.django_db
def test_double_eclipsing_filters2(client, monkeypatch, elasticsearch_award_index, award_with_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(
client, {"require": [_fa_path(BASIC_TAS)], "exclude": [_agency_path(BASIC_TAS), _tas_path(BASIC_TAS)]}
)
assert resp.json()["results"] == []
@pytest.mark.django_db
def test_sibling_filters(client, monkeypatch, elasticsearch_award_index, multiple_awards_with_sibling_tas):
_setup_es(client, monkeypatch, elasticsearch_award_index)
resp = query_by_tas(client, {"require": [_tas_path(SISTER_TAS[1])]})
assert resp.json()["results"] == [_award2()]
def _award1():
return {"internal_id": 1, "Award ID": "abcdefg", "generated_internal_id": "AWARD_1"}
def _award2():
return {"internal_id": 2, "Award ID": "abcdefg", "generated_internal_id": "AWARD_2"}
def _agency_path(index):
return [_agency(index)]
def _fa_path(index):
return [_agency(index), _fa(index)]
def _tas_path(index):
return [_agency(index), _fa(index), _tas(index)]
def _agency(index):
return TAS_DICTIONARIES[index]["aid"]
def _fa(index):
return f"{TAS_DICTIONARIES[index]['aid']}-{TAS_DICTIONARIES[index]['main']}"
def _tas(index):
return TAS_STRINGS[index]
def _sort_by_id(dictionary):
    # Sort key helper: order result rows by their internal award id.
    return dictionary["internal_id"]
| 34.219626 | 110 | 0.753926 | 961 | 7,323 | 5.283039 | 0.068678 | 0.133937 | 0.23636 | 0.275753 | 0.903092 | 0.897971 | 0.887335 | 0.858184 | 0.834351 | 0.827063 | 0 | 0.002783 | 0.116619 | 7,323 | 213 | 111 | 34.380282 | 0.782037 | 0 | 0 | 0.455224 | 0 | 0 | 0.068688 | 0.014748 | 0 | 0 | 0 | 0 | 0.149254 | 1 | 0.216418 | false | 0 | 0.022388 | 0.059701 | 0.298507 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
5edb0bb361f222a628d91e59d6ec7401fa396cf1 | 43,727 | py | Python | custos-client-sdks/custos-python-sdk/build/lib/custos/server/core/SharingService_pb2_grpc.py | apache/airavata-custos | 075dd26c364b5b5abe8a4f2b226b2de30474f8e4 | [
"Apache-2.0"
] | 10 | 2019-05-21T22:42:35.000Z | 2022-03-25T15:58:09.000Z | custos-client-sdks/custos-python-sdk/build/lib/custos/server/core/SharingService_pb2_grpc.py | apache/airavata-custos | 075dd26c364b5b5abe8a4f2b226b2de30474f8e4 | [
"Apache-2.0"
] | 83 | 2019-02-22T12:22:14.000Z | 2022-03-30T13:42:47.000Z | custos-client-sdks/custos-python-sdk/build/lib/custos/server/core/SharingService_pb2_grpc.py | apache/airavata-custos | 075dd26c364b5b5abe8a4f2b226b2de30474f8e4 | [
"Apache-2.0"
] | 20 | 2019-02-22T08:10:05.000Z | 2021-11-07T19:37:04.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
import custos.server.core.SharingService_pb2 as SharingService__pb2
class SharingServiceStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.createEntityType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/createEntityType',
request_serializer=SharingService__pb2.EntityTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.updateEntityType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/updateEntityType',
request_serializer=SharingService__pb2.EntityTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.deleteEntityType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/deleteEntityType',
request_serializer=SharingService__pb2.EntityTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.getEntityType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getEntityType',
request_serializer=SharingService__pb2.EntityTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.EntityType.FromString,
)
self.getEntityTypes = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getEntityTypes',
request_serializer=SharingService__pb2.SearchRequest.SerializeToString,
response_deserializer=SharingService__pb2.EntityTypes.FromString,
)
self.createPermissionType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/createPermissionType',
request_serializer=SharingService__pb2.PermissionTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.updatePermissionType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/updatePermissionType',
request_serializer=SharingService__pb2.PermissionTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.deletePermissionType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/deletePermissionType',
request_serializer=SharingService__pb2.PermissionTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.getPermissionType = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getPermissionType',
request_serializer=SharingService__pb2.PermissionTypeRequest.SerializeToString,
response_deserializer=SharingService__pb2.PermissionType.FromString,
)
self.getPermissionTypes = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getPermissionTypes',
request_serializer=SharingService__pb2.SearchRequest.SerializeToString,
response_deserializer=SharingService__pb2.PermissionTypes.FromString,
)
self.createEntity = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/createEntity',
request_serializer=SharingService__pb2.EntityRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.updateEntity = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/updateEntity',
request_serializer=SharingService__pb2.EntityRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.isEntityExists = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/isEntityExists',
request_serializer=SharingService__pb2.EntityRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.getEntity = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getEntity',
request_serializer=SharingService__pb2.EntityRequest.SerializeToString,
response_deserializer=SharingService__pb2.Entity.FromString,
)
self.deleteEntity = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/deleteEntity',
request_serializer=SharingService__pb2.EntityRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.searchEntities = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/searchEntities',
request_serializer=SharingService__pb2.SearchRequest.SerializeToString,
response_deserializer=SharingService__pb2.Entities.FromString,
)
self.getListOfSharedUsers = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getListOfSharedUsers',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.SharedOwners.FromString,
)
self.getListOfDirectlySharedUsers = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getListOfDirectlySharedUsers',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.SharedOwners.FromString,
)
self.getListOfSharedGroups = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getListOfSharedGroups',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.SharedOwners.FromString,
)
self.getListOfDirectlySharedGroups = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getListOfDirectlySharedGroups',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.SharedOwners.FromString,
)
self.getAllDirectSharings = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/getAllDirectSharings',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.GetAllDirectSharingsResponse.FromString,
)
self.shareEntityWithUsers = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/shareEntityWithUsers',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.shareEntityWithGroups = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/shareEntityWithGroups',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.revokeEntitySharingFromUsers = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/revokeEntitySharingFromUsers',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.revokeEntitySharingFromGroups = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/revokeEntitySharingFromGroups',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
self.userHasAccess = channel.unary_unary(
'/org.apache.custos.sharing.service.SharingService/userHasAccess',
request_serializer=SharingService__pb2.SharingRequest.SerializeToString,
response_deserializer=SharingService__pb2.Status.FromString,
)
class SharingServiceServicer(object):
"""Missing associated documentation comment in .proto file."""
def createEntityType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def updateEntityType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def deleteEntityType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getEntityType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getEntityTypes(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def createPermissionType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def updatePermissionType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def deletePermissionType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getPermissionType(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getPermissionTypes(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def createEntity(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def updateEntity(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def isEntityExists(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getEntity(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def deleteEntity(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def searchEntities(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getListOfSharedUsers(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getListOfDirectlySharedUsers(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getListOfSharedGroups(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getListOfDirectlySharedGroups(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getAllDirectSharings(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def shareEntityWithUsers(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def shareEntityWithGroups(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def revokeEntitySharingFromUsers(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def revokeEntitySharingFromGroups(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def userHasAccess(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_SharingServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'createEntityType': grpc.unary_unary_rpc_method_handler(
servicer.createEntityType,
request_deserializer=SharingService__pb2.EntityTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'updateEntityType': grpc.unary_unary_rpc_method_handler(
servicer.updateEntityType,
request_deserializer=SharingService__pb2.EntityTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'deleteEntityType': grpc.unary_unary_rpc_method_handler(
servicer.deleteEntityType,
request_deserializer=SharingService__pb2.EntityTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'getEntityType': grpc.unary_unary_rpc_method_handler(
servicer.getEntityType,
request_deserializer=SharingService__pb2.EntityTypeRequest.FromString,
response_serializer=SharingService__pb2.EntityType.SerializeToString,
),
'getEntityTypes': grpc.unary_unary_rpc_method_handler(
servicer.getEntityTypes,
request_deserializer=SharingService__pb2.SearchRequest.FromString,
response_serializer=SharingService__pb2.EntityTypes.SerializeToString,
),
'createPermissionType': grpc.unary_unary_rpc_method_handler(
servicer.createPermissionType,
request_deserializer=SharingService__pb2.PermissionTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'updatePermissionType': grpc.unary_unary_rpc_method_handler(
servicer.updatePermissionType,
request_deserializer=SharingService__pb2.PermissionTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'deletePermissionType': grpc.unary_unary_rpc_method_handler(
servicer.deletePermissionType,
request_deserializer=SharingService__pb2.PermissionTypeRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'getPermissionType': grpc.unary_unary_rpc_method_handler(
servicer.getPermissionType,
request_deserializer=SharingService__pb2.PermissionTypeRequest.FromString,
response_serializer=SharingService__pb2.PermissionType.SerializeToString,
),
'getPermissionTypes': grpc.unary_unary_rpc_method_handler(
servicer.getPermissionTypes,
request_deserializer=SharingService__pb2.SearchRequest.FromString,
response_serializer=SharingService__pb2.PermissionTypes.SerializeToString,
),
'createEntity': grpc.unary_unary_rpc_method_handler(
servicer.createEntity,
request_deserializer=SharingService__pb2.EntityRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'updateEntity': grpc.unary_unary_rpc_method_handler(
servicer.updateEntity,
request_deserializer=SharingService__pb2.EntityRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'isEntityExists': grpc.unary_unary_rpc_method_handler(
servicer.isEntityExists,
request_deserializer=SharingService__pb2.EntityRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'getEntity': grpc.unary_unary_rpc_method_handler(
servicer.getEntity,
request_deserializer=SharingService__pb2.EntityRequest.FromString,
response_serializer=SharingService__pb2.Entity.SerializeToString,
),
'deleteEntity': grpc.unary_unary_rpc_method_handler(
servicer.deleteEntity,
request_deserializer=SharingService__pb2.EntityRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'searchEntities': grpc.unary_unary_rpc_method_handler(
servicer.searchEntities,
request_deserializer=SharingService__pb2.SearchRequest.FromString,
response_serializer=SharingService__pb2.Entities.SerializeToString,
),
'getListOfSharedUsers': grpc.unary_unary_rpc_method_handler(
servicer.getListOfSharedUsers,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.SharedOwners.SerializeToString,
),
'getListOfDirectlySharedUsers': grpc.unary_unary_rpc_method_handler(
servicer.getListOfDirectlySharedUsers,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.SharedOwners.SerializeToString,
),
'getListOfSharedGroups': grpc.unary_unary_rpc_method_handler(
servicer.getListOfSharedGroups,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.SharedOwners.SerializeToString,
),
'getListOfDirectlySharedGroups': grpc.unary_unary_rpc_method_handler(
servicer.getListOfDirectlySharedGroups,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.SharedOwners.SerializeToString,
),
'getAllDirectSharings': grpc.unary_unary_rpc_method_handler(
servicer.getAllDirectSharings,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.GetAllDirectSharingsResponse.SerializeToString,
),
'shareEntityWithUsers': grpc.unary_unary_rpc_method_handler(
servicer.shareEntityWithUsers,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'shareEntityWithGroups': grpc.unary_unary_rpc_method_handler(
servicer.shareEntityWithGroups,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'revokeEntitySharingFromUsers': grpc.unary_unary_rpc_method_handler(
servicer.revokeEntitySharingFromUsers,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'revokeEntitySharingFromGroups': grpc.unary_unary_rpc_method_handler(
servicer.revokeEntitySharingFromGroups,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
'userHasAccess': grpc.unary_unary_rpc_method_handler(
servicer.userHasAccess,
request_deserializer=SharingService__pb2.SharingRequest.FromString,
response_serializer=SharingService__pb2.Status.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'org.apache.custos.sharing.service.SharingService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
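# A minimal server-wiring sketch, assuming port 50051 (a placeholder, not part
# of this generated module). The base SharingServiceServicer answers every RPC
# with UNIMPLEMENTED until its methods are overridden by a concrete subclass.
def _serve_sketch():
    from concurrent import futures
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    add_SharingServiceServicer_to_server(SharingServiceServicer(), server)
    server.add_insecure_port('[::]:50051')
    server.start()
    server.wait_for_termination()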
# This class is part of an EXPERIMENTAL API.
class SharingService(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def createEntityType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/createEntityType',
SharingService__pb2.EntityTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def updateEntityType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/updateEntityType',
SharingService__pb2.EntityTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def deleteEntityType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/deleteEntityType',
SharingService__pb2.EntityTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getEntityType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getEntityType',
SharingService__pb2.EntityTypeRequest.SerializeToString,
SharingService__pb2.EntityType.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getEntityTypes(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getEntityTypes',
SharingService__pb2.SearchRequest.SerializeToString,
SharingService__pb2.EntityTypes.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def createPermissionType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/createPermissionType',
SharingService__pb2.PermissionTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def updatePermissionType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/updatePermissionType',
SharingService__pb2.PermissionTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def deletePermissionType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/deletePermissionType',
SharingService__pb2.PermissionTypeRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getPermissionType(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getPermissionType',
SharingService__pb2.PermissionTypeRequest.SerializeToString,
SharingService__pb2.PermissionType.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getPermissionTypes(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getPermissionTypes',
SharingService__pb2.SearchRequest.SerializeToString,
SharingService__pb2.PermissionTypes.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def createEntity(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/createEntity',
SharingService__pb2.EntityRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def updateEntity(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/updateEntity',
SharingService__pb2.EntityRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def isEntityExists(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/isEntityExists',
SharingService__pb2.EntityRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getEntity(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getEntity',
SharingService__pb2.EntityRequest.SerializeToString,
SharingService__pb2.Entity.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def deleteEntity(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/deleteEntity',
SharingService__pb2.EntityRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def searchEntities(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/searchEntities',
SharingService__pb2.SearchRequest.SerializeToString,
SharingService__pb2.Entities.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getListOfSharedUsers(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getListOfSharedUsers',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.SharedOwners.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getListOfDirectlySharedUsers(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getListOfDirectlySharedUsers',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.SharedOwners.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getListOfSharedGroups(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getListOfSharedGroups',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.SharedOwners.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getListOfDirectlySharedGroups(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getListOfDirectlySharedGroups',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.SharedOwners.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def getAllDirectSharings(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/getAllDirectSharings',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.GetAllDirectSharingsResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def shareEntityWithUsers(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/shareEntityWithUsers',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def shareEntityWithGroups(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/shareEntityWithGroups',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def revokeEntitySharingFromUsers(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/revokeEntitySharingFromUsers',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def revokeEntitySharingFromGroups(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/revokeEntitySharingFromGroups',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def userHasAccess(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/org.apache.custos.sharing.service.SharingService/userHasAccess',
SharingService__pb2.SharingRequest.SerializeToString,
SharingService__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
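# A minimal client sketch, assuming a SharingService server is reachable at
# localhost:50051 (the address is a placeholder, not part of this module). An
# empty EntityTypeRequest is a valid protobuf message; real callers would
# populate its fields before sending.
def _client_sketch():
    with grpc.insecure_channel('localhost:50051') as channel:
        stub = SharingServiceStub(channel)
        return stub.createEntityType(SharingService__pb2.EntityTypeRequest())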
| 49.0213 | 144 | 0.669884 | 3,544 | 43,727 | 8.024266 | 0.037528 | 0.094451 | 0.027956 | 0.041001 | 0.881321 | 0.881321 | 0.863598 | 0.795661 | 0.793762 | 0.734897 | 0 | 0.004858 | 0.256135 | 43,727 | 891 | 145 | 49.076319 | 0.869432 | 0.043063 | 0 | 0.695707 | 1 | 0 | 0.126254 | 0.089962 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068182 | false | 0 | 0.002525 | 0.032828 | 0.107323 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f019677f1e1ce461ddc360c80e7b4ad3c3bc153 | 6,466 | py | Python | yt_transcript/youtube_transcript_api/test/test_cli.py | uxio-andrade/hackXLR8 | 4afab09638d37ddc2ba54b76b52097ba10b71770 | [
"MIT"
] | 4 | 2019-11-04T16:34:34.000Z | 2019-11-06T12:22:33.000Z | yt_transcript/youtube_transcript_api/test/test_cli.py | uxio-andrade/hackXLR8 | 4afab09638d37ddc2ba54b76b52097ba10b71770 | [
"MIT"
] | null | null | null | yt_transcript/youtube_transcript_api/test/test_cli.py | uxio-andrade/hackXLR8 | 4afab09638d37ddc2ba54b76b52097ba10b71770 | [
"MIT"
] | null | null | null | from unittest import TestCase
from unittest.mock import MagicMock
import json
from youtube_transcript_api._cli import YouTubeTranscriptCli, YouTubeTranscriptApi
class TestYouTubeTranscriptCli(TestCase):
def test_argument_parsing(self):
parsed_args = YouTubeTranscriptCli('v1 v2 --json --languages de en'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.http_proxy, '')
self.assertEqual(parsed_args.https_proxy, '')
parsed_args = YouTubeTranscriptCli('v1 v2 --languages de en --json'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.http_proxy, '')
self.assertEqual(parsed_args.https_proxy, '')
parsed_args = YouTubeTranscriptCli(' --json v1 v2 --languages de en'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.http_proxy, '')
self.assertEqual(parsed_args.https_proxy, '')
parsed_args = YouTubeTranscriptCli(
'v1 v2 --languages de en --json --http-proxy http://user:pass@domain:port --https-proxy https://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.http_proxy, 'http://user:pass@domain:port')
self.assertEqual(parsed_args.https_proxy, 'https://user:pass@domain:port')
parsed_args = YouTubeTranscriptCli(
'v1 v2 --languages de en --json --http-proxy http://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.http_proxy, 'http://user:pass@domain:port')
self.assertEqual(parsed_args.https_proxy, '')
parsed_args = YouTubeTranscriptCli(
'v1 v2 --languages de en --json --https-proxy https://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, ['de', 'en'])
self.assertEqual(parsed_args.https_proxy, 'https://user:pass@domain:port')
self.assertEqual(parsed_args.http_proxy, '')
def test_argument_parsing__only_video_ids(self):
parsed_args = YouTubeTranscriptCli('v1 v2'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, False)
self.assertEqual(parsed_args.languages, [])
def test_argument_parsing__fail_without_video_ids(self):
with self.assertRaises(SystemExit):
YouTubeTranscriptCli('--json'.split())._parse_args()
def test_argument_parsing__json(self):
parsed_args = YouTubeTranscriptCli('v1 v2 --json'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, [])
parsed_args = YouTubeTranscriptCli('--json v1 v2'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, True)
self.assertEqual(parsed_args.languages, [])
def test_argument_parsing__languages(self):
parsed_args = YouTubeTranscriptCli('v1 v2 --languages de en'.split())._parse_args()
self.assertEqual(parsed_args.video_ids, ['v1', 'v2'])
self.assertEqual(parsed_args.json, False)
self.assertEqual(parsed_args.languages, ['de', 'en'])
def test_argument_parsing__proxies(self):
parsed_args = YouTubeTranscriptCli(
'v1 v2 --http-proxy http://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.http_proxy, 'http://user:pass@domain:port')
parsed_args = YouTubeTranscriptCli(
'v1 v2 --https-proxy https://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.https_proxy, 'https://user:pass@domain:port')
parsed_args = YouTubeTranscriptCli(
'v1 v2 --http-proxy http://user:pass@domain:port --https-proxy https://user:pass@domain:port'.split()
)._parse_args()
self.assertEqual(parsed_args.http_proxy, 'http://user:pass@domain:port')
self.assertEqual(parsed_args.https_proxy, 'https://user:pass@domain:port')
parsed_args = YouTubeTranscriptCli(
'v1 v2'.split()
)._parse_args()
self.assertEqual(parsed_args.http_proxy, '')
self.assertEqual(parsed_args.https_proxy, '')
def test_run(self):
YouTubeTranscriptApi.get_transcripts = MagicMock(return_value=([], []))
YouTubeTranscriptCli('v1 v2 --languages de en'.split()).run()
YouTubeTranscriptApi.get_transcripts.assert_called_once_with(
['v1', 'v2'],
languages=['de', 'en'],
continue_after_error=True,
proxies=None
)
def test_run__json_output(self):
YouTubeTranscriptApi.get_transcripts = MagicMock(return_value=([{'boolean': True}], []))
output = YouTubeTranscriptCli('v1 v2 --languages de en --json'.split()).run()
        # json.loads will raise (JSONDecodeError) if the output is not valid JSON
json.loads(output)
def test_run__proxies(self):
YouTubeTranscriptApi.get_transcripts = MagicMock(return_value=([], []))
YouTubeTranscriptCli(
'v1 v2 --languages de en --http-proxy http://user:pass@domain:port --https-proxy https://user:pass@domain:port'.split()).run()
YouTubeTranscriptApi.get_transcripts.assert_called_once_with(
['v1', 'v2'],
languages=['de', 'en'],
continue_after_error=True,
proxies={'http': 'http://user:pass@domain:port', 'https': 'https://user:pass@domain:port'}
)
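# Usage sketch (illustrative, not part of the test suite; it assumes the same
# import path and the constructor/run() contract exercised by the tests above):
#
#   from youtube_transcript_api._cli import YouTubeTranscriptCli
#   output = YouTubeTranscriptCli('dQw4w9WgXcQ --languages en --json'.split()).run()
#   print(output)  # a JSON string when --json is given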
| 46.855072 | 138 | 0.662852 | 752 | 6,466 | 5.465426 | 0.090426 | 0.150852 | 0.245255 | 0.291971 | 0.878346 | 0.872749 | 0.851825 | 0.817275 | 0.792701 | 0.773723 | 0 | 0.011124 | 0.193628 | 6,466 | 137 | 139 | 47.19708 | 0.777138 | 0.005722 | 0 | 0.642857 | 0 | 0.044643 | 0.178933 | 0 | 0 | 0 | 0 | 0 | 0.455357 | 1 | 0.080357 | false | 0.142857 | 0.035714 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 10 |
48e9e65030ceaefc5be33327fab3b6416ff2cbbd | 15,621 | py | Python | netbox/dcim/migrations/0123_standardize_models.py | orphanedgamboa/netbox | 5cdc38ec3adb5278480b267a6c8e674e9d3fca39 | [
"Apache-2.0"
] | 1 | 2022-02-18T03:00:08.000Z | 2022-02-18T03:00:08.000Z | netbox/dcim/migrations/0123_standardize_models.py | emersonfelipesp/netbox | fecca5ad83fb6b48a2f15982dfd3242653f105f9 | [
"Apache-2.0"
] | 1 | 2021-08-23T15:38:47.000Z | 2021-08-23T15:40:10.000Z | netbox/dcim/migrations/0123_standardize_models.py | emersonfelipesp/netbox | fecca5ad83fb6b48a2f15982dfd3242653f105f9 | [
"Apache-2.0"
] | 1 | 2018-12-05T12:03:21.000Z | 2018-12-05T12:03:21.000Z | import django.core.serializers.json
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dcim', '0122_standardize_name_length'),
]
operations = [
migrations.AddField(
model_name='consoleport',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='consoleport',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='consoleport',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='consoleporttemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='consoleporttemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='consoleserverport',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='consoleserverport',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='consoleserverport',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='consoleserverporttemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='consoleserverporttemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='devicebay',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='devicebay',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='devicebay',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='devicebaytemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='devicebaytemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='devicerole',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='frontport',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='frontport',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='frontport',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='frontporttemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='frontporttemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='interface',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='interface',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='interface',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='interfacetemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='interfacetemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='inventoryitem',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='inventoryitem',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='inventoryitem',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='manufacturer',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='platform',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='poweroutlet',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='poweroutlet',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='poweroutlet',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='poweroutlettemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='poweroutlettemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='powerport',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='powerport',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='powerport',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='powerporttemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='powerporttemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='rackgroup',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='rackrole',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='rearport',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='rearport',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AddField(
model_name='rearport',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='rearporttemplate',
name='created',
field=models.DateField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='rearporttemplate',
name='last_updated',
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='region',
name='custom_field_data',
field=models.JSONField(blank=True, default=dict, encoder=django.core.serializers.json.DjangoJSONEncoder),
),
migrations.AlterField(
model_name='cable',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='cablepath',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='consoleport',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='consoleporttemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='consoleserverport',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='consoleserverporttemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='device',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='devicebay',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='devicebaytemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='devicerole',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='devicetype',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='frontport',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='frontporttemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='interface',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='interfacetemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='inventoryitem',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='manufacturer',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='platform',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='powerfeed',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='poweroutlet',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='poweroutlettemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='powerpanel',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='powerport',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='powerporttemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rack',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rackgroup',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rackreservation',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rackrole',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rearport',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='rearporttemplate',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='region',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='site',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='virtualchassis',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
]
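

# Hypothetical helper (sketch only, unused by this migration; Django keeps
# migration operations explicit by convention) showing how the repetitive
# AddField operations above could be generated instead of listed by hand.
# `field_factory` is called once per model so every operation gets its own
# field instance rather than sharing one.
def _bulk_addfield(model_names, field_name, field_factory):
    return [
        migrations.AddField(model_name=model, name=field_name, field=field_factory())
        for model in model_names
    ]
# Example (illustrative):
#   _bulk_addfield(['consoleport', 'consoleserverport'], 'created',
#                  lambda: models.DateField(auto_now_add=True, null=True))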
| 36.929078 | 117 | 0.579348 | 1,391 | 15,621 | 6.350827 | 0.056794 | 0.083541 | 0.127575 | 0.149762 | 0.931741 | 0.931741 | 0.915327 | 0.915327 | 0.915327 | 0.869142 | 0 | 0.00037 | 0.308431 | 15,621 | 422 | 118 | 37.016588 | 0.817366 | 0 | 0 | 0.956938 | 0 | 0 | 0.10838 | 0.006594 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004785 | 0 | 0.011962 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d2e80e743b004215436e811586d3d3e368db35ca | 8,464 | py | Python | posthog/plugins/test/plugin_archives.py | adamb70/posthog | 54ae8f0e70092f86b4aefbd93b56680dbd28b1c5 | [
"MIT"
] | 1 | 2020-12-08T04:04:52.000Z | 2020-12-08T04:04:52.000Z | posthog/plugins/test/plugin_archives.py | adamb70/posthog | 54ae8f0e70092f86b4aefbd93b56680dbd28b1c5 | [
"MIT"
] | null | null | null | posthog/plugins/test/plugin_archives.py | adamb70/posthog | 54ae8f0e70092f86b4aefbd93b56680dbd28b1c5 | [
"MIT"
] | null | null | null | # https://github.com/PostHog/helloworldplugin in a base64 encoded zip file
HELLO_WORLD_PLUGIN_GITHUB_ZIP = (
"d5aa1d2b8a534f37cd93be48b214f490ef9ee904",
"UEsDBAoAAAAAAA8LbVEAAAAAAAAAAAAAAAAjAAkAaGVsbG93b3JsZHBsdWdpbi1pbWFnZWxlc3MtdmVyc2lvbi9VVAUAAc9Qrl9QSwMECgAAAAgADwttUQT7a+JUAAAAdQAAAC4ACQBoZWxsb3dvcmxkcGx1Z2luLWltYWdlbGVzcy12ZXJzaW9uLy5wcmV0dGllcnJjVVQFAAHPUK5fq+ZSAAKlkqLEzJzMvHTn/NzcRCUrBaXUYlMlHahcYlJ4ZkpJBlDYBCpUnJqbCeSmJeYUp8KEgLpzUgNL80tSgTIlRaUwiYKizLwSmAGGRgZctVwAUEsDBAoAAAAIAA8LbVG/Zg9y6wAAAPMBAAArAAkAaGVsbG93b3JsZHBsdWdpbi1pbWFnZWxlc3MtdmVyc2lvbi9pbmRleC5qc1VUBQABz1CuX31RTUvEMBC951eMi5AUS1E8e9Sz4FFE0jjNBrKTkkxcZOl/N9u0hyo0hyHDe/PefOj0QwaGTIZdIEjIeXz12TpSFzCBBmdhauAioLySp+Cx88GqwxsyO7KQR+AjAqM+3Ryaf7yq0YhJCL31GmMwmNLzNxIrvMYWVs8WjDZFdWPNJWZijPAE+qwdV1JnkZVcINnC/dLEjKUttgrcwUMjZpoboJp3pZ8RIztMq+n1/cXe5RG9D/KjNCHPIfovucPtdZyZdaqupDvk27XPWjH/d+je9Z+UT/1SUNKXZbXqsa5gqiPGctRIVaHc4RdQSwMECgAAAAgADwttUVpiCFkvAAAAOQAAACkACQBoZWxsb3dvcmxkcGx1Z2luLWltYWdlbGVzcy12ZXJzaW9uL2xpYi5qc1VUBQABz1CuX0srzUsuyczPU8jJTHKDsTXySnOTUos0Faq5FICgKLWktChPASKooKVgZM1VywUAUEsDBAoAAAAIAA8LbVE7Twti+wAAAO0BAAAvAAkAaGVsbG93b3JsZHBsdWdpbi1pbWFnZWxlc3MtdmVyc2lvbi9wYWNrYWdlLmpzb25VVAUAAc9Qrl+VUbtOAzEQ7O8rjJHSQO6S9noEFEgUlGkud4vtyOe1dm1IFOXf8eNE0qazZ2ZnvONzI4R0wwyyF9IjB41qrcFa/EWy09rbqIyTz1n2A8QGXVZu2k27regEPJLxYWFeCSCIoEEUg4cqmgdTWOMmOLYHrmgd5ESc0zUBAThkGYwaxU6+ECH1wqHIhGAPo/k2MO2kWK0EHE0QW5kmL8WNIL3fBKTTjeHJl82UCSUyQZHsgjzpEDz3XZfOOu7bEefuM1Xwhqq7VlAbaLPDf9QQU0+Ubeoi1ozguCR9vH9VbB/VzWZL6h2JnWGOwNdQjTP4QcGdPo8Ew5T+t7k0f1BLAwQKAAAACAAPC21Ru8C8oc0AAABTAQAALgAJAGhlbGxvd29ybGRwbHVnaW4taW1hZ2VsZXNzLXZlcnNpb24vcGx1Z2luLmpzb25VVAUAAc9Qrl9tjz1rwzAQhnf/iquXLCHesxQytKVToEPms3WWr8g6VzqRL/LfKymQdOggxPu8zwvStQFoPc7UbqGdyDk5SnBmccmyb9elTcHVUnWJ266zrFPqN4PM3V6ifojt/t8ZikPgRVl82b8HIgWdCA7FBPQG3kQAYYdhDZ9fQIaL/HKfz8h1x97QafMd79RxX2C+HmgQP7LN9JpTzj2GR/jzucOEuorAvr4hS691Xh09L9WJGtjbJzc0YnJaqh4vTx7oJ3Egk4sRXaTKb005t+YXUEsBAgAACgAAAAAADwttUQAAAAAAAAAAAAAAACMACQAAAAAAAAAQAAAAAAAAAGhlbGxvd29ybGRwbHVnaW4taW1hZ2VsZXNzLXZlcnNpb24vVVQFAAHPUK5fUEsBAgAACgAAAAgADwttUQT7a+JUAAAAdQAAAC4ACQAAAAAAAQAAAAAASgAAAGhlbGxvd29ybGRwbHVnaW4taW1hZ2VsZXNzLXZlcnNpb24vLnByZXR0aWVycmNVVAUAAc9Qrl9QSwECAAAKAAAACAAPC21Rv2YPcusAAADzAQAAKwAJAAAAAAABAAAAAADzAAAAaGVsbG93b3JsZHBsdWdpbi1pbWFnZWxlc3MtdmVyc2lvbi9pbmRleC5qc1VUBQABz1CuX1BLAQIAAAoAAAAIAA8LbVFaYghZLwAAADkAAAApAAkAAAAAAAEAAAAAADACAABoZWxsb3dvcmxkcGx1Z2luLWltYWdlbGVzcy12ZXJzaW9uL2xpYi5qc1VUBQABz1CuX1BLAQIAAAoAAAAIAA8LbVE7Twti+wAAAO0BAAAvAAkAAAAAAAEAAAAAAK8CAABoZWxsb3dvcmxkcGx1Z2luLWltYWdlbGVzcy12ZXJzaW9uL3BhY2thZ2UuanNvblVUBQABz1CuX1BLAQIAAAoAAAAIAA8LbVG7wLyhzQAAAFMBAAAuAAkAAAAAAAEAAAAAAAAEAABoZWxsb3dvcmxkcGx1Z2luLWltYWdlbGVzcy12ZXJzaW9uL3BsdWdpbi5qc29uVVQFAAHPUK5fUEsFBgAAAAAGAAYATAIAACIFAAAoAGQ1YWExZDJiOGE1MzRmMzdjZDkzYmU0OGIyMTRmNDkwZWY5ZWU5MDQ=",
)
HELLO_WORLD_PLUGIN_GITHUB_ATTACHMENT_ZIP = (
"04801fa46ba26a00eb552fda08d421cbd8bc676d",
"UEsDBAoAAAAAAGkjbVEAAAAAAAAAAAAAAAA6AAkAaGVsbG93b3JsZHBsdWdpbi0wNDgwMWZhNDZiYTI2YTAwZWI1NTJmZGEwOGQ0MjFjYmQ4YmM2NzZkL1VUBQABpnuuX1BLAwQKAAAACABpI21RBPtr4lQAAAB1AAAARQAJAGhlbGxvd29ybGRwbHVnaW4tMDQ4MDFmYTQ2YmEyNmEwMGViNTUyZmRhMDhkNDIxY2JkOGJjNjc2ZC8ucHJldHRpZXJyY1VUBQABpnuuX6vmUgACpZKixMyczLx05/zc3EQlKwWl1GJTJR2oXGJSeGZKSQZQ2AQqVJyamwnkpiXmFKfChIC6c1IDS/NLUoEyJUWlMImCosy8EpgBhkYGXLVcAFBLAwQKAAAACABpI21Rv2YPcusAAADzAQAAQgAJAGhlbGxvd29ybGRwbHVnaW4tMDQ4MDFmYTQ2YmEyNmEwMGViNTUyZmRhMDhkNDIxY2JkOGJjNjc2ZC9pbmRleC5qc1VUBQABpnuuX31RTUvEMBC951eMi5AUS1E8e9Sz4FFE0jjNBrKTkkxcZOl/N9u0hyo0hyHDe/PefOj0QwaGTIZdIEjIeXz12TpSFzCBBmdhauAioLySp+Cx88GqwxsyO7KQR+AjAqM+3Ryaf7yq0YhJCL31GmMwmNLzNxIrvMYWVs8WjDZFdWPNJWZijPAE+qwdV1JnkZVcINnC/dLEjKUttgrcwUMjZpoboJp3pZ8RIztMq+n1/cXe5RG9D/KjNCHPIfovucPtdZyZdaqupDvk27XPWjH/d+je9Z+UT/1SUNKXZbXqsa5gqiPGctRIVaHc4RdQSwMECgAAAAgAaSNtUVpiCFkvAAAAOQAAAEAACQBoZWxsb3dvcmxkcGx1Z2luLTA0ODAxZmE0NmJhMjZhMDBlYjU1MmZkYTA4ZDQyMWNiZDhiYzY3NmQvbGliLmpzVVQFAAGme65fSyvNSy7JzM9TyMlMcoOxNfJKc5NSizQVqrkUgKAotaS0KE8BIqigpWBkzVXLBQBQSwMECgAAAAgAaSNtUTtPC2L7AAAA7QEAAEYACQBoZWxsb3dvcmxkcGx1Z2luLTA0ODAxZmE0NmJhMjZhMDBlYjU1MmZkYTA4ZDQyMWNiZDhiYzY3NmQvcGFja2FnZS5qc29uVVQFAAGme65flVG7TgMxEOzvK4yR0kDukvZ6BBRIFJRpLneL7cjntXZtSBTl3/HjRNKms2dmZ7zjcyOEdMMMshfSIweNaq3BWvxFstPa26iMk89Z9gPEBl1WbtpNu63oBDyS8WFhXgkgiKBBFIOHKpoHU1jjJji2B65oHeREnNM1AQE4ZBmMGsVOvhAh9cKhyIRgD6P5NjDtpFitBBxNEFuZJi/FjSC93wSk043hyZfNlAklMkGR7II86RA8912Xzjru2xHn7jNV8Iaqu1ZQG2izw3/UEFNPlG3qItaM4Lgkfbx/VWwf1c1mS+odiZ1hjsDXUI0z+EHBnT6PBMOU/re5NH9QSwMECgAAAAgAaSNtUcEA2yT0AAAAygEAAEUACQBoZWxsb3dvcmxkcGx1Z2luLTA0ODAxZmE0NmJhMjZhMDBlYjU1MmZkYTA4ZDQyMWNiZDhiYzY3NmQvcGx1Z2luLmpzb25VVAUAAaZ7rl+Fj0FLxDAQhe/9FWMvXsoWPO5F8KDiSRDZ87SZtiNppiYTdV32v5uksCu44CGE+d57k5dDBVA7nKneQj2RtfIp3prFxpFd3WQ1eltE1SVs23ZknWK36WVunyXoo4zt5Zyh0HtelMXl/IMnUtCJYJedgM7AvQgg3KFv4OkFyHA2N/AhFm6u1i0zcomzM/S1eQsrtdxlmK4T6sUNPCZ6SFOaO/Sn4dcfdxPqdQB2pUoy3ZZ48eh+KZ6gnt145oYGjFaz1OH3mXt6j+zJJGFAG6jw4yrXg4jpLjV4Xayggb1EDwYVOwz0twOqYj/N5PS/96p8jtUPUEsBAgAACgAAAAAAaSNtUQAAAAAAAAAAAAAAADoACQAAAAAAAAAQAAAAAAAAAGhlbGxvd29ybGRwbHVnaW4tMDQ4MDFmYTQ2YmEyNmEwMGViNTUyZmRhMDhkNDIxY2JkOGJjNjc2ZC9VVAUAAaZ7rl9QSwECAAAKAAAACABpI21RBPtr4lQAAAB1AAAARQAJAAAAAAABAAAAAABhAAAAaGVsbG93b3JsZHBsdWdpbi0wNDgwMWZhNDZiYTI2YTAwZWI1NTJmZGEwOGQ0MjFjYmQ4YmM2NzZkLy5wcmV0dGllcnJjVVQFAAGme65fUEsBAgAACgAAAAgAaSNtUb9mD3LrAAAA8wEAAEIACQAAAAAAAQAAAAAAIQEAAGhlbGxvd29ybGRwbHVnaW4tMDQ4MDFmYTQ2YmEyNmEwMGViNTUyZmRhMDhkNDIxY2JkOGJjNjc2ZC9pbmRleC5qc1VUBQABpnuuX1BLAQIAAAoAAAAIAGkjbVFaYghZLwAAADkAAABAAAkAAAAAAAEAAAAAAHUCAABoZWxsb3dvcmxkcGx1Z2luLTA0ODAxZmE0NmJhMjZhMDBlYjU1MmZkYTA4ZDQyMWNiZDhiYzY3NmQvbGliLmpzVVQFAAGme65fUEsBAgAACgAAAAgAaSNtUTtPC2L7AAAA7QEAAEYACQAAAAAAAQAAAAAACwMAAGhlbGxvd29ybGRwbHVnaW4tMDQ4MDFmYTQ2YmEyNmEwMGViNTUyZmRhMDhkNDIxY2JkOGJjNjc2ZC9wYWNrYWdlLmpzb25VVAUAAaZ7rl9QSwECAAAKAAAACABpI21RwQDbJPQAAADKAQAARQAJAAAAAAABAAAAAABzBAAAaGVsbG93b3JsZHBsdWdpbi0wNDgwMWZhNDZiYTI2YTAwZWI1NTJmZGEwOGQ0MjFjYmQ4YmM2NzZkL3BsdWdpbi5qc29uVVQFAAGme65fUEsFBgAAAAAGAAYA1gIAANMFAAAoADA0ODAxZmE0NmJhMjZhMDBlYjU1MmZkYTA4ZDQyMWNiZDhiYzY3NmQ=",
)
HELLO_WORLD_PLUGIN_NPM_TGZ = (
"0.0.1",
"H4sIAC4bpF8AA+0Za3PbNjKf+Ss2zE0tX1mZetkzvrm5kSU6ZqvXUXJ9mbZzhkhQQo8ieABo1+34v98CJCVZSezJXexOetrEZrDvBbCLBZKR8F9kQY9ePSO4rnvS6YD5Hh+br9tsF98SoNE66TTcRqfdaoHbaDUaJ6+g85xOVZBLRQS6siKC5fLjfMgWx4/oKeNYf78QyMr1H/g9bzT1nsUGzsdxu/3R9W+6xy64rc6x22gcH7ttXP92o9l4Be6zeLMD/+frP/RnMGAhTSW1rB7P7gRbLBXUwkNo4srAt2RFJbwVlKZLliSWNaFixaRkPAUmYUkFnd/BQpBU0ciBGBmBxxAuiVhQBxQHkt5BRoVEAT5XhKUsXQCBEG1ZyKmWqEbyWN0SQZE5AiIlDxlBfRDxMF/RVBGl7cUsQV9qaknBnpYS9qExElGSWCwFTatIcMvUkucKBJVKsFDrcIClYZJH2oeKnLAVKy1ocTMB0kKlucQItJ8OrHjEYv2lJqwsnydMLh2ImFY9zxUipUaamXR0HEdcgKQ4ZaiBod8m1o13hke7nukJVeUUSY25XfLVw0iYtOJcpGiSGpmI45QZiz/TUGmMZo95kvBbHVrI04jpiOSpZc2QROb8hppYivVNuUJXCxf0AmSbVS1JckmSBOa0nDC0i9NLtsIR2jxmRaoYSSDjwtjbDbOO9i88mI7PZ1fdwAN/CpNg/L3f9/pgd6c4th248mcX48sZIEfQHc3ewfgcuqN38J0/6jvg/WMSeNMpjAPLH04Gvoc4f9QbXPb90Vs4Q7nRGDexj1sZlc7GoA2WqnxvqpUNvaB3gcPumT/wZ+8c69yfjbTO83EAXZh0g5nfuxx0A5hcBpPx1EPzfVQ78kfnAVrxht5oVkeriAPvexzA9KI7GGhTVvcSvQ+0f9AbT94F/tuLGVyMB30PkWceetY9G3iFKQyqN+j6Qwf63WH3rWekxqglsDRb4R1cXXgape118W9v5o9HOozeeDQLcOhglMFsLXrlTz0HuoE/1RNyHoyHjqWnEyXGRgnKjbxCi55qeLAiyKLHl1NvrRD6XneAuqZaWIdYMdet37te7eHzQnX+J2xe//mR4+9/gSfOf8Q2d8//5v78fxmI89QcjHgKzv+5HtTSfDWn4hB+swBBUIWnDxRI+DM0/2Ld7yvBHwKq/GdpRH95pgrwVP6fdNq7+d852ef/iwCRd2kI68SXmOnZjJJV7TfdRMZsAfdVFcCx5AmtJ3xRs6dUKd1p5plp9xSKvLYP3+MrdBxaWC+sHVOZ4CGV0rvB/r5G9W8HPmxU4e8c7xcC/grklmCzHJJwSesLqmoHJenAAbc0b2jyIa1S8DU00BnNxmIorNbREWx/FTa1lVENu7QfDpbYzPODn9CJg1sukujgEd45EYazCKeOw0eY/1T5WUiYfz/Crit1UYqNwHbhrrWKObi3tuu2UfGRil3lf5bkC5ZiAeDpZ99jT+V/p9nayf9Oyz3Z5/9LgN7wdoqXfPsUbLPDzd4utoPtaGouEkNUKpOnR0cLvDbn83rIV0cTLtUFXxx9WC6iMhQs0/tSy+s3BGWKxZXmNHfPc86BwBkRDnw7BVrcWV8X4ivCjFx1MhVY3O0aWbSrBarIMcQWyWtjsq0HW8FdLYk6kFA+EiDT34y44VF3meHRl/l0scFHNCZ5ojRpTn7d4AX9d84EjZAQk0TSMun0zxfWGFX5H3jd/tCrr6JnsPFE/jda7nv9f2t//r8MvIHd5LV0Vi/5oiCAoUB5PMClpIDnL5GYtnMi9ZuTgDueC+C3acklX1vWmzcw1a0E3DACZZmwrEYdzpl5cqJgl8w2ZLj/dFpWbM06eFhjqHk+M89byF5wQyzKp7GEYVuAtvWTlkFcBgOonhMFzbhkios7/TSGRpR+uHroVG/gG4f8ggo/ZEXY34QJ+6n2SLHb4jvUvlYKinc04+YpXJdcld+lD+9N9rXVWkeb43RuQtWVCvuAmIRUx6lr41qrOaavi8fSqJjOqiJVDdQNbmcyT6g8ta6vrzW/VZbHR4v9f1/uUXKR8DlJtmov4miqvdCFUomcOhvCTtVe44vqbcec22vCfVlf73UsVrsOQb7ZLr93Bn3ZsO7/iu+zNIBP3/86u/1fu9Ha1/+XgAf9X1XaNgn+zXZDd0OFLJs5t45/nmjzjIJHm7lCUG5aN0WlabZouOTwo+0JwcUppBw0AWRGQxYzGv1ow1dfAf0FS2JDV4l7o21T9bcUlo0dVrGd+oaYrz+hxtW1hrUpkuM8mTpVdaXm/300YujPCtw8X2xF9ulV9YhJmVO5MbrkK6rPyk/U80ZQEuH6frgzrfIfb5j6Rk+FCD/7Hnsq/xvH7+V/y23v8/8loMoTQViCV58eX62IyUDZqRJGkfkVi9QS0e0SJemKVXefCoXSCf17zhV9cNjbGV6pVKWg0XS/tBvSHvawhz38MeE//YvvUAAoAAA=",
)
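# Illustrative helper (not part of the original fixtures; the name
# `inspect_plugin_zip` is an assumption): the two GitHub constants above are
# (sha, base64-encoded zip) tuples, while the npm constant pairs a version
# string with a base64-encoded gzipped tarball. A minimal stdlib-only sketch
# for decoding and inspecting the zip-based fixtures:
import base64
import io
import zipfile


def inspect_plugin_zip(fixture):
    """Return the expected sha and the file names inside a zip fixture."""
    expected_sha, encoded = fixture
    archive_bytes = base64.b64decode(encoded)
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as archive:
        return expected_sha, archive.namelist()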
| 604.571429 | 3,047 | 0.970227 | 192 | 8,464 | 42.703125 | 0.869792 | 0.003659 | 0.005854 | 0.005367 | 0.054885 | 0.054885 | 0.054885 | 0.054885 | 0.054885 | 0.054885 | 0 | 0.129903 | 0.005907 | 8,464 | 13 | 3,048 | 651.076923 | 0.844545 | 0.008507 | 0 | 0 | 0 | 0.25 | 0.980334 | 0.979738 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d2f193daaff75e549057736f03d9b87f7d480bd9 | 20,779 | py | Python | Departments.py | TheoEfthymiadis/HR-psychometrics-synthetic-data-set | b01809cdadf11f42601b3f950b81c94c8eb2b912 | [
"CC0-1.0"
] | null | null | null | Departments.py | TheoEfthymiadis/HR-psychometrics-synthetic-data-set | b01809cdadf11f42601b3f950b81c94c8eb2b912 | [
"CC0-1.0"
] | null | null | null | Departments.py | TheoEfthymiadis/HR-psychometrics-synthetic-data-set | b01809cdadf11f42601b3f950b81c94c8eb2b912 | [
"CC0-1.0"
] | null | null | null | import numpy as np
import pandas as pd
from faker import Faker
import random
import datetime
import sys
import os
import xlwt
#import xlrd
import openpyxl
folder_path = sys.path[0]
input_path = os.path.join(folder_path, 'employees.xlsx')  # Name of the input file (cross-platform path join)
# Several different seed functions must be set to produce a controlled, reproducible output
fake = Faker()
Faker.seed(1)
seed = 7
random.seed(10)
np.random.seed(seed=5)
employees_df = pd.read_excel(input_path, sheet_name='Professional_Profile', engine='openpyxl')
evaluation_performance = {'1': 'Low', '2': 'Medium', '3': 'High'} # Dictionary that will be used for evaluation
# ----------------------- Working with the HR department -------------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = employees_df[employees_df['Department'] == 'HR'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# HR specific evaluation metrics
# Calculating all employees hired by the specific employee
hired_employees_df = employees_df[
(((employees_df['Recruiter ID'] == department_df.at[i, 'ID']) &
(pd.to_datetime(employees_df['Date Hired'], format='%Y-%m-%d') <=
datetime.datetime.strptime(str(calendar_year), '%Y'))))].reset_index()[['ID', 'Date Hired', 'Time Left']]
hired_employees_df['Time in Company'] = 0
# Calculating the exact time that each of the recruited employees worked for the company
for j in hired_employees_df.index:
hired_employees_df.at[j, 'Time in Company'] = 2020 - hired_employees_df.at[j, 'Time Left'] - \
int(hired_employees_df.at[j, 'Date Hired'][0:4])
evaluation['Total Time of hired employees(years)'] = hired_employees_df['Time in Company'].sum() # Total employee time
evaluation['Average Recruitment Time(months)'] = float("{:.2f}".format(np.random.uniform(1, 12))) # Average recruitment time
        active_recruits = hired_employees_df[hired_employees_df['Time Left'] == 0]['Time Left'].count()  # How many recruits are still working in the company
evaluation['Employees Fired'] = int(0.2*(len(hired_employees_df) - active_recruits)) # 20% of the recruits that left are considered fired
all_evaluations.append(evaluation.copy())
hr_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
hr_df.to_excel(writer, index=False, sheet_name='HR')
writer.save()
writer.close()
# ------------------------------------------- HR FINISHED --------------------------------------------------------------
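# Hypothetical refactor (sketch only -- nothing below calls it): every
# department block in this script repeats the same annual-evaluation loop and
# differs only in the department name and a few extra metrics, so the shared
# part could be factored out as below. `evaluate_department` and the
# `extra_metrics` callable (which adds the department-specific keys to each
# record) are assumptions, not part of the original script.
def evaluate_department(source_df, department, extra_metrics=None):
    dept = source_df[source_df['Department'] == department].reset_index()[
        ['ID', 'Date Hired', 'Time Left', 'Salary', 'Working Experience', 'Recruiter ID']]
    records = []
    for i in range(len(dept)):
        evaluation = {'ID': dept.at[i, 'ID']}
        time_in_company = 2020 - dept.at[i, 'Time Left'] - int(dept.at[i, 'Date Hired'][0:4])
        for year in range(min(5, time_in_company)):
            calendar_year = 2020 - dept.at[i, 'Time Left'] - year
            evaluation['Year'] = calendar_year
            evaluation['Loyalty'] = calendar_year - int(dept.at[i, 'Date Hired'][0:4])
            evaluation['Number of Promotions'] = int(evaluation['Loyalty'] / 4)
            evaluation['Bonus'] = int(np.random.uniform(0, 30) / 100 * int(dept.at[i, 'Salary']))
            evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816)
            evaluation['Chargeability'] = int(np.random.uniform(0, 100))
            percentile = np.random.uniform(0, 100)
            if percentile < 15:
                evaluation['Department Percentile'], evaluation['Performance'] = 'Bottom 15%', 'Low'
            elif percentile > 85:
                evaluation['Department Percentile'], evaluation['Performance'] = 'Top 15%', 'High'
            else:
                evaluation['Department Percentile'] = 'Mid 70%'
                evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
            if extra_metrics is not None:
                extra_metrics(evaluation)
            records.append(evaluation.copy())
    return pd.DataFrame(records)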
# ----------------------- Working with the Sales department ------------------------------------------------------------
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Sales'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Sales specific evaluation metrics
evaluation['Total Sales'] = int(np.random.uniform(1000, 100000))
evaluation['Clients Asking'] = int(np.random.uniform(0, 5))
all_evaluations.append(evaluation.copy())
sales_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
sales_df.to_excel(writer, index=False, sheet_name='Sales')
writer.save()
writer.close()
# ------------------------------------------- Sales FINISHED -----------------------------------------------------------
# ----------------------- Working with the Product department ---------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Product'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Product specific evaluation metrics
evaluation['Total Defects'] = int(np.random.uniform(10, 50))
evaluation['Number of Complaining Customers'] = int(np.random.uniform(0, 20))
all_evaluations.append(evaluation.copy())
product_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
product_df.to_excel(writer, index=False, sheet_name='Product')
writer.save()
writer.close()
# ------------------------------------------- Product FINISHED ---------------------------------------------------------
# ----------------------- Working with the Finance department ---------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Finance'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Finance specific evaluation metrics
        evaluation['Non-Servicing Obligations'] = int(np.random.uniform(0, 10000))
all_evaluations.append(evaluation.copy())
finance_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
finance_df.to_excel(writer, index=False, sheet_name='Finance')
writer.save()
writer.close()
# ------------------------------------------- Finance FINISHED ---------------------------------------------------------
# ----------------------- Working with the Legal department ---------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Legal'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Legal specific evaluation metrics
evaluation['Successful Lawsuits'] = int(np.random.uniform(0, 3))
evaluation['Disputes amicably resolved'] = int(np.random.uniform(0, 6))
all_evaluations.append(evaluation.copy())
legal_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
legal_df.to_excel(writer, index=False, sheet_name='Legal')
writer.save()
writer.close()
# ------------------------------------------- Legal FINISHED ---------------------------------------------------------
# ----------------------- Working with the Strategy department --------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Strategy'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Strategy specific evaluation metrics
evaluation['Total Sales'] = int(np.random.uniform(1000, 10000))
evaluation['Number of Teams'] = int(np.random.uniform(1, 10))
evaluation['Number of Projects'] = int(np.random.uniform(1, 20))
all_evaluations.append(evaluation.copy())
strategy_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
strategy_df.to_excel(writer, index=False, sheet_name='Strategy')
writer.save()
writer.close()
# ------------------------------------------- Strategy FINISHED --------------------------------------------------------
# ----------------------- Working with the Technology department ------------------------------------------------------#
# We only extract the useful information for our department to execute calculations faster
department_df = []
department_df = employees_df[employees_df['Department'] == 'Technology'].reset_index()[['ID', 'Date Hired', 'Time Left',
'Salary', 'Working Experience', 'Recruiter ID']]
all_evaluations = [] # Empty list to append the annual evaluations of the department employees
for i in range(len(department_df)):
evaluation = {}
evaluation['ID'] = department_df.at[i, 'ID']
time_in_company = 2020 - department_df.at[i, 'Time Left'] - int(department_df.at[i, 'Date Hired'][0:4])
for year in range(min(5, time_in_company)):
calendar_year = 2020 - department_df.at[i, 'Time Left'] - year
evaluation['Year'] = calendar_year # Calendar year of the specific evaluation record
evaluation['Loyalty'] = calendar_year - int(department_df.at[i, 'Date Hired'][0:4]) # Employee Loyalty
evaluation['Number of Promotions'] = int(evaluation['Loyalty']/4) # Number of promotions of the employee
evaluation['Bonus'] = int(np.random.uniform(0, 30)/100*int(department_df.at[i, 'Salary'])) # Annual Bonus
evaluation['Overtime'] = int(np.random.uniform(0, 20) / 100 * 1816) # Annual working hours are 1816
evaluation['Chargeability'] = int(np.random.uniform(0, 100))
percentile = np.random.uniform(0, 100) # Randomly estimate the percentile of the employee within the department
if percentile < 15:
evaluation['Department Percentile'] = 'Bottom 15%'
evaluation['Performance'] = 'Low'
elif percentile > 85:
evaluation['Department Percentile'] = 'Top 15%'
evaluation['Performance'] = 'High'
else:
evaluation['Department Percentile'] = 'Mid 70%'
evaluation['Performance'] = evaluation_performance[str(int(np.random.uniform(1, 3)))]
# Technology specific evaluation metrics
evaluation['Problematic Code Commits'] = int(np.random.uniform(0, 20))
all_evaluations.append(evaluation.copy())
technology_df = pd.DataFrame(all_evaluations)
with pd.ExcelWriter(input_path, engine='openpyxl', mode='a') as writer:
technology_df.to_excel(writer, index=False, sheet_name='Technology')
writer.save()
writer.close()
# ------------------------------------------- Technology FINISHED ------------------------------------------------------
| 59.368571 | 157 | 0.60744 | 2,365 | 20,779 | 5.238478 | 0.085412 | 0.061022 | 0.056905 | 0.052062 | 0.834046 | 0.807168 | 0.788361 | 0.783437 | 0.764226 | 0.764226 | 0 | 0.027619 | 0.215891 | 20,779 | 349 | 158 | 59.538682 | 0.732769 | 0.241879 | 0 | 0.768061 | 0 | 0 | 0.182157 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.030418 | 0 | 0.030418 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d2fc6502ba56514d6ceeeb6e418031937f4356df | 7,881 | py | Python | rtamt/parser/stl/StlParserVisitor.py | sguysc/rtamt | a16db77b61028f774d81457ff22e666229a5432c | [
"BSD-3-Clause"
] | 24 | 2019-12-04T00:20:16.000Z | 2022-03-24T17:48:14.000Z | rtamt/parser/stl/StlParserVisitor.py | sguysc/rtamt | a16db77b61028f774d81457ff22e666229a5432c | [
"BSD-3-Clause"
] | 142 | 2020-01-16T15:36:21.000Z | 2022-03-28T20:40:45.000Z | rtamt/parser/stl/StlParserVisitor.py | sguysc/rtamt | a16db77b61028f774d81457ff22e666229a5432c | [
"BSD-3-Clause"
] | 17 | 2020-07-07T20:32:08.000Z | 2022-03-07T07:20:22.000Z | # Generated from StlParser.g4 by ANTLR 4.5.1
from antlr4 import *
# This class defines a complete generic visitor for a parse tree produced by StlParser.
class StlParserVisitor(ParseTreeVisitor):
# Visit a parse tree produced by StlParser#interval.
def visitInterval(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#intervalTimeLiteral.
def visitIntervalTimeLiteral(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#constantTimeLiteral.
def visitConstantTimeLiteral(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#unit.
def visitUnit(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprSince.
def visitExprSince(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprParen.
def visitExprParen(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprIff.
def visitExprIff(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExpreOnce.
def visitExpreOnce(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprEv.
def visitExprEv(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprImplies.
def visitExprImplies(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprUntil.
def visitExprUntil(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprNot.
def visitExprNot(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprNext.
def visitExprNext(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprAnd.
def visitExprAnd(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprUnless.
def visitExprUnless(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprPrevious.
def visitExprPrevious(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprHist.
def visitExprHist(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprFall.
def visitExprFall(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprPredicate.
def visitExprPredicate(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprXor.
def visitExprXor(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprRise.
def visitExprRise(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprOr.
def visitExprOr(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprAlways.
def visitExprAlways(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprReal.
def visitExprReal(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#specification_file.
def visitSpecification_file(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#specification.
def visitSpecification(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#SpecificationId.
def visitSpecificationId(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#modImport.
def visitModImport(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#assertion.
def visitAssertion(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#declVariable.
def visitDeclVariable(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#declConstant.
def visitDeclConstant(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#annotation.
def visitAnnotation(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#rosTopic.
def visitRosTopic(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#variableDeclaration.
def visitVariableDeclaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#constantDeclaration.
def visitConstantDeclaration(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#AsgnLiteral.
def visitAsgnLiteral(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#AsgnExpr.
def visitAsgnExpr(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#domainType.
def visitDomainType(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ioType.
def visitIoType(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprSubtraction.
def visitExprSubtraction(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprPow.
def visitExprPow(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprDivision.
def visitExprDivision(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprMultiplication.
def visitExprMultiplication(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprLiteral.
def visitExprLiteral(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprExp.
def visitExprExp(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprSqrt.
def visitExprSqrt(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprId.
def visitExprId(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprAbs.
def visitExprAbs(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#ExprAddition.
def visitExprAddition(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Leq.
def visitLeq(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Geq.
def visitGeq(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Less.
def visitLess(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Greater.
def visitGreater(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Eq.
def visitEq(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Neq.
def visitNeq(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#literal.
def visitLiteral(self, ctx):
return self.visitChildren(ctx)
# Visit a parse tree produced by StlParser#Id.
def visitId(self, ctx):
return self.visitChildren(ctx)
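
# Illustrative usage (not ANTLR-generated): subclass the generic visitor and
# override only the callbacks of interest. `CountingStlVisitor` and the rule
# it counts are hypothetical examples, not part of rtamt.
class CountingStlVisitor(StlParserVisitor):

    def __init__(self):
        self.and_count = 0

    # Visit a parse tree produced by StlParser#ExprAnd.
    def visitExprAnd(self, ctx):
        self.and_count += 1
        # Keep delegating to visitChildren so the subtree is still traversed.
        return self.visitChildren(ctx)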
| 26.897611 | 87 | 0.698262 | 943 | 7,881 | 5.83351 | 0.151644 | 0.063261 | 0.105435 | 0.189784 | 0.704236 | 0.704236 | 0.692965 | 0.686784 | 0.686784 | 0.686784 | 0 | 0.000822 | 0.228144 | 7,881 | 292 | 88 | 26.989726 | 0.903502 | 0.387895 | 0 | 0.491379 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008621 | 1 | 0.491379 | false | 0 | 0.017241 | 0.491379 | 1.008621 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
d2ff48ded3d834720195675e6d7c0a7a3d7e61ee | 13,339 | py | Python | docraptor/apis/doc_api.py | mkandler/pdf-generator | 1e0fc1e17fd3533b780ff91fc4e321b1f70b600a | [
"CC0-1.0"
] | null | null | null | docraptor/apis/doc_api.py | mkandler/pdf-generator | 1e0fc1e17fd3533b780ff91fc4e321b1f70b600a | [
"CC0-1.0"
] | null | null | null | docraptor/apis/doc_api.py | mkandler/pdf-generator | 1e0fc1e17fd3533b780ff91fc4e321b1f70b600a | [
"CC0-1.0"
] | null | null | null | # coding: utf-8
"""
DocApi.py
Copyright 2016 SmartBear Software
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class DocApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def create_async_doc(self, doc, **kwargs):
"""
        Creates a document asynchronously. You must use a callback url, or the returned status id together with the status api, to find out when it completes. Then use the download api to get the document.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_async_doc(doc, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param Doc doc: The document to be created. (required)
:return: AsyncDoc
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['doc']
all_params.append('callback')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_async_doc" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'doc' is set
if ('doc' not in params) or (params['doc'] is None):
raise ValueError("Missing the required parameter `doc` when calling `create_async_doc`")
resource_path = '/async_docs'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'doc' in params:
body_params = params['doc']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'application/pdf', 'application/vnd.ms-excel', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = ['basicAuth']
response = self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AsyncDoc',
auth_settings=auth_settings,
callback=params.get('callback'))
return response
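    # Illustrative end-to-end async flow (sketch only; the method and attribute
    # names outside this file are assumptions inferred from the docstring
    # above, not verified against the generated models):
    #
    #   async_doc = api.create_async_doc(doc)
    #   status = api.get_async_doc_status(async_doc.status_id)
    #   # ...poll get_async_doc_status until status.status == 'completed'...
    #   document = api.get_async_doc(status.download_id)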
def create_doc(self, doc, **kwargs):
"""
Creates a document synchronously.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_doc(doc, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param Doc doc: The document to be created. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['doc']
all_params.append('callback')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_doc" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'doc' is set
if ('doc' not in params) or (params['doc'] is None):
raise ValueError("Missing the required parameter `doc` when calling `create_doc`")
resource_path = '/docs'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'doc' in params:
body_params = params['doc']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'application/pdf', 'application/vnd.ms-excel', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = ['basicAuth']
response = self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str',
auth_settings=auth_settings,
callback=params.get('callback'))
return response
def get_async_doc(self, id, **kwargs):
"""
Downloads a document.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_async_doc(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: The download_id returned from status request or a callback. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_async_doc" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_async_doc`")
resource_path = '/download/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'application/pdf', 'application/vnd.ms-excel', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = ['basicAuth']
response = self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str',
auth_settings=auth_settings,
callback=params.get('callback'))
return response
def get_async_doc_status(self, id, **kwargs):
"""
Check on the status of an asynchronously created document.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_async_doc_status(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: The status_id returned when creating an asynchronous document. (required)
:return: AsyncDocStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_async_doc_status" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_async_doc_status`")
resource_path = '/status/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'application/pdf', 'application/vnd.ms-excel', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = ['basicAuth']
response = self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AsyncDocStatus',
auth_settings=auth_settings,
callback=params.get('callback'))
return response
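# Example of the full asynchronous workflow (a minimal sketch; the attribute
# names `status_id`, `status` and `download_id` are inferred from the
# docstrings above, not verified against the generated model classes):
#
#   api = DocApi()
#   async_doc = api.create_async_doc(doc)                 # returns an AsyncDoc
#   status = api.get_async_doc_status(async_doc.status_id)
#   if status.status == 'completed':
#       document = api.get_async_doc(status.download_id)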
| 37.574648 | 197 | 0.549441 | 1,350 | 13,339 | 5.271111 | 0.156296 | 0.040472 | 0.027403 | 0.025857 | 0.806352 | 0.799607 | 0.799607 | 0.790613 | 0.790613 | 0.790613 | 0 | 0.001301 | 0.366219 | 13,339 | 354 | 198 | 37.680791 | 0.840431 | 0.288852 | 0 | 0.793296 | 0 | 0 | 0.170065 | 0.042348 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027933 | false | 0 | 0.03352 | 0 | 0.089385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
960e0c56380c97f2e4b73cccebb705cf6187c7fb | 163 | py | Python | myApp/views.py | anthonyc1/django-materialize-boilerplate | ba1ae43bf153647d7a26f665a13596f2b0217d0f | [
"MIT"
] | null | null | null | myApp/views.py | anthonyc1/django-materialize-boilerplate | ba1ae43bf153647d7a26f665a13596f2b0217d0f | [
"MIT"
] | null | null | null | myApp/views.py | anthonyc1/django-materialize-boilerplate | ba1ae43bf153647d7a26f665a13596f2b0217d0f | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.http import HttpResponse
from django.http import Http404
def index(request):
return render(request, 'index.html') | 27.166667 | 37 | 0.815951 | 23 | 163 | 5.782609 | 0.565217 | 0.225564 | 0.210526 | 0.300752 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02069 | 0.110429 | 163 | 6 | 37 | 27.166667 | 0.896552 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.6 | 0.2 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 7 |
825db1c822c361564e5fc6ab648dc6340aaa1e30 | 13 | py | Python | src/BBSpider/__init__.py | Zerozzx/BiliBiliSpider | 2154b0a0ed7871f0fe10b9f884b6d40c3330f7a0 | [
"MIT"
] | null | null | null | src/BBSpider/__init__.py | Zerozzx/BiliBiliSpider | 2154b0a0ed7871f0fe10b9f884b6d40c3330f7a0 | [
"MIT"
] | null | null | null | src/BBSpider/__init__.py | Zerozzx/BiliBiliSpider | 2154b0a0ed7871f0fe10b9f884b6d40c3330f7a0 | [
"MIT"
] | null | null | null | '''
T
T
'''
| 2.166667 | 3 | 0.153846 | 2 | 13 | 1 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.384615 | 13 | 5 | 4 | 2.6 | 0.25 | 0.230769 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7da01464266b532c586bd3ace117a4a4c65ff3b4 | 39,290 | py | Python | tests/test_legal.py | jbradberry/chong | 1468c9c8ab99e4a83fde98b27fcb88366fea787a | [
"MIT"
] | null | null | null | tests/test_legal.py | jbradberry/chong | 1468c9c8ab99e4a83fde98b27fcb88366fea787a | [
"MIT"
] | null | null | null | tests/test_legal.py | jbradberry/chong | 1468c9c8ab99e4a83fde98b27fcb88366fea787a | [
"MIT"
] | 1 | 2018-04-05T19:00:04.000Z | 2018-04-05T19:00:04.000Z | from __future__ import absolute_import
import unittest
from chong import chong
from six.moves import range
board = chong.Board()
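# State tuples below follow the pattern used throughout these tests
# (inferred from usage here, not from chong's documentation):
#   (p1_pawn, p2_pawn, p1_stones, p2_stones, player_to_move, turn)
# where board.positions[(row, col)] maps a square to its bitboard value and
# multiple stones are combined by summing position values.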
class IsLegalPlacementTestCase(unittest.TestCase):
def test_simple_placement(self):
p1 = board.positions[(0, 3)]
p2 = board.positions[(7, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 3, True)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (4, 4, True)))
def test_p1_home_row(self):
p1 = board.positions[(0, 3)]
p2 = board.positions[(1, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (0, 4, True)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (0, 4, True)))
def test_p2_home_row(self):
p1 = board.positions[(6, 3)]
p2 = board.positions[(7, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (7, 3, True)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (7, 3, True)))
def test_occupied_by_enemy_pawn(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, True)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, True)))
def test_occupied_by_friendly_pawn(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, True)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, True)))
def test_occupied_by_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, True)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, True)))
def test_occupied_by_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, True)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, True)))
def test_stones_exhausted(self):
p1 = board.positions[(0, 3)]
p2 = board.positions[(7, 4)]
# p1 to move
player = 1
stones = sum(board.positions[(1, x)] for x in range(6))
state = (p1, p2, stones, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, True)))
# p2 to move
player = 2
stones = sum(board.positions[(6, x)] for x in range(7))
state = (p1, p2, 0, stones, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, True)))
class IsLegalMoveTestCase(unittest.TestCase):
def test_north_simple(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (2, 3, False)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 4, False)))
def test_north_enemy_pawn_block(self):
# p1 to move
player = 1
p1 = board.positions[(4, 3)]
p2 = board.positions[(3, 3)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(4, 4)]
p2 = board.positions[(5, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 4, False)))
def test_north_enemy_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(2, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 3, False)))
# p2 to move
player = 2
stone = board.positions[(3, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 4, False)))
def test_north_friendly_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(2, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 3, False)))
# p2 to move
player = 2
stone = board.positions[(3, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 4, False)))
def test_south_simple(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (4, 3, False)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (5, 4, False)))
def test_south_enemy_pawn_block(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 3)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(5, 4)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 4, False)))
def test_south_enemy_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(4, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
# p2 to move
player = 2
stone = board.positions[(5, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 4, False)))
def test_south_friendly_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(4, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
# p2 to move
player = 2
stone = board.positions[(5, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 4, False)))
def test_east_simple(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 2, False)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (4, 3, False)))
def test_east_enemy_pawn_block(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(3, 2)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 2, False)))
# p2 to move
player = 2
p1 = board.positions[(4, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
def test_east_enemy_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(3, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 2, False)))
# p2 to move
player = 2
stone = board.positions[(4, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
def test_east_friendly_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(3, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 2, False)))
# p2 to move
player = 2
stone = board.positions[(4, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 3, False)))
def test_west_simple(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 4, False)))
# p2 to move
player = 2
state = (p1, p2, 0, 0, player, 1)
self.assertTrue(board.is_legal(state, (4, 5, False)))
def test_west_enemy_pawn_block(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(3, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 4, False)))
# p2 to move
player = 2
p1 = board.positions[(4, 5)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 5, False)))
def test_west_enemy_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(3, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 4, False)))
# p2 to move
player = 2
stone = board.positions[(4, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 5, False)))
def test_west_friendly_stone_block(self):
p1 = board.positions[(3, 3)]
p2 = board.positions[(4, 4)]
# p1 to move
player = 1
stone = board.positions[(3, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 4, False)))
# p2 to move
player = 2
stone = board.positions[(4, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 5, False)))
class IsLegalJumpTestCase(unittest.TestCase):
def test_north_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (2, 4, False)))
def test_north_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 4, False)))
def test_north_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 4, False)))
def test_north_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(1, 3)]
stone = board.positions[(2, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(2, 4)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 4, False)))
def test_north_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(2, 3)]
p2_stone = board.positions[(1, 3)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(2, 4)]
p2_stone = board.positions[(3, 4)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 4, False)))
def test_north_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 3)] + board.positions[(1, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 4)] + board.positions[(2, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 4, False)))
def test_nw_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (2, 6, False)))
def test_nw_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 6, False)))
def test_nw_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 6, False)))
def test_nw_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(1, 5)]
stone = board.positions[(2, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(2, 6)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 6, False)))
def test_nw_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(2, 4)]
p2_stone = board.positions[(1, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(2, 6)]
p2_stone = board.positions[(3, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 6, False)))
def test_nw_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 4)] + board.positions[(1, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 5)] + board.positions[(2, 6)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 6, False)))
def test_west_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (4, 6, False)))
def test_west_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 6, False)))
def test_west_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 6, False)))
def test_west_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(3, 5)]
stone = board.positions[(3, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(4, 6)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 6, False)))
def test_west_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(3, 4)]
p2_stone = board.positions[(3, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(4, 6)]
p2_stone = board.positions[(4, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 6, False)))
def test_west_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 4)] + board.positions[(3, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 5)] + board.positions[(4, 6)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 6, False)))
def test_sw_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (6, 6, False)))
def test_sw_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 6, False)))
def test_sw_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 6, False)))
def test_sw_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(5, 5)]
stone = board.positions[(4, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(6, 6)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 5)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 6, False)))
def test_sw_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(4, 4)]
p2_stone = board.positions[(5, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(6, 6)]
p2_stone = board.positions[(5, 5)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 6, False)))
def test_sw_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 4)] + board.positions[(5, 5)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 5, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 5)] + board.positions[(6, 6)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 6, False)))
def test_south_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (6, 4, False)))
def test_south_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 4, False)))
def test_south_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 4)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 4, False)))
def test_south_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(5, 3)]
stone = board.positions[(4, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(6, 4)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 4, False)))
def test_south_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(4, 3)]
p2_stone = board.positions[(5, 3)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(6, 4)]
p2_stone = board.positions[(5, 4)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 4, False)))
def test_south_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 3)] + board.positions[(5, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 3, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 4)] + board.positions[(6, 4)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 4, False)))
def test_se_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (6, 2, False)))
def test_se_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 2, False)))
def test_se_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (6, 2, False)))
def test_se_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(5, 1)]
stone = board.positions[(4, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(6, 2)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 2, False)))
def test_se_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(4, 2)]
p2_stone = board.positions[(5, 1)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(6, 2)]
p2_stone = board.positions[(5, 3)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 2, False)))
def test_se_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(4, 2)] + board.positions[(5, 1)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (5, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(5, 3)] + board.positions[(6, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (6, 2, False)))
def test_east_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (4, 2, False)))
def test_east_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 2, False)))
def test_east_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (4, 2, False)))
def test_east_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(3, 1)]
stone = board.positions[(3, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(4, 2)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 2, False)))
def test_east_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(3, 2)]
p2_stone = board.positions[(3, 1)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(4, 2)]
p2_stone = board.positions[(4, 3)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 2, False)))
def test_east_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(3, 2)] + board.positions[(3, 1)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (3, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(4, 3)] + board.positions[(4, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (4, 2, False)))
def test_ne_simple(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertTrue(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertTrue(board.is_legal(state, (2, 2, False)))
def test_ne_no_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
state = (p1, p2, 0, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 2, False)))
def test_ne_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (2, 2, False)))
def test_ne_blocking_pawn(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(1, 1)]
stone = board.positions[(2, 2)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(2, 2)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 2, False)))
def test_ne_blocking_enemy_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
p1_stone = board.positions[(2, 2)]
p2_stone = board.positions[(1, 1)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
p1_stone = board.positions[(2, 2)]
p2_stone = board.positions[(3, 3)]
state = (p1, p2, p1_stone, p2_stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 2, False)))
def test_ne_blocking_friendly_stone(self):
# p1 to move
player = 1
p1 = board.positions[(3, 3)]
p2 = board.positions[(7, 4)]
stone = board.positions[(2, 2)] + board.positions[(1, 1)]
state = (p1, p2, stone, 0, player, 1)
self.assertFalse(board.is_legal(state, (1, 1, False)))
# p2 to move
player = 2
p1 = board.positions[(0, 3)]
p2 = board.positions[(4, 4)]
stone = board.positions[(3, 3)] + board.positions[(2, 2)]
state = (p1, p2, 0, stone, player, 1)
self.assertFalse(board.is_legal(state, (2, 2, False)))
| 32.878661 | 65 | 0.530924 | 5,495 | 39,290 | 3.714832 | 0.012011 | 0.264733 | 0.084652 | 0.119924 | 0.979866 | 0.977514 | 0.973546 | 0.966149 | 0.959634 | 0.9564 | 0 | 0.084542 | 0.317205 | 39,290 | 1,194 | 66 | 32.906198 | 0.676371 | 0.04029 | 0 | 0.816327 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 1 | 0.081633 | false | 0 | 0.004535 | 0 | 0.089569 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
81469d0f8b3844fb589d83994c38fd5da94d1e11 | 46,774 | py | Python | old/control/DPO.py | ali493/pyro | 1245340077a733e2ab35765eae783b358d2f3af9 | [
"MIT"
] | null | null | null | old/control/DPO.py | ali493/pyro | 1245340077a733e2ab35765eae783b358d2f3af9 | [
"MIT"
] | null | null | null | old/control/DPO.py | ali493/pyro | 1245340077a733e2ab35765eae783b358d2f3af9 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Aug 10 13:45:44 2015
@author: agirard
"""
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import RectBivariateSpline as interpol2D
import matplotlib.animation as animation
from AlexRobotics.dynamic import DynamicSystem as RDDS
'''
################################################################################
'''
class ValueIteration1DOF:
""" Dynamic programming for 1 DOF continous dynamic system, one continuous input u """
############################
def __init__(self, sys , cost = 'time' ):
self.DS = sys # Dynamic system class
# Parameters
self.dt = 0.05 # time discretization
self.Nx0 = 101 # x discretization
self.Nx1 = 101 # dx discretization
self.Nu0 = 11 # u0 discretization
self.INF = 10 # default large cost
self.max_error = [0.2,0.2] # Value of epsilon
# Quadratic cost
self.rho = 1
self.w_quad = np.array([ 0.02 , 0.01 , self.rho * 0.01 ])
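# Used by g_quadratic below:
#   g(x,u) = ( w[0]*x0^2 + w[1]*x1^2 + w[2]*u^2 ) * dt
# with rho scaling the relative weight of the control-effort term.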
# Predefined cost params
if cost == 'time':
#print('Minimum time optimization')
self.g = self.g_time
self.h = self.h_target
self.Nu0 = 3
self.INF = 6
elif cost == 'quadratic':
#print('Quadratic cost optimization')
self.g = self.g_quadratic
self.h = self.h_zero # no final cost
self.Nu0 = 21
self.INF = 10
elif cost == 'energy':
#print('Minimum energy optimization')
self.g = self.g_energy
self.h = self.h_target
self.INF = 6
else :
print('Warning: not a standard cost function')
#############################
def discretizespace(self):
""" Grid the state space """
self.X = [ None , None ]
self.X[0] = np.linspace( self.DS.x_lb[0] , self.DS.x_ub[0] , self.Nx0 )
self.X[1] = np.linspace( self.DS.x_lb[1] , self.DS.x_ub[1] , self.Nx1 )
#############################
def discretizeactions(self):
self.U = np.linspace( self.DS.u_lb[0] , self.DS.u_ub[0] , self.Nu0 )
#############################
def h(self, x ):
""" Final cost function """
return 0
#############################
def h_zero(self, x ):
""" Final cost function with zero value """
return 0
#############################
def h_target(self, x ):
""" Final cost function """
# Minimum time problem h = 0 , g = 1
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ):
# On target = OK
cost = 0
else:
# Off target = bad
cost = self.INF
return cost
#############################
def g(self, x , u ):
""" step cost function """
return 1
#############################
def g_time(self, x , u ):
""" Minimum time cost """
# On target not doing anything (don't count time at this point)
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ) and ( abs(u[0]) <= 0.1 ):
cost = 0
# Add time for the move
else:
cost = self.dt # minimum time
return cost
#############################
def g_quadratic(self, x , u ):
""" Quadratic additive cost """
# On target not doing anything (don't count time at this point)
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ) and ( abs(u[0]) <= 0.1 ):
cost = 0
# Add time for the move
else:
cost = ( self.w_quad[0] * x[0] ** 2 + self.w_quad[1] * x[1] ** 2 + self.w_quad[2] * u[0] ** 2 ) * self.dt
return cost
#############################
def g_energy(self, x , u ):
""" Electric energy lost """
cost = ( self.w_quad[2] * u[0] ** 2 ) * self.dt # Energy
return cost
##############################
def first_step(self):
""" initial evaluation of cost-to-go """
self.discretizespace()
self.discretizeactions()
self.gridsize = ( self.Nx0 , self.Nx1 )
self.J = np.zeros( self.gridsize )
self.action_policy = np.zeros( self.gridsize , dtype = int )
self.u0_policy = np.zeros( self.gridsize )
self.Jnew = np.zeros( self.gridsize )
self.Jplot = np.zeros( self.gridsize )
# Approximation
self.u0_policy_a = np.zeros( self.gridsize )
# Evaluation lookup tables
self.X_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 ) )
self.U_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 ) )
self.X_next = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 , 2 ) ) # lookup table for dynamic
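# Precomputing the Euler step x + fc(x,u)*dt and the state/input validity
# flags for every (state node, action) pair keeps the dynamics out of the
# inner value-iteration loop; only the interpolation of J changes per sweep.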
# Initial evaluation
# For all state nodes
for i in range(self.Nx0):
for j in range(self.Nx1):
x = np.array([ self.X[0][i] , self.X[1][j] ])
# Compute cost of initial states
self.J[i,j] = self.h( x )
# For all control actions
for k in range( self.Nu0 ):
u = np.array([ self.U[k] ])
# Compute next state for all inputs
x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
x_ok = self.DS.isavalidstate(x_next)
u_ok = self.DS.isavalidinput(x,u)
self.X_next[i,j,k,:] = x_next
self.U_ok[i,j,k] = u_ok
self.X_ok[i,j,k] = x_ok
###############################
def compute_step(self):
""" One step of value iteration """
# Get interpolation of current cost space
J_interpol = interpol2D( self.X[0] , self.X[1] , self.J , bbox=[None, None, None, None], kx=1, ky=1,)
# For all states
for i in range(self.Nx0):
for j in range(self.Nx1):
# Actual state vector
x = np.array([ self.X[0][i] , self.X[1][j] ])
# One steps costs - Q values
Q = np.zeros( self.Nu0 )
# For all control actions
for k in range( self.Nu0 ):
# Current u vector to test
u = np.array([ self.U[k] ])
# Compute possible future states
x_next = self.X_next[i,j,k] #x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
#x_ok = self.DS.isavalidstate(x_next)
#u_ok = self.DS.isavalidinput(x,u)
x_ok = self.X_ok[i,j,k]
u_ok = self.U_ok[i,j,k]
# If the current option is allowable
if x_ok and u_ok:
J_next = J_interpol( x_next[0] , x_next[1] )
# Cost-to-go of a given action
Q[k] = self.g( x , u ) + J_next[0,0]
else:
# Not allowable states or inputs/states combinations
Q[k] = self.INF
self.Jnew[i,j] = Q.min()
self.action_policy[i,j] = Q.argmin()
self.u0_policy[i,j] = self.U[ self.action_policy[i,j] ]
# Impossible situation ( unacceptable situation for any control action )
if self.Jnew[i,j] > (self.INF-1) :
self.action_policy[i,j] = -1
self.u0_policy[i,j] = 0
# Convergence check
delta = self.J - self.Jnew
j_max = self.Jnew.max()
delta_max = delta.max()
delta_min = delta.min()
print('Max:',j_max,'Delta max:',delta_max, 'Delta min:',delta_min)
self.J = self.Jnew.copy()
################################
def compute_steps(self, l = 50, plot = False):
""" compute number of step """
for i in range(l):
print('Step:',i)
self.compute_step()
if plot:
self.plot_J_update()
################################
def plot_J(self):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
self.Jplot = self.J
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
################################
def plot_J_update(self):
""" print graphic """
self.im1.set_array( self.J )
plt.show()
################################
def plot_raw(self):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
self.Jplot = self.J
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig2 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig2.canvas.set_window_title('Optimal action index')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im2 = plt.pcolormesh( self.X[0] , self.X[1] , self.action_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig3 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig3.canvas.set_window_title('Optimal Policy for u[0]')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im3 = plt.pcolormesh( self.X[0] , self.X[1] , self.u0_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
################################
def plot_J_nice(self, maxJ = 10):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
## Saturation function for cost
for i in range(self.Nx0):
for j in range(self.Nx1):
if self.J[i,j] >= maxJ :
self.Jplot[i,j] = maxJ
else:
self.Jplot[i,j] = self.J[i,j]
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
################################
def assign_interpol_controller(self):
""" controller from optimal actions """
self.b_u0 = interpol2D( self.X[0] , self.X[1] , self.u0_policy , bbox=[None, None, None, None], kx=1, ky=1,)
self.DS.ctl = self.feedback_law_interpol
################################
def feedback_law_interpol(self, x , t = 0 ):
""" controller from optimal actions """
u = np.zeros( self.DS.m )
u[0] = self.b_u0( x[0] , x[1] )[0,0] # the spline call returns a 2D array; take the scalar
return u
################################
def load_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
try:
# Dynamic programming data
self.X = np.load( name + '_X' + '.npy' )
self.J = np.load( name + '_J' + '.npy' )
self.action_policy = np.load( name + '_a' + '.npy' ).astype(int)
self.u0_policy = np.load( name + '_u0' + '.npy' )
except:
print('Failed to load DP data ' )
################################
def save_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
# Dynamic programming data
np.save( name + '_X' , self.X )
np.save( name + '_J' , self.J )
np.save( name + '_a' , self.action_policy.astype(int))
np.save( name + '_u0' , self.u0_policy )
################################
def compute_traj_cost(self):
""" Compute cost for trajectories """
X = self.DS.Sim.x_sol_CL
U = self.DS.Sim.u_sol_CL
J_time = 0
J_quad = 0
J_ener = 0
for i in range( X.shape[0] ):
J_time = J_time + self.g_time( X[i,:] , U[i,:] ) / self.dt * self.DS.Sim.tf / self.DS.Sim.n
J_quad = J_quad + self.g_quadratic( X[i,:] , U[i,:] ) / self.dt * self.DS.Sim.tf / self.DS.Sim.n
J_ener = J_ener + self.g_energy( X[i,:] , U[i,:] ) / self.dt * self.DS.Sim.tf / self.DS.Sim.n
self.J_time = J_time
self.J_quad = J_quad
self.J_ener = J_ener
print('Energy : ' + str(J_ener))
print('Time : ' + str(J_time))
print('Quadratic: ' + str(J_quad))
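# Example usage (a minimal sketch; `sys` must be a DynamicSystem instance
# from AlexRobotics.dynamic with x_lb/x_ub, u_lb/u_ub, fc(), isavalidstate()
# and isavalidinput() defined, as this class assumes):
#
#   vi = ValueIteration1DOF( sys , 'quadratic' )
#   vi.first_step()
#   vi.compute_steps( 100 )
#   vi.assign_interpol_controller()   # installs vi.feedback_law_interpol as sys.ctl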
'''
################################################################################
'''
class QLearning1DOF:
""" Dynamic programming for 1 DOF """
############################
def __init__(self, sys , cost = 'time' , experiment_name = 'data' ):
self.DS = sys # Dynamic system class
# Parameters
# Learning params
self.alpha = 0.8 # learning rate for the Q-value update
self.gamma = 0.7 # probability of taking the greedy action while training
self.exp_n = 0
self.x0 = np.array([0,0])
#########################
self.dt = 0.1 # time discretization
self.Nx0 = 51 # x discretization
self.Nx1 = 51 # dx discretization
self.Nu0 = 11 # u0 discretization
self.INF = 10 # default large cost
self.max_error = [0.2,0.2] # Value of epsilon
self.cost = cost
self.experiment_name = experiment_name
# Quadratic cost
self.rho = 0.1
self.w_quad = np.array([ 0.01 , 0.01 , self.rho * 0.01 ])
print('Qlearning Algo:')
# Predefined cost params
if cost == 'time':
print('Minimum time optimization')
self.g = self.g_time
self.h = self.h_quad
self.Nu0 = 3
self.INF = 6
elif cost == 'quadratic':
print('Quadratic cost optimization')
self.g = self.g_quadratic
self.h = self.h_quad # no final cost
self.Nu0 = 3
self.INF = 10
elif cost == 'energy':
print('Minimum energy optimization')
self.g = self.g_energy
self.h = self.h_target
self.INF = 6
else :
print('Warning: not a standard cost function')
#############################
def discretizespace(self):
""" Grid the state space """
self.X = [ None , None ]
self.X[0] = np.linspace( self.DS.x_lb[0] , self.DS.x_ub[0] , self.Nx0 )
self.X[1] = np.linspace( self.DS.x_lb[1] , self.DS.x_ub[1] , self.Nx1 )
#############################
def discretizeactions(self):
self.U = np.linspace( self.DS.u_lb[0] , self.DS.u_ub[0] , self.Nu0 )
#############################
def h(self, x ):
""" Final cost function """
return 0
#############################
def h_zero(self, x ):
""" Final cost function with zero value """
return 0
#############################
def h_target(self, x ):
""" Final cost function """
# Minimum time problem h = 0 , g = 1
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ):
# On target = OK
cost = 0
else:
# Off target = bad
cost = self.INF
return cost
#############################
def h_quad(self, x ):
""" Final cost function """
# Minimum time problem h = 0 , g = 1
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ):
# On target = OK
cost = 0
else:
# Off target = bad
cost = ( self.w_quad[0] * x[0] ** 2 + self.w_quad[1] * x[1] ** 2 ) * 10
return cost
#############################
def g(self, x , u ):
""" step cost function """
return 1
#############################
def g_time(self, x , u ):
""" Minimum time cost """
# On target not doing anything (don't count time at this point)
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ) and ( abs(u[0]) <= 0.1 ):
cost = 0
# Add time for the move
else:
cost = self.dt # minimum time
return cost
#############################
def g_quadratic(self, x , u ):
""" Quadratic additive cost """
# On target not doing anything (don't count time at this point)
if ( abs(x[1]) <= self.max_error[1] ) and ( abs(x[0]) <= self.max_error[0] ) and ( abs(u[0]) <= 0.1 ):
cost = 0
# Add time for the move
else:
cost = ( self.w_quad[0] * x[0] ** 2 + self.w_quad[1] * x[1] ** 2 + self.w_quad[2] * u[0] ** 2 ) * self.dt
return cost
#############################
def g_energy(self, x , u ):
""" Electric energy lost """
cost = ( self.w_quad[2] * u[0] ** 2 ) * self.dt # Energy
return cost
##############################
def first_step(self):
""" initial evaluation of cost-to-go """
self.discretizespace()
self.discretizeactions()
self.gridsize = ( self.Nx0 , self.Nx1 )
self.J = np.zeros( self.gridsize )
self.action_policy = np.zeros( self.gridsize , dtype = int ) # integer action indices, as in ValueIteration1DOF, so self.U can be indexed directly
self.u0_policy = np.zeros( self.gridsize )
self.Jnew = np.zeros( self.gridsize )
self.Jplot = np.zeros( self.gridsize )
# Approximation
self.u0_policy_a = np.zeros( self.gridsize )
# Evaluation lookup tables
self.X_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 ) )
self.U_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 ) )
self.X_next = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 , 2 ) ) # lookup table for dynamic
# Q-values
self.Q = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 ) )
# Initial evaluation
for i in range(self.Nx0):
for j in range(self.Nx1):
x = np.array([ self.X[0][i] , self.X[1][j] ])
# Compute cost of initial states
self.J[i,j] = self.h( x )
for k in range( self.Nu0 ):
self.Q[i,j,k] = self.J[i,j] # Initial Q-value is only local cost
u = self.U[k]
if self.DS.m == 1:
u = np.array( [ u ] )
# Compute next state for all inputs
x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
x_ok = self.DS.isavalidstate( x_next )
u_ok = self.DS.isavalidinput( x , u )
self.X_next[i,j,k,:] = x_next
self.U_ok[i,j,k] = u_ok
self.X_ok[i,j,k] = x_ok
self.assign_interpol_controller()
##############################
def Qlearn(self,i,j,k):
""" """
J_interpol = interpol2D( self.X[0] , self.X[1] , self.J , bbox=[None, None, None, None], kx=1, ky=1,)
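        # kx=ky=1 selects a piecewise-linear (bilinear) fit of the tabulated cost-to-go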
x = np.array([ self.X[0][i] , self.X[1][j] ])
u = self.U[k]
x_next = self.X_next[i,j,k]
x_ok = self.X_ok[i,j,k]
u_ok = self.U_ok[i,j,k]
if self.DS.m ==1:
u = np.array( [u] )
# New Q sample
if x_ok and u_ok:
J_next = J_interpol( x_next[0] , x_next[1] )
Q_sample = self.g( x , u ) + J_next[0,0]
else:
Q_sample = self.INF
# Q update
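        # Temporal-difference rule: Q <- Q + alpha * (Q_sample - Q), alpha = learning rate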
error = Q_sample - self.Q[i,j,k]
self.Q[i,j,k] = self.Q[i,j,k] + self.alpha * error
# J and Policy update
Q_list = self.Q[i,j,:]
self.J[i,j] = Q_list.min()
self.action_policy[i,j] = Q_list.argmin()
        self.u0_policy[i,j] = self.U[ int(self.action_policy[i,j]) ]
# Impossible situation
if self.J[i,j] > (self.INF-1) :
self.action_policy[i,j] = -1
self.u0_policy[i,j] = 0
##############################
def Qlearn2( self , x = np.array([0,0]) , k = 0 ):
""" """
i = (np.abs(self.X[0]-x[0])).argmin()
j = (np.abs(self.X[1]-x[1])).argmin()
self.Qlearn(i,j,k)
##############################
def Qlearn3( self , x = np.array([0,0]) , u = np.array([0]) ):
""" Find closest index before calling Qlearn """
i = (np.abs(self.X[0]-x[0])).argmin()
j = (np.abs(self.X[1]-x[1])).argmin()
k = np.abs( self.U - u[0] ).argmin()
self.Qlearn(i,j,k)
##############################
def exploration_ctl( self , x = np.array([0,0]) , t = 0 ):
""" Random or Optimal CTL """
u = np.zeros( self.DS.m )
if np.random.uniform(0,1) < self.gamma:
# Current optimal behavior
u[0] = self.feedback_law_interpol( x , t )
else:
# Random exploration
random_index = int(np.random.uniform( 0 , self.Nu0 ))
u[0] = self.U[ random_index ]
return u
##############################
def training( self , n_trial = 1 , random = False , show = True ):
""" Training experiments """
x0 = self.x0
tf = 10
dt = 0.05
plot = False
n_plot = 1000.
n_print = 100.
for i in range( n_trial ):
self.exp_n = self.exp_n + 1
if random:
p = np.random.uniform( x0[0] - 1 , x0[0] + 1 )
s = np.random.uniform( x0[1] - 0.5 , x0[1] + 0.5 )
x = np.array([p,s])
else:
x = x0
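            # (i/n - int(i/n)) ~ 0  <=>  i is a multiple of n: status print every
            # n_print trials, plot/save every n_plot trials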
if (i/n_print-int(i/n_print)) < 0.00001 :
print('Experiment #',self.exp_n)
if (i/n_plot-int(i/n_plot)) < 0.00001 and show :
# Show behavior so far
plot = True
self.DS.ctl = self.feedback_law_interpol
self.save_data( self.experiment_name )
else:
plot = False
self.DS.ctl = self.exploration_ctl
self.experiment( x , tf , dt , plot )
# Update optimal laws
self.assign_interpol_controller()
#############################
def experiment(self, x0 = np.array([0,0]) , tf = 10 , dt = 0.05 , plot = False ):
""" Simulate (EULER) and animate robot """
n = int( ( tf + 0.0 ) / dt + 1 )
self.DS.Sim = RDDS.Simulation( self.DS , tf , n , 'euler' )
self.DS.Sim.x0 = x0
self.DS.Sim.compute()
if plot:
self.DS.PTS = np.zeros((2,2,n))
for i in range(n):
self.DS.PTS[:,:,i] = self.DS.fwd_kinematic( self.DS.Sim.x_sol_CL[i,0] ) # Forward kinematic
self.fig = plt.figure()
self.ax = self.fig.add_subplot(111, autoscale_on=False, xlim=(-2, 2), ylim=(-2, 2))
self.ax.grid()
self.DS.line, = self.ax.plot([], [], 'o-', lw=2)
self.DS.time_template = 'time = %.1fs'
self.DS.time_text = self.ax.text(0.05, 0.9, '', transform=self.ax.transAxes)
self.DS.ani = animation.FuncAnimation( self.fig, self.DS.__animateStop__, n, interval=25, blit=True, init_func=self.DS.__ani_init__ , repeat=False)
plt.show()
#Learning
for i in range(n):
self.Qlearn3( self.DS.Sim.x_sol_CL[i,:] , self.DS.Sim.u_sol_CL[i,:] )
###############################
def compute_step(self):
""" One step of value iteration """
# Get interpolation of current cost space
J_interpol = interpol2D( self.X[0] , self.X[1] , self.J , bbox=[None, None, None, None], kx=1, ky=1,)
for i in range(self.Nx0):
for j in range(self.Nx1):
# Actual state vector
x = np.array([ self.X[0][i] , self.X[1][j] ])
#print x
                # One-step costs for every admissible action
                C = np.zeros( self.Nu0 )
                for k in range( self.Nu0 ):
# Current u vector to test
u = self.U[k]
                    # Compute possible future states
x_next = self.X_next[i,j,k] #x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
#x_ok = self.DS.isavalidstate(x_next)
#u_ok = self.DS.isavalidinput(x,u)
x_ok = self.X_ok[i,j,k]
u_ok = self.U_ok[i,j,k]
# If the current option is allowable
if x_ok and u_ok:
J_next = J_interpol( x_next[0] , x_next[1] )
# Cost-to-go of a given action
C[k] = self.g( x , u ) + J_next[0,0]
else:
                        # Disallowed state or input/state combination
C[k] = self.INF
#print x,u,x_next,C[k]
self.Jnew[i,j] = C.min()
self.action_policy[i,j] = C.argmin()
                self.u0_policy[i,j] = self.U[ int(self.action_policy[i,j]) ]
# Impossible situation
if self.Jnew[i,j] > (self.INF-1) :
self.action_policy[i,j] = -1
self.u0_policy[i,j] = 0
delta = self.J - self.Jnew
j_max = self.Jnew.max()
delta_max = delta.max()
delta_min = delta.min()
print('Max:',j_max,'Delta max:',delta_max, 'Delta min:',delta_min)
self.J = self.Jnew.copy()
################################
def compute_steps(self, l = 50, plot = False):
""" compute number of step """
#self.first_step()
#self.plot_J()
for i in range(l):
print('Step:',i)
self.compute_step()
if plot:
self.plot_J_update()
################################
def plot_J(self):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
self.Jplot = self.J
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
################################
def plot_raw_nice(self, maxJ = 10):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
## Saturation function for cost
for i in range(self.Nx0):
for j in range(self.Nx1):
if self.J[i,j] >= maxJ :
self.Jplot[i,j] = maxJ
else:
self.Jplot[i,j] = self.J[i,j]
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig2 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig2.canvas.set_window_title('Optimal action index')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im2 = plt.pcolormesh( self.X[0] , self.X[1] , self.action_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig3 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig3.canvas.set_window_title('Optimal Policy for u[0]')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im3 = plt.pcolormesh( self.X[0] , self.X[1] , self.u0_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
################################
def assign_interpol_controller(self):
""" controller from optimal actions """
self.b_u0 = interpol2D( self.X[0] , self.X[1] , self.u0_policy , bbox=[None, None, None, None], kx=1, ky=1,)
self.DS.ctl = self.feedback_law_interpol
################################
def feedback_law_interpol(self, x , t = 0 ):
""" controller from optimal actions """
u = np.zeros( self.DS.m )
u[0] = self.b_u0( x[0] , x[1] )
return u
################################
def load_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
folder = 'data/'
# Dyan prog data
self.X = np.load( folder + name + '_X' + '.npy' )
self.J = np.load( folder + name + '_J' + '.npy' )
self.action_policy = np.load( folder + name + '_a' + '.npy' )
self.u0_policy = np.load( folder + name + '_u0' + '.npy' )
self.Q = np.load( folder + name + '_Q' + '.npy' )
self.assign_interpol_controller()
################################
def save_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
folder = 'data/'
# Dyan prog data
np.save( folder + name + '_X' , self.X )
np.save( folder + name + '_J' , self.J )
np.save( folder + name + '_a' , self.action_policy )
np.save( folder + name + '_u0' , self.u0_policy )
np.save( folder + name + '_Q' , self.Q )
'''
################################################################################
'''
class ValueIteration_hybrid_1DOF( ValueIteration1DOF ) :
############################
def __init__( self , sys , cost = 'time' ) :
ValueIteration1DOF.__init__( self, sys , cost )
#############################
def discretizeactions(self):
self.U = np.zeros([self.Nu0 * 2 , 2])
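        # Hybrid action set: Nu0 continuous command levels crossed with 2 discrete
        # gear choices; column 0 holds u[0], column 1 the gear ratio (1 or 10)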
# Continuous options
Uc = np.linspace( self.DS.u_lb[0] , self.DS.u_ub[0] , self.Nu0 )
self.U[0:self.Nu0,0] = Uc
self.U[self.Nu0:,0] = Uc
# Discrete options
self.U[0:self.Nu0,1] = 1 # Gear #1
self.U[self.Nu0:,1] = 10 # Gear #2
##############################
def first_step(self):
""" initial evaluation of cost-to-go """
self.discretizespace()
self.discretizeactions()
self.gridsize = ( self.Nx0 , self.Nx1 )
self.J = np.zeros( self.gridsize )
self.action_policy = np.zeros( self.gridsize )
self.u0_policy = np.zeros( self.gridsize )
self.u1_policy = np.zeros( self.gridsize )
self.Jnew = np.zeros( self.gridsize )
self.Jplot = np.zeros( self.gridsize )
# Approximation
self.u0_policy_a = np.zeros( self.gridsize )
self.u1_policy_a = np.zeros( self.gridsize )
# Evaluation lookup tables
self.X_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 * 2 ) )
self.U_ok = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 * 2 ) )
        self.X_next = np.zeros( ( self.Nx0 , self.Nx1 , self.Nu0 * 2 , 2 ) ) # lookup table for the dynamics
# Initial evaluation
for i in range(self.Nx0):
for j in range(self.Nx1):
x = np.array([ self.X[0][i] , self.X[1][j] ])
# Compute cost of initial states
self.J[i,j] = self.h( x )
for k in range( self.Nu0 * 2 ):
u = self.U[k]
# Compute next state for all inputs
x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
x_ok = self.DS.isavalidstate(x_next)
u_ok = self.DS.isavalidinput(x,u)
self.X_next[i,j,k,:] = x_next
self.U_ok[i,j,k] = u_ok
self.X_ok[i,j,k] = x_ok
###############################
def compute_step(self):
""" One step of value iteration """
# Get interpolation of current cost space
J_interpol = interpol2D( self.X[0] , self.X[1] , self.J , bbox=[None, None, None, None], kx=1, ky=1,)
for i in range(self.Nx0):
for j in range(self.Nx1):
# Actual state vector
x = np.array([ self.X[0][i] , self.X[1][j] ])
#print x
                # One-step costs
Q = np.zeros( self.Nu0 * 2 )
for k in range( self.Nu0 * 2 ):
# Current u vector to test
u = self.U[k]
                    # Compute possible future states
x_next = self.X_next[i,j,k] #x_next = self.DS.fc( x , u ) * self.dt + x
# validity of the options
#x_ok = self.DS.isavalidstate(x_next)
#u_ok = self.DS.isavalidinput(x,u)
x_ok = self.X_ok[i,j,k]
u_ok = self.U_ok[i,j,k]
# If the current option is allowable
if x_ok and u_ok:
J_next = J_interpol( x_next[0] , x_next[1] )
# Cost-to-go of a given action
#print x,u,self.g( x , u ) , J_next[0,0]
Q[k] = self.g( x , u ) + J_next[0,0]
else:
                        # Disallowed state or input/state combination
Q[k] = self.INF
#print x,u,x_next,C[k]
self.Jnew[i,j] = Q.min()
self.action_policy[i,j] = Q.argmin()
                self.u0_policy[i,j] = self.U[ int(self.action_policy[i,j]) ][0]
                self.u1_policy[i,j] = self.U[ int(self.action_policy[i,j]) ][1]
# Impossible situation
if self.Jnew[i,j] > (self.INF-1) :
self.action_policy[i,j] = -1
self.u0_policy[i,j] = 0
self.u1_policy[i,j] = 1
delta = self.J - self.Jnew
        j_max = self.Jnew.max()
delta_max = delta.max()
delta_min = delta.min()
print('Max:',j_max,'Delta max:',delta_max, 'Delta min:',delta_min)
self.J = self.Jnew.copy()
################################
def assign_interpol_controller(self):
""" controller from optimal actions """
self.b_u0 = interpol2D( self.X[0] , self.X[1] , self.u0_policy , bbox=[None, None, None, None], kx=1, ky=1,)
self.b_u1 = interpol2D( self.X[0] , self.X[1] , self.u1_policy , bbox=[None, None, None, None], kx=1, ky=1,)
self.DS.ctl = self.feedback_law_interpol
################################
def feedback_law_interpol(self, x , t = 0 ):
""" controller from optimal actions """
u = np.zeros( self.DS.m )
u[0] = self.b_u0( x[0] , x[1] )
u[1] = np.round( self.b_u1( x[0] , x[1] ) )
# Quick fix
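        # snap the interpolated gear command back to one of the two admissible ratios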
if u[1] > 5 :
u[1] = 10
elif u[1] <= 5:
u[1] = 1
return u
################################
def load_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
# Dyan prog data
self.X = np.load( name + '_X' + '.npy' )
self.J = np.load( name + '_J' + '.npy' )
self.action_policy = np.load( name + '_a' + '.npy' )
self.u0_policy = np.load( name + '_u0' + '.npy' )
self.u1_policy = np.load( name + '_u1' + '.npy' )
self.assign_interpol_controller()
################################
def save_data(self, name = 'DP_data'):
""" Save optimal controller policy and cost to go """
# Dyan prog data
np.save( name + '_X' , self.X )
np.save( name + '_J' , self.J )
np.save( name + '_a' , self.action_policy )
np.save( name + '_u0' , self.u0_policy )
np.save( name + '_u1' , self.u1_policy )
################################
def plot_raw_nice(self, maxJ = 10):
""" print graphic """
xname = self.DS.state_label[0] + ' ' + self.DS.state_units[0]
yname = self.DS.state_label[1] + ' ' + self.DS.state_units[1]
## Saturation function for cost
for i in range(self.Nx0):
for j in range(self.Nx1):
if self.J[i,j] >= maxJ :
self.Jplot[i,j] = maxJ
else:
self.Jplot[i,j] = self.J[i,j]
###################
fs = 10
self.fig1 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig1.canvas.set_window_title('Cost-to-go')
self.ax1 = self.fig1.add_subplot(1,1,1)
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im1 = plt.pcolormesh( self.X[0] , self.X[1] , self.Jplot.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig2 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig2.canvas.set_window_title('Optimal action index')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im2 = plt.pcolormesh( self.X[0] , self.X[1] , self.action_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig3 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig3.canvas.set_window_title('Optimal Policy for u[0]')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im3 = plt.pcolormesh( self.X[0] , self.X[1] , self.u0_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
self.fig4 = plt.figure(figsize=(4, 4),dpi=300, frameon=True)
self.fig4.canvas.set_window_title('Optimal Policy for u[1]')
plt.ylabel(yname, fontsize = fs)
plt.xlabel(xname, fontsize = fs)
self.im4 = plt.pcolormesh( self.X[0] , self.X[1] , self.u1_policy.T )
plt.axis([self.DS.x_lb[0] , self.DS.x_ub[0], self.DS.x_lb[1] , self.DS.x_ub[1]])
plt.colorbar()
plt.grid(True)
plt.tight_layout()
'''
#################################################################
################## Main ########
#################################################################
'''
if __name__ == "__main__":
""" MAIN TEST """
pass
| 33.602011 | 159 | 0.416407 | 5,497 | 46,774 | 3.436966 | 0.066764 | 0.046049 | 0.02223 | 0.014291 | 0.858466 | 0.840946 | 0.817446 | 0.794686 | 0.78193 | 0.776372 | 0 | 0.031388 | 0.418309 | 46,774 | 1,392 | 160 | 33.602011 | 0.663004 | 0.111151 | 0 | 0.762259 | 0 | 0 | 0.020523 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084695 | false | 0.001486 | 0.007429 | 0 | 0.124814 | 0.026746 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
816103d9c17e7ad5934f8b21d7728c32afa022a2 | 303 | py | Python | guillotina/fields/__init__.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 173 | 2017-03-10T18:26:12.000Z | 2022-03-03T06:48:56.000Z | guillotina/fields/__init__.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 921 | 2017-03-08T14:04:43.000Z | 2022-03-30T10:28:56.000Z | guillotina/fields/__init__.py | rboixaderg/guillotina | fcae65c2185222272f3b8fee4bc2754e81e0e983 | [
"BSD-2-Clause"
] | 60 | 2017-03-16T19:59:44.000Z | 2022-03-03T06:48:59.000Z | from guillotina.fields.annotation import BucketDictField # noqa
from guillotina.fields.annotation import BucketListField # noqa
from guillotina.fields.dynamic import DynamicField # noqa
from guillotina.fields.files import CloudFileField # noqa
from guillotina.fields.patch import PatchField # noqa
| 50.5 | 64 | 0.834983 | 35 | 303 | 7.228571 | 0.4 | 0.27668 | 0.395257 | 0.379447 | 0.284585 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115512 | 303 | 5 | 65 | 60.6 | 0.94403 | 0.079208 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8186fb473adc13c3e9e6c804f597b14fb12fce5a | 13,011 | py | Python | tests/test_typeinference.py | diraol/csvkit | 118f79dbf013c7be2c160bb7dc5990f0997adc92 | [
"MIT"
] | 3 | 2016-05-16T13:35:03.000Z | 2020-02-13T04:19:14.000Z | tests/test_typeinference.py | diraol/csvkit | 118f79dbf013c7be2c160bb7dc5990f0997adc92 | [
"MIT"
] | null | null | null | tests/test_typeinference.py | diraol/csvkit | 118f79dbf013c7be2c160bb7dc5990f0997adc92 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import datetime
from types import NoneType
import unittest
from csvkit import typeinference
from csvkit.exceptions import InvalidValueForTypeException, InvalidValueForTypeListException
class TestNormalizeType(unittest.TestCase):
def test_nulls(self):
self.assertEqual((NoneType, [None, None, None, None, None, None]), typeinference.normalize_column_type([u'n/a', u'NA', u'.', u'null', u'none', u'']))
def test_nulls_coerce(self):
self.assertEqual((NoneType, [None, None, None, None, None, None]), typeinference.normalize_column_type([u'n/a', u'NA', u'.', u'null', u'none', u''], normal_type=NoneType))
def test_nulls_coerce_fail(self):
try:
typeinference.normalize_column_type([u'n/a', u'NA', u'.', u'1.7', u'none', u''], normal_type=NoneType)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, '1.7')
self.assertEqual(e.normal_type, NoneType)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_ints(self):
self.assertEqual((int, [1, -87, 418000000, None]), typeinference.normalize_column_type([u'1', u'-87', u'418000000', u'']))
def test_ints_coerce(self):
self.assertEqual((int, [1, -87, 418000000, None]), typeinference.normalize_column_type([u'1', u'-87', u'418000000', u''], normal_type=int))
def test_ints_coerce_fail(self):
try:
typeinference.normalize_column_type([u'1', u'-87', u'418000000', u'', u'TRUE'], normal_type=int)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 4)
self.assertEqual(e.value, 'TRUE')
self.assertEqual(e.normal_type, int)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_padded_ints(self):
self.assertEqual((unicode, [u'0001', u'0997', u'8.7', None]), typeinference.normalize_column_type([u'0001', u'0997', u'8.7', u'']))
def test_padded_ints_coerce(self):
self.assertEqual((unicode, [u'0001', u'0997', u'8.7', None]), typeinference.normalize_column_type([u'0001', u'0997', u'8.7', u''], normal_type='unicode'))
def test_padded_ints_coerce_fail(self):
try:
typeinference.normalize_column_type([u'0001', u'0997', u'8.7', u''], normal_type=int)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 0)
self.assertEqual(e.value, '0001')
self.assertEqual(e.normal_type, int)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_comma_ints(self):
self.assertEqual((int, [1, -87, 418000000, None]), typeinference.normalize_column_type([u'1', u'-87', u'418,000,000', u'']))
def test_floats(self):
self.assertEqual((float, [1.01, -87.413, 418000000.0, None]), typeinference.normalize_column_type([u'1.01', u'-87.413', u'418000000.0', u'']))
def test_floats_coerce(self):
self.assertEqual((float, [1.01, -87.413, 418000000.0, None]), typeinference.normalize_column_type([u'1.01', u'-87.413', u'418000000.0', u''], normal_type=float))
def test_floats_coerce_fail(self):
try:
typeinference.normalize_column_type([u'1', u'-87.413', u'418000000.0', u'Hello, world!'], normal_type=float)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, 'Hello, world!')
self.assertEqual(e.normal_type, float)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_comma_floats(self):
self.assertEqual((float, [1.01, -87.413, 418000000.0, None]), typeinference.normalize_column_type([u'1.01', u'-87.413', u'418,000,000.0', u'']))
def test_strings(self):
self.assertEqual((unicode, [u'Chicago Tribune', u'435 N Michigan ave', u'Chicago, IL', None]), typeinference.normalize_column_type([u'Chicago Tribune', u'435 N Michigan ave', u'Chicago, IL', u'']))
def test_strings_with_nulls(self):
self.assertEqual((unicode, [u'A', None, u'C', None]), typeinference.normalize_column_type([u'A', u'', u'C', None], blanks_as_nulls=True))
def test_strings_with_blanks(self):
self.assertEqual((unicode, [u'A', u'', u'C', None]), typeinference.normalize_column_type([u'A', u'', u'C', None], blanks_as_nulls=False))
def test_strings_coerce(self):
self.assertEqual((unicode, [u'Chicago Tribune', u'435 N Michigan ave', u'Chicago, IL', None]), typeinference.normalize_column_type([u'Chicago Tribune', u'435 N Michigan ave', u'Chicago, IL', u''], normal_type=unicode))
def test_ints_floats(self):
self.assertEqual((float, [1.01, -87, 418000000, None]), typeinference.normalize_column_type([u'1.01', u'-87', u'418000000', u'']))
def test_mixed(self):
self.assertEqual((unicode, [u'Chicago Tribune', u'-87.413', u'418000000', None]), typeinference.normalize_column_type([u'Chicago Tribune', u'-87.413', u'418000000', u'']))
def test_booleans(self):
self.assertEqual((bool, [False, True, False, True, None]), typeinference.normalize_column_type([u'False', u'TRUE', u'FALSE', u'yes', u'']))
def test_booleans_coerce(self):
self.assertEqual((bool, [False, True, False, True, None]), typeinference.normalize_column_type([u'False', u'TRUE', u'FALSE', u'yes', u''], normal_type=bool))
def test_booleans_coerce_fail(self):
try:
typeinference.normalize_column_type([u'False', u'TRUE', u'FALSE', u'17', u''], normal_type=bool)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, '17')
self.assertEqual(e.normal_type, bool)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_datetimes(self):
self.assertEqual((datetime.datetime, [datetime.datetime(2008, 1, 1, 4, 40, 0), datetime.datetime(2010, 1, 27, 3, 45, 0), datetime.datetime(2008, 3, 1, 16, 14, 45), None]), typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08 16:14:45', u'']))
def test_datetimes_coerce(self):
self.assertEqual((datetime.datetime, [datetime.datetime(2008, 1, 1, 4, 40, 0), datetime.datetime(2010, 1, 27, 3, 45, 0), datetime.datetime(2008, 3, 1, 16, 14, 45), None]), typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08 16:14:45', u''], normal_type=datetime.datetime))
def test_datetimes_coerce_fail(self):
try:
typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08 16:14:45', u'4:45 AM'], normal_type=datetime.datetime)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, '4:45 AM')
self.assertEqual(e.normal_type, datetime.datetime)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_dates(self):
self.assertEqual((datetime.date, [datetime.date(2008, 1, 1), datetime.date(2010, 1, 27), datetime.date(2008, 3, 1), None]), typeinference.normalize_column_type([u'Jan 1, 2008', u'2010-01-27', u'3/1/08', u'']))
def test_dates_coerce(self):
self.assertEqual((datetime.date, [datetime.date(2008, 1, 1), datetime.date(2010, 1, 27), datetime.date(2008, 3, 1), None]), typeinference.normalize_column_type([u'Jan 1, 2008', u'2010-01-27', u'3/1/08', u''], normal_type=datetime.date))
def test_dates_coerce_fail(self):
try:
typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08 16:14:45', u'4:45 AM'], normal_type=datetime.datetime)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, '4:45 AM')
self.assertEqual(e.normal_type, datetime.datetime)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_times(self):
self.assertEqual((datetime.time, [datetime.time(4, 40, 0), datetime.time(3, 45, 0), datetime.time(16, 14, 45), None]), typeinference.normalize_column_type([u'4:40 AM', u'03:45:00', u'16:14:45', u'']))
def test_times_coerce(self):
self.assertEqual((datetime.time, [datetime.time(4, 40, 0), datetime.time(3, 45, 0), datetime.time(16, 14, 45), None]), typeinference.normalize_column_type([u'4:40 AM', u'03:45:00', u'16:14:45', u''], normal_type=datetime.time))
def test_times_coerce_fail(self):
try:
typeinference.normalize_column_type([u'4:40 AM', u'03:45:00', u'16:14:45', u'1,000,000'], normal_type=datetime.time)
except InvalidValueForTypeException, e:
self.assertEqual(e.index, 3)
self.assertEqual(e.value, '1,000,000')
self.assertEqual(e.normal_type, datetime.time)
else:
raise AssertionError('Expected InvalidValueForTypeException')
def test_dates_and_times(self):
self.assertEqual((unicode, [u'Jan 1, 2008', u'2010-01-27', u'16:14:45', None]), typeinference.normalize_column_type([u'Jan 1, 2008', u'2010-01-27', u'16:14:45', u'']))
def test_datetimes_and_dates(self):
self.assertEqual((datetime.datetime, [datetime.datetime(2008, 1, 1, 4, 40, 0), datetime.datetime(2010, 1, 27, 3, 45, 0), datetime.datetime(2008, 3, 1, 0, 0, 0), None]), typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08', u'']))
def test_datetimes_and_dates_coerce(self):
self.assertEqual((datetime.datetime, [datetime.datetime(2008, 1, 1, 4, 40, 0), datetime.datetime(2010, 1, 27, 3, 45, 0), datetime.datetime(2008, 3, 1, 0, 0, 0), None]), typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'3/1/08', u''], normal_type=datetime.datetime))
def test_datetimes_and_times(self):
self.assertEqual((unicode, [u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'16:14:45', None]), typeinference.normalize_column_type([u'Jan 1, 2008 at 4:40 AM', u'2010-01-27T03:45:00', u'16:14:45', u'']))
def test_jeremy_singer_vine_datetimes(self):
"""
        This obscure test is named after Jeremy Singer-Vine, who discovered it.
"""
self.assertEqual((unicode, [u'P', u'H', u'H']), typeinference.normalize_column_type([u'P', u'H', u'H']))
def test_normalize_table(self):
expected_types = [unicode, int, float, NoneType]
data = [
[u'a', u'1', u'2.1', u''],
[u'b', u'5', u'4.1'],
[u'c', u'100', u'100.9999', u''],
[u'd', u'2', u'5.3', u'']
]
types, columns = typeinference.normalize_table(data)
self.assertEqual(4, len(types))
self.assertEqual(4, len(columns))
for i, tup in enumerate(zip(columns, types, expected_types)):
c, t, et = tup
self.assertEqual(et, t)
for row, normalized in zip(data, c):
if t is NoneType:
self.assertTrue(normalized is None)
else:
self.assertEqual(t(row[i]), normalized)
def test_normalize_table_known_types(self):
normal_types = [unicode, int, float, NoneType]
data = [
[u'a', u'1', u'2.1', u''],
[u'b', u'5', u'4.1'],
[u'c', u'100', u'100.9999', u''],
[u'd', u'2', u'5.3', u'']
]
types, columns = typeinference.normalize_table(data, normal_types)
self.assertEqual(4, len(types))
self.assertEqual(4, len(columns))
for i, tup in enumerate(zip(columns, types, normal_types)):
c, t, et = tup
self.assertEqual(et, t)
for row, normalized in zip(data, c):
if t is NoneType:
self.assertTrue(normalized is None)
else:
self.assertEqual(t(row[i]), normalized)
def test_normalize_table_known_types_invalid(self):
normal_types = [bool, int, int, NoneType]
data = [
[u'a', u'1', u'2.1', u''],
[u'b', u'5', u'4.1'],
[u'c', u'100', u'100.9999', u''],
[u'd', u'2', u'5.3', u'']
]
try:
typeinference.normalize_table(data, normal_types, accumulate_errors=True)
            self.fail('Expected InvalidValueForTypeListException')
except InvalidValueForTypeListException, e:
self.assertEqual(len(e.errors), 2)
self.assertEqual(e.errors[0].index, 0)
self.assertEqual(e.errors[0].value, 'a')
self.assertEqual(e.errors[0].normal_type, bool)
self.assertEqual(e.errors[2].index, 0)
self.assertEqual(e.errors[2].value, '2.1')
self.assertEqual(e.errors[2].normal_type, int)
| 52.676113 | 325 | 0.63062 | 1,847 | 13,011 | 4.327558 | 0.077964 | 0.129488 | 0.129613 | 0.14813 | 0.865758 | 0.832228 | 0.789691 | 0.76004 | 0.715126 | 0.698987 | 0 | 0.095666 | 0.205442 | 13,011 | 246 | 326 | 52.890244 | 0.6775 | 0.001537 | 0 | 0.414508 | 0 | 0 | 0.126308 | 0.017368 | 0 | 0 | 0 | 0 | 0.409326 | 0 | null | null | 0 | 0.025907 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
81873db331aed42aed0732af1943e91598c8e9eb | 19,911 | py | Python | src/jobs/job_creator.py | ZendriXXX/predict-python | fe0360b4888980421f8f91f158d6523729bfc5f7 | [
"MIT"
] | null | null | null | src/jobs/job_creator.py | ZendriXXX/predict-python | fe0360b4888980421f8f91f158d6523729bfc5f7 | [
"MIT"
] | null | null | null | src/jobs/job_creator.py | ZendriXXX/predict-python | fe0360b4888980421f8f91f158d6523729bfc5f7 | [
"MIT"
] | null | null | null | import time
from src.clustering.models import Clustering, ClusteringMethods
from src.encoding.encoding_container import UP_TO
from src.encoding.models import Encoding, ValueEncodings, DataEncodings
from src.hyperparameter_optimization.models import HyperparameterOptimization, HyperparameterOptimizationMethods
from src.jobs.models import Job, JobStatuses, JobTypes
from src.labelling.models import Labelling
from src.predictive_model.models import PredictiveModel
from src.predictive_model.models import PredictiveModels
from src.utils.django_orm import duplicate_orm_row
def generate(split, payload):
"""Returns a list of job
:param split:
:param payload:
:return:
"""
jobs = []
config = payload['config']
labelling_config = config['labelling'] if 'labelling' in config else {}
job_type = JobTypes.PREDICTION.value
prediction_type = payload['type']
for method in config['methods']:
for clustering in config['clusterings']:
for encMethod in config['encodings']:
encoding_dict = config['encoding']
if encoding_dict['generation_type'] == UP_TO:
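                    # UP_TO: spawn one job per prefix length, from 1 up to 'prefix_length'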
for i in range(1, encoding_dict['prefix_length'] + 1):
predictive_model = PredictiveModel.init(
get_prediction_method_config(prediction_type, method, config))
job = Job.objects.create(
status=JobStatuses.CREATED.value,
type=job_type,
split=split,
encoding=Encoding.objects.create(
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encMethod,
add_elapsed_time=encoding_dict.get('add_elapsed_time', False),
add_remaining_time=encoding_dict.get('add_remaining_time', False),
add_executed_events=encoding_dict.get('add_executed_events', False),
add_resources_used=encoding_dict.get('add_resources_used', False),
add_new_traces=encoding_dict.get('add_new_traces', False),
prefix_length=i,
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None,
clustering=Clustering.init(clustering, configuration=config.get(clustering, {}))
if predictive_model.predictive_model != PredictiveModels.TIME_SERIES_PREDICTION.value
else Clustering.init(ClusteringMethods.NO_CLUSTER.value, configuration={}),
# TODO TEMPORARY workaround,
hyperparameter_optimizer=HyperparameterOptimization.init(
config.get('hyperparameter_optimizer', {
'type': None}) if predictive_model.predictive_model != PredictiveModels.TIME_SERIES_PREDICTION.value else {
'type': None}),
# TODO TEMPORARY workaround
predictive_model=predictive_model,
create_models=config.get('create_models', False)
)
# check_predictive_model_not_overwrite(job)
jobs.append(job)
else:
predictive_model = PredictiveModel.init(
get_prediction_method_config(prediction_type, method, config))
job = Job.objects.create(
status=JobStatuses.CREATED.value,
type=job_type,
split=split,
encoding=Encoding.objects.create(
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encMethod,
add_elapsed_time=encoding_dict.get('add_elapsed_time', False),
add_remaining_time=encoding_dict.get('add_remaining_time', False),
add_executed_events=encoding_dict.get('add_executed_events', False),
add_resources_used=encoding_dict.get('add_resources_used', False),
add_new_traces=encoding_dict.get('add_new_traces', False),
prefix_length=config['encoding']['prefix_length'],
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None,
clustering=Clustering.init(clustering, configuration=config.get(clustering, {}))
if predictive_model.predictive_model != PredictiveModels.TIME_SERIES_PREDICTION.value
else Clustering.init(ClusteringMethods.NO_CLUSTER.value, configuration={}),
hyperparameter_optimizer=HyperparameterOptimization.init(
config.get('hyperparameter_optimizer', {
'type': 'none'}) if predictive_model.predictive_model != PredictiveModels.TIME_SERIES_PREDICTION.value else {
'type': 'none'}),
# TODO TEMPORARY workaround
predictive_model=predictive_model,
create_models=config.get('create_models', False)
)
# check_predictive_model_not_overwrite(job)
jobs.append(job)
return jobs
def check_predictive_model_not_overwrite(job: Job) -> None:
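    """Duplicate the predictive model row when hyperparameter optimization is
    enabled, so the tuned model does not overwrite the base one."""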
if job.hyperparameter_optimizer.optimization_method != HyperparameterOptimizationMethods.NONE.value:
job.predictive_model = duplicate_orm_row(PredictiveModel.objects.filter(pk=job.predictive_model.pk)[0])
job.predictive_model.save()
job.save()
def get_prediction_method_config(predictive_model, prediction_method, payload):
"""Returns a dict contain the configuration of prediction method
:param predictive_model:
:param prediction_method:
:param payload:
:return:
"""
return {
'predictive_model': predictive_model,
'prediction_method': prediction_method,
**payload.get(predictive_model + '.' + prediction_method, {})
}
def set_model_name(job: Job) -> None:
"""Sets the model using the given job configuration
:param job:
"""
if job.create_models:
if job.predictive_model.model_path != '':
# job.predictive_model = duplicate_orm_row(PredictiveModel.objects.filter(pk=job.predictive_model.pk)[0]) #todo: replace with simple CREATE
job.predictive_model = PredictiveModel.init(
job.predictive_model.get_full_dict() #todo: doublecheck me, are you sure get_full_dict is returning everything needed?
) #todo: futurebug if object changes
job.predictive_model.save()
job.save()
if job.clustering.clustering_method != ClusteringMethods.NO_CLUSTER.value:
job.clustering.model_path = 'cache/model_cache/job_{}-split_{}-clusterer-{}-v0.sav'.format(
job.id,
job.split.id,
job.type)
job.clustering.save()
if job.type == JobTypes.UPDATE.value:
            job.type = JobTypes.PREDICTION.value #TODO: why am I doing this?
predictive_model_filename = 'cache/model_cache/job_{}-split_{}-predictive_model-{}-v{}.sav'.format(
job.id,
job.split.id,
job.type,
str(time.time()))
else:
predictive_model_filename = 'cache/model_cache/job_{}-split_{}-predictive_model-{}-v0.sav'.format(
job.id,
job.split.id,
job.type)
job.predictive_model.model_path = predictive_model_filename
job.predictive_model.save()
job.save()
def generate_labelling(split, payload):
"""Returns a list of job
:param split:
:param payload:
:return:
"""
jobs = []
encoding = payload['config']['encoding']
config = payload['config']
labelling_config = config['labelling'] if 'labelling' in config else {}
if encoding['generation_type'] == UP_TO:
for i in range(1, encoding['prefix_length'] + 1):
item = Job.objects.create(
status=JobStatuses.CREATED.value,
type=JobTypes.LABELLING.value,
split=split,
encoding=Encoding.objects.create( # TODO fixme
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encoding.get('encodings', ValueEncodings.SIMPLE_INDEX.value),
add_elapsed_time=encoding.get('add_elapsed_time', False),
add_remaining_time=encoding.get('add_remaining_time', False),
add_executed_events=encoding.get('add_executed_events', False),
add_resources_used=encoding.get('add_resources_used', False),
add_new_traces=encoding.get('add_new_traces', False),
prefix_length=i,
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None
)
jobs.append(item)
else:
item = Job.objects.create(
status=JobStatuses.CREATED.value,
type=JobTypes.LABELLING.value,
split=split,
encoding=Encoding.objects.create( # TODO fixme
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encoding.get('encodings', ValueEncodings.SIMPLE_INDEX.value),
add_elapsed_time=encoding.get('add_elapsed_time', False),
add_remaining_time=encoding.get('add_remaining_time', False),
add_executed_events=encoding.get('add_executed_events', False),
add_resources_used=encoding.get('add_resources_used', False),
add_new_traces=encoding.get('add_new_traces', False),
prefix_length=config['encoding']['prefix_length'],
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None
)
jobs.append(item)
return jobs
def update(split, payload, generation_type=PredictiveModels.CLASSIFICATION.value): # TODO adapt to allow selecting the predictive_model to update
"""Returns a list of job
:param split:
:param payload:
:param generation_type:
:return:
"""
jobs = []
config = payload['config']
labelling_config = config['labelling'] if 'labelling' in config else {}
for method in payload['config']['methods']:
for clustering in payload['config']['clusterings']:
for incremental_base_model in payload['config']['incremental_train']:
for encMethod in payload['config']['encodings']:
encoding = payload['config']['encoding']
if encoding['generation_type'] == UP_TO:
for i in range(1, encoding['prefix_length'] + 1):
job = Job.objects.create(
status=JobStatuses.CREATED.value,
type=JobTypes.UPDATE.value,
split=split,
encoding=Encoding.objects.create( # TODO fixme
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encMethod,
add_elapsed_time=encoding.get('add_elapsed_time', False),
add_remaining_time=encoding.get('add_remaining_time', False),
add_executed_events=encoding.get('add_executed_events', False),
add_resources_used=encoding.get('add_resources_used', False),
add_new_traces=encoding.get('add_new_traces', False),
prefix_length=i,
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None,
clustering=Clustering.init(clustering, configuration=config.get(clustering, {})),
predictive_model=PredictiveModel.init(
get_prediction_method_config(generation_type, method, payload)
),
hyperparameter_optimizer=HyperparameterOptimization.init(
config.get('hyperparameter_optimizer', None)),
create_models=config.get('create_models', False),
incremental_train=Job.objects.filter(
pk=incremental_base_model
)[0]
)
# check_predictive_model_not_overwrite(job)
jobs.append(job)
else:
job = Job.objects.create(
status=JobStatuses.CREATED.value,
type=JobTypes.UPDATE.value,
split=split,
encoding=Encoding.objects.create( # TODO fixme
data_encoding=DataEncodings.LABEL_ENCODER.value,
value_encoding=encMethod,
add_elapsed_time=encoding.get('add_elapsed_time', False),
add_remaining_time=encoding.get('add_remaining_time', False),
add_executed_events=encoding.get('add_executed_events', False),
add_resources_used=encoding.get('add_resources_used', False),
add_new_traces=encoding.get('add_new_traces', False),
prefix_length=config['encoding']['prefix_length'],
# TODO static check?
padding=True if config['encoding']['padding'] == 'zero_padding' else False,
task_generation_type=config['encoding'].get('generation_type', 'only_this'),
features=config['encoding'].get('features', [])
),
labelling=Labelling.objects.create(
type=labelling_config.get('type', None),
# TODO static check?
attribute_name=labelling_config.get('attribute_name', None),
threshold_type=labelling_config.get('threshold_type', None),
threshold=labelling_config.get('threshold', None),
results={}
) if labelling_config != {} else None,
clustering=Clustering.init(clustering, configuration=config.get(clustering, {})),
predictive_model=PredictiveModel.init(
get_prediction_method_config(generation_type, method, payload)
),
hyperparameter_optimizer=HyperparameterOptimization.init(
config.get('hyperparameter_optimizer', None)),
create_models=config.get('create_models', False),
incremental_train=Job.objects.filter(
pk=incremental_base_model
)[0]
)
# check_predictive_model_not_overwrite(job)
jobs.append(job)
return jobs
| 53.237968 | 152 | 0.538597 | 1,748 | 19,911 | 5.894737 | 0.088673 | 0.065509 | 0.041925 | 0.025621 | 0.823855 | 0.805415 | 0.792605 | 0.786394 | 0.786394 | 0.7796 | 0 | 0.000959 | 0.371654 | 19,911 | 373 | 153 | 53.380697 | 0.822636 | 0.063583 | 0 | 0.788732 | 0 | 0 | 0.105804 | 0.01459 | 0 | 0 | 0 | 0.010724 | 0 | 1 | 0.021127 | false | 0 | 0.035211 | 0 | 0.070423 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8192aa07723ecc966fb35af7aa5afeece5536e02 | 8,564 | py | Python | skidl/libs/valves_sklib.py | arjenroodselaar/skidl | 0bf801bd3b74e6ef94bd9aa1b68eef756b568276 | [
"MIT"
] | 700 | 2016-08-16T21:12:50.000Z | 2021-10-10T02:15:18.000Z | skidl/libs/valves_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 118 | 2016-08-16T20:51:05.000Z | 2021-10-10T08:07:18.000Z | skidl/libs/valves_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 94 | 2016-08-25T14:02:28.000Z | 2021-09-12T05:17:08.000Z | from skidl import SKIDL, TEMPLATE, Part, Pin, SchLib
SKIDL_lib_version = '0.0.1'
valves = SchLib(tool=SKIDL).add_parts(*[
Part(name='CK6418',dest=TEMPLATE,tool=SKIDL,keywords='subminiature pentode valve',description='Subminiature Pentode',ref_prefix='U',num_units=2,fplist=['VALVE*MINI*PENTODE*LINEAR*'],do_erc=True,aliases=['CK548DX', 'JAN6418', 'NOS-6418'],pins=[
Pin(num='3',name='F+,G3',func=Pin.PWRIN,do_erc=True),
Pin(num='1',name='P',func=Pin.OUTPUT,do_erc=True),
Pin(num='2',name='G2',do_erc=True),
Pin(num='4',name='G1',do_erc=True),
Pin(num='5',name='F+,G3',func=Pin.PWRIN,do_erc=True)]),
Part(name='EABC80',dest=TEMPLATE,tool=SKIDL,keywords='diode triode valve',description='triple diode triode',ref_prefix='U',num_units=4,fplist=['VALVE*NOVAL*P*'],do_erc=True,aliases=['6AK8', '9AK8', 'PABC80', 'UABC80'],pins=[
Pin(num='2',name='A2',func=Pin.OUTPUT,do_erc=True),
Pin(num='3',name='K',do_erc=True),
Pin(num='1',name='A1',do_erc=True),
Pin(num='6',name='A3',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='K',do_erc=True),
Pin(num='7',name='K',do_erc=True),
Pin(num='8',name='G',do_erc=True),
Pin(num='9',name='A2',func=Pin.OUTPUT,do_erc=True),
Pin(num='4',name='F1',do_erc=True),
Pin(num='5',name='F2',do_erc=True)]),
Part(name='EC92',dest=TEMPLATE,tool=SKIDL,keywords='triode valve',description='single triode',ref_prefix='U',num_units=2,fplist=['VALVE*MINI*P*'],do_erc=True,pins=[
Pin(num='1',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='6',name='G',do_erc=True),
Pin(num='7',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='4',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='ECC81',dest=TEMPLATE,tool=SKIDL,keywords='triode valve',description='double triode',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,aliases=['ECC83'],pins=[
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='G',do_erc=True),
Pin(num='8',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='1',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='2',name='G',do_erc=True),
Pin(num='3',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='ECC88',dest=TEMPLATE,tool=SKIDL,keywords='triode valve',description='double triode, low-noise',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='1',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='2',name='G',do_erc=True),
Pin(num='3',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='G',do_erc=True),
Pin(num='8',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='ECH81',dest=TEMPLATE,tool=SKIDL,keywords='triode heptode valve',description='triode heptode',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='3',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='9',name='G',do_erc=True),
Pin(num='1',name='G2_G4',do_erc=True),
Pin(num='2',name='G1',do_erc=True),
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='G3',do_erc=True),
Pin(num='~',name='K_G5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='ECL82',dest=TEMPLATE,tool=SKIDL,keywords='triode pentode valve',description='triode pentode',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='1',name='G',do_erc=True),
Pin(num='8',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='2',name='K_G3',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='G1',do_erc=True),
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='G2',do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='ECL86',dest=TEMPLATE,tool=SKIDL,keywords='triode pentode valve',description='triode pentode',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='1',name='G',do_erc=True),
Pin(num='2',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='3',name='G2',do_erc=True),
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='K_G3',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='G1',do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='EF80',dest=TEMPLATE,tool=SKIDL,keywords='pentode valve',description='pentode',ref_prefix='U',num_units=2,fplist=['VALVE*NOVAL*P*'],do_erc=True,aliases=['EF85'],pins=[
Pin(num='2',name='G1',do_erc=True),
Pin(num='3',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='6',name='S',do_erc=True),
Pin(num='7',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='8',name='G2',do_erc=True),
Pin(num='9',name='G3',do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='EF83',dest=TEMPLATE,tool=SKIDL,keywords='pentode valve',description='pentode',ref_prefix='U',num_units=2,fplist=['VALVE*NOVAL*P*'],do_erc=True,aliases=['EF86'],pins=[
Pin(num='1',name='G2',do_erc=True),
Pin(num='3',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='S',do_erc=True),
Pin(num='8',name='G3',do_erc=True),
Pin(num='9',name='G1',do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='EL34',dest=TEMPLATE,tool=SKIDL,keywords='pentode valve',description='pentode, 25W',ref_prefix='U',num_units=2,fplist=['VALVE*OCTAL*'],do_erc=True,pins=[
Pin(num='1',name='G3',do_erc=True),
Pin(num='3',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='4',name='G2',do_erc=True),
Pin(num='5',name='G1',do_erc=True),
Pin(num='8',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='2',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='EL84',dest=TEMPLATE,tool=SKIDL,keywords='pentode valve',description='pentode, 12W',ref_prefix='U',num_units=2,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='2',name='G1',do_erc=True),
Pin(num='3',name='K_G3',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='9',name='G2',do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='EM84',dest=TEMPLATE,tool=SKIDL,keywords='indicator tube valve magic eye',description='indicator tube "magic eye"',ref_prefix='U',num_units=3,fplist=['VALVE*NOVAL*P*'],do_erc=True,pins=[
Pin(num='2',name='K',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='L',func=Pin.OUTPUT,do_erc=True),
Pin(num='7',name='ST',do_erc=True),
Pin(num='1',name='G',do_erc=True),
Pin(num='9',name='A',func=Pin.OUTPUT,do_erc=True),
Pin(num='~',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='4',name='F1',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='F2',func=Pin.PWRIN,do_erc=True)]),
Part(name='STABI',dest=TEMPLATE,tool=SKIDL,do_erc=True)])
| 70.196721 | 251 | 0.590845 | 1,418 | 8,564 | 3.462623 | 0.074753 | 0.118126 | 0.212627 | 0.217515 | 0.898778 | 0.870468 | 0.856415 | 0.801629 | 0.760896 | 0.701426 | 0 | 0.032884 | 0.176203 | 8,564 | 121 | 252 | 70.77686 | 0.663076 | 0 | 0 | 0.453782 | 0 | 0 | 0.119921 | 0.003036 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008403 | 0 | 0.008403 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
81e6187283af5e73bf7da96dde1891e975742324 | 174,857 | py | Python | models/resnet_v2_152/converted_pytorch.py | ryujaehun/pytorch_imagnet | f7d51bb6afa7cf7a10b5822b0e4db987e283184a | [
"MIT"
] | 1 | 2020-03-28T12:41:24.000Z | 2020-03-28T12:41:24.000Z | models/resnet_v2_152/converted_pytorch.py | ryujaehun/pytorch_imagnet | f7d51bb6afa7cf7a10b5822b0e4db987e283184a | [
"MIT"
] | null | null | null | models/resnet_v2_152/converted_pytorch.py | ryujaehun/pytorch_imagnet | f7d51bb6afa7cf7a10b5822b0e4db987e283184a | [
"MIT"
] | null | null | null | import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
__weights_dict = dict()
def load_weights(weight_file):
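    """Load a .npy weight dictionary; the bytes-encoding fallback handles
    arrays pickled under Python 2."""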
    if weight_file is None:
return
try:
weights_dict = np.load(weight_file).item()
except:
weights_dict = np.load(weight_file, encoding='bytes').item()
return weights_dict
class KitModel(nn.Module):
def __init__(self, weight_file):
super(KitModel, self).__init__()
global __weights_dict
__weights_dict = load_weights(weight_file)
self.resnet_v2_152_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/conv1/Conv2D', in_channels=3, out_channels=64, kernel_size=(7, 7), stride=(2, 2), groups=1, bias=True)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_1/bottleneck_v2/preact/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_shortcut_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_1/bottleneck_v2/shortcut/Conv2D', in_channels=64, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_1/bottleneck_v2/conv1/Conv2D', in_channels=64, out_channels=64, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_1/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_1/bottleneck_v2/conv2/Conv2D', in_channels=64, out_channels=64, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_1/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_1/bottleneck_v2/conv3/Conv2D', in_channels=64, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_2/bottleneck_v2/preact/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_2/bottleneck_v2/conv1/Conv2D', in_channels=256, out_channels=64, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_2/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_2/bottleneck_v2/conv2/Conv2D', in_channels=64, out_channels=64, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_2/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_2/bottleneck_v2/conv3/Conv2D', in_channels=64, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_3/bottleneck_v2/preact/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_3/bottleneck_v2/conv1/Conv2D', in_channels=256, out_channels=64, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_3/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_3/bottleneck_v2/conv2/Conv2D', in_channels=64, out_channels=64, kernel_size=(3, 3), stride=(2, 2), groups=1, bias=None)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block1/unit_3/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=64, eps=1.0009999641624745e-05, momentum=0.0)
        self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block1/unit_3/bottleneck_v2/conv3/Conv2D', in_channels=64, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
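        # End of block1. As in TF-slim's resnet_v2, a block downsamples in its
        # *last* unit (stride=(2, 2) on unit_3/conv2 above, and again on
        # block2/unit_8/conv2 further down), not in its first.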
self.resnet_v2_152_block2_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_1/bottleneck_v2/preact/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_shortcut_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_1/bottleneck_v2/shortcut/Conv2D', in_channels=256, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_1/bottleneck_v2/conv1/Conv2D', in_channels=256, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_1/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_1/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_1/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_1/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_2/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_2/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_2/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_2/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_2/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_2/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_3/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_3/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_3/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_3/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_3/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_3/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_4/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_4/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_4/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_4/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_4/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_4/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_5/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_5/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_5/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_5/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_5/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_5/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_6/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_6/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_6/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_6/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_6/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_6/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_7/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_7/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_7/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_7/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_7/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_7/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_8/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_8/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=128, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_8/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_8/bottleneck_v2/conv2/Conv2D', in_channels=128, out_channels=128, kernel_size=(3, 3), stride=(2, 2), groups=1, bias=None)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block2/unit_8/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=128, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block2/unit_8/bottleneck_v2/conv3/Conv2D', in_channels=128, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_1/bottleneck_v2/preact/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_shortcut_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_1/bottleneck_v2/shortcut/Conv2D', in_channels=512, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_1/bottleneck_v2/conv1/Conv2D', in_channels=512, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_1/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_1/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_1/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_1/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_2/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_2/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_2/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_2/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_2/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_2/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_3/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_3/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_3/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_3/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_3/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_3/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_4/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_4/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_4/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_4/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_4/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_4/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_5/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_5/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_5/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_5/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_5/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_5/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_6/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_6/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_6/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_6/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_6/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_6/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_7/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_7/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_7/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_7/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_7/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_7/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_8/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_8/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_8/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_8/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_8/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_8/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_9/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_9/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_9/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_9/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_9/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_9/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_10/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_10/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_10/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_10/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_10/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_10/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_11/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_11/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_11/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_11/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_11/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_11/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_12/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_12/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_12/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_12/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_12/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_12/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_13/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_13/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_13/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_13/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_13/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_13/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_14/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_14/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_14/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_14/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_14/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_14/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_15/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_15/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_15/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_15/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_15/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_15/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_16/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_16/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_16/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_16/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_16/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_16/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_17/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_17/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_17/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_17/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_17/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_17/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_18/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_18/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_18/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_18/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_18/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_18/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_19/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_19/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_19/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_19/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_19/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_19/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_20/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_20/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_20/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_20/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_20/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_20/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_21/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_21/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_21/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_21/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_21/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_21/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_22/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_22/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_22/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_22/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_22/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_22/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_23/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_23/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_23/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_23/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_23/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_23/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_24/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_24/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_24/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_24/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_24/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_24/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_25/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_25/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_25/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_25/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_25/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_25/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_26/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_26/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_26/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_26/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_26/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_26/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_27/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_27/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_27/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_27/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_27/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_27/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_28/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_28/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_28/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_28/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_28/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_28/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_29/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_29/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_29/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_29/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_29/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_29/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_30/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_30/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_30/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_30/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_30/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_30/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_31/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_31/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_31/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_31/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_31/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_31/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_32/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_32/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_32/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_32/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_32/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_32/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_33/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_33/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_33/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_33/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_33/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_33/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_34/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_34/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_34/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_34/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_34/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_34/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_35/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_35/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_35/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_35/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_35/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_35/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_36/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_36/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=256, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_36/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_36/bottleneck_v2/conv2/Conv2D', in_channels=256, out_channels=256, kernel_size=(3, 3), stride=(2, 2), groups=1, bias=None)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block3/unit_36/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=256, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block3/unit_36/bottleneck_v2/conv3/Conv2D', in_channels=256, out_channels=1024, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
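        # block4: 3 bottleneck units (512-d bottleneck, 2048-d output); unit 1 projects the shortcut 1024 -> 2048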
self.resnet_v2_152_block4_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_1/bottleneck_v2/preact/FusedBatchNorm', num_features=1024, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_shortcut_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_1/bottleneck_v2/shortcut/Conv2D', in_channels=1024, out_channels=2048, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_1/bottleneck_v2/conv1/Conv2D', in_channels=1024, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_1/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_1/bottleneck_v2/conv2/Conv2D', in_channels=512, out_channels=512, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_1/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_1/bottleneck_v2/conv3/Conv2D', in_channels=512, out_channels=2048, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_2/bottleneck_v2/preact/FusedBatchNorm', num_features=2048, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_2/bottleneck_v2/conv1/Conv2D', in_channels=2048, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_2/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_2/bottleneck_v2/conv2/Conv2D', in_channels=512, out_channels=512, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_2/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_2/bottleneck_v2/conv3/Conv2D', in_channels=512, out_channels=2048, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_3/bottleneck_v2/preact/FusedBatchNorm', num_features=2048, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_3/bottleneck_v2/conv1/Conv2D', in_channels=2048, out_channels=512, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_3/bottleneck_v2/conv1/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_3/bottleneck_v2/conv2/Conv2D', in_channels=512, out_channels=512, kernel_size=(3, 3), stride=(1, 1), groups=1, bias=None)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/block4/unit_3/bottleneck_v2/conv2/BatchNorm/FusedBatchNorm', num_features=512, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv3_Conv2D = self.__conv(2, name='resnet_v2_152/block4/unit_3/bottleneck_v2/conv3/Conv2D', in_channels=512, out_channels=2048, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
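        # head: final post-activation batch norm, then a 1x1 conv producing 1001 logits (TF-slim's 1000 ImageNet classes plus a background class)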
self.resnet_v2_152_postnorm_FusedBatchNorm = self.__batch_normalization(2, 'resnet_v2_152/postnorm/FusedBatchNorm', num_features=2048, eps=1.0009999641624745e-05, momentum=0.0)
self.resnet_v2_152_logits_Conv2D = self.__conv(2, name='resnet_v2_152/logits/Conv2D', in_channels=2048, out_channels=1001, kernel_size=(1, 1), stride=(1, 1), groups=1, bias=True)
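
    # The forward pass mirrors the original TF graph op for op: explicit F.pad calls
    # emulate TF "SAME" padding, and every bottleneck applies preact BN -> ReLU followed
    # by 1x1 / 3x3 / 1x1 convs with an identity, projected, or max-pooled shortcut.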
def forward(self, x):
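        # stem: 7x7/2 conv on a symmetrically (3,3)-padded input, then a 3x3/2 max pool
        # (padding with -inf before the pool reproduces TF's SAME-padded max pooling)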
resnet_v2_152_Pad = F.pad(x, (3, 3, 3, 3), mode = 'constant', value = 0)
resnet_v2_152_conv1_Conv2D = self.resnet_v2_152_conv1_Conv2D(resnet_v2_152_Pad)
resnet_v2_152_pool1_MaxPool_pad = F.pad(resnet_v2_152_conv1_Conv2D, (0, 1, 0, 1), value=float('-inf'))
resnet_v2_152_pool1_MaxPool = F.max_pool2d(resnet_v2_152_pool1_MaxPool_pad, kernel_size=(3, 3), stride=(2, 2), padding=0, ceil_mode=False)
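        # block1, unit 1: the shortcut is a 1x1 projection conv; conv1/conv2/conv3 form the bottleneck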
resnet_v2_152_block1_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block1_unit_1_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_pool1_MaxPool)
resnet_v2_152_block1_unit_1_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block1_unit_1_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block1_unit_1_bottleneck_v2_shortcut_Conv2D = self.resnet_v2_152_block1_unit_1_bottleneck_v2_shortcut_Conv2D(resnet_v2_152_block1_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block1_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block1_unit_1_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_1_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block1_unit_1_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block1_unit_1_bottleneck_v2_conv2_Relu)
resnet_v2_152_block1_unit_1_bottleneck_v2_add = resnet_v2_152_block1_unit_1_bottleneck_v2_shortcut_Conv2D + resnet_v2_152_block1_unit_1_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block1_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block1_unit_2_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block1_unit_1_bottleneck_v2_add)
resnet_v2_152_block1_unit_2_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block1_unit_2_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block1_unit_2_bottleneck_v2_preact_Relu)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block1_unit_2_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_2_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block1_unit_2_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block1_unit_2_bottleneck_v2_conv2_Relu)
resnet_v2_152_block1_unit_2_bottleneck_v2_add = resnet_v2_152_block1_unit_1_bottleneck_v2_add + resnet_v2_152_block1_unit_2_bottleneck_v2_conv3_Conv2D
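        # block1, unit 3 (last unit of the block): a stride-2 1x1 max pool on the shortcut downsamples spatially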
resnet_v2_152_block1_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block1_unit_3_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block1_unit_2_bottleneck_v2_add)
resnet_v2_152_block1_unit_3_bottleneck_v2_shortcut_MaxPool = F.max_pool2d(resnet_v2_152_block1_unit_2_bottleneck_v2_add, kernel_size=(1, 1), stride=(2, 2), padding=0, ceil_mode=False)
resnet_v2_152_block1_unit_3_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block1_unit_3_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block1_unit_3_bottleneck_v2_preact_Relu)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_3_bottleneck_v2_Pad = F.pad(resnet_v2_152_block1_unit_3_bottleneck_v2_conv1_Relu, (1, 1, 1, 1), mode = 'constant', value = 0)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block1_unit_3_bottleneck_v2_Pad)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block1_unit_3_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block1_unit_3_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block1_unit_3_bottleneck_v2_conv2_Relu)
resnet_v2_152_block1_unit_3_bottleneck_v2_add = resnet_v2_152_block1_unit_3_bottleneck_v2_shortcut_MaxPool + resnet_v2_152_block1_unit_3_bottleneck_v2_conv3_Conv2D
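        # block2: 8 bottleneck units; unit 1 projects the shortcut to the wider block width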
resnet_v2_152_block2_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_1_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block1_unit_3_bottleneck_v2_add)
resnet_v2_152_block2_unit_1_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_1_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_1_bottleneck_v2_shortcut_Conv2D = self.resnet_v2_152_block2_unit_1_bottleneck_v2_shortcut_Conv2D(resnet_v2_152_block2_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_1_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_1_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_1_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_1_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_1_bottleneck_v2_add = resnet_v2_152_block2_unit_1_bottleneck_v2_shortcut_Conv2D + resnet_v2_152_block2_unit_1_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_2_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_1_bottleneck_v2_add)
resnet_v2_152_block2_unit_2_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_2_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_2_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_2_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_2_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_2_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_2_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_2_bottleneck_v2_add = resnet_v2_152_block2_unit_1_bottleneck_v2_add + resnet_v2_152_block2_unit_2_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_3_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_2_bottleneck_v2_add)
resnet_v2_152_block2_unit_3_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_3_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_3_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_3_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_3_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_3_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_3_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_3_bottleneck_v2_add = resnet_v2_152_block2_unit_2_bottleneck_v2_add + resnet_v2_152_block2_unit_3_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_4_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_4_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_3_bottleneck_v2_add)
resnet_v2_152_block2_unit_4_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_4_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_4_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_4_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_4_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_4_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_4_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_4_bottleneck_v2_add = resnet_v2_152_block2_unit_3_bottleneck_v2_add + resnet_v2_152_block2_unit_4_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_5_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_5_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_4_bottleneck_v2_add)
resnet_v2_152_block2_unit_5_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_5_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_5_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_5_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_5_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_5_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_5_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_5_bottleneck_v2_add = resnet_v2_152_block2_unit_4_bottleneck_v2_add + resnet_v2_152_block2_unit_5_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_6_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_6_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_5_bottleneck_v2_add)
resnet_v2_152_block2_unit_6_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_6_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_6_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_6_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_6_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_6_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_6_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_6_bottleneck_v2_add = resnet_v2_152_block2_unit_5_bottleneck_v2_add + resnet_v2_152_block2_unit_6_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block2_unit_7_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_7_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_6_bottleneck_v2_add)
resnet_v2_152_block2_unit_7_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_7_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_7_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block2_unit_7_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_7_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_7_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_7_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_7_bottleneck_v2_add = resnet_v2_152_block2_unit_6_bottleneck_v2_add + resnet_v2_152_block2_unit_7_bottleneck_v2_conv3_Conv2D
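        # block2, unit 8 (last unit): the stride-2 max-pool shortcut performs the block's downsampling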
resnet_v2_152_block2_unit_8_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block2_unit_8_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_7_bottleneck_v2_add)
resnet_v2_152_block2_unit_8_bottleneck_v2_shortcut_MaxPool = F.max_pool2d(resnet_v2_152_block2_unit_7_bottleneck_v2_add, kernel_size=(1, 1), stride=(2, 2), padding=0, ceil_mode=False)
resnet_v2_152_block2_unit_8_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block2_unit_8_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block2_unit_8_bottleneck_v2_preact_Relu)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_8_bottleneck_v2_Pad = F.pad(resnet_v2_152_block2_unit_8_bottleneck_v2_conv1_Relu, (1, 1, 1, 1), mode = 'constant', value = 0)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block2_unit_8_bottleneck_v2_Pad)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block2_unit_8_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block2_unit_8_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block2_unit_8_bottleneck_v2_conv2_Relu)
resnet_v2_152_block2_unit_8_bottleneck_v2_add = resnet_v2_152_block2_unit_8_bottleneck_v2_shortcut_MaxPool + resnet_v2_152_block2_unit_8_bottleneck_v2_conv3_Conv2D
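        # block3: 36 bottleneck units (256-d bottleneck, 1024-d output), applied unit by unit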
resnet_v2_152_block3_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_1_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block2_unit_8_bottleneck_v2_add)
resnet_v2_152_block3_unit_1_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_1_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_1_bottleneck_v2_shortcut_Conv2D = self.resnet_v2_152_block3_unit_1_bottleneck_v2_shortcut_Conv2D(resnet_v2_152_block3_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_1_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_1_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_1_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_1_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_1_bottleneck_v2_add = resnet_v2_152_block3_unit_1_bottleneck_v2_shortcut_Conv2D + resnet_v2_152_block3_unit_1_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_2_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_1_bottleneck_v2_add)
resnet_v2_152_block3_unit_2_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_2_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_2_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_2_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_2_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_2_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_2_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_2_bottleneck_v2_add = resnet_v2_152_block3_unit_1_bottleneck_v2_add + resnet_v2_152_block3_unit_2_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_3_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_2_bottleneck_v2_add)
resnet_v2_152_block3_unit_3_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_3_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_3_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_3_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_3_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_3_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_3_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_3_bottleneck_v2_add = resnet_v2_152_block3_unit_2_bottleneck_v2_add + resnet_v2_152_block3_unit_3_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_4_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_4_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_3_bottleneck_v2_add)
resnet_v2_152_block3_unit_4_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_4_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_4_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_4_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_4_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_4_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_4_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_4_bottleneck_v2_add = resnet_v2_152_block3_unit_3_bottleneck_v2_add + resnet_v2_152_block3_unit_4_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_5_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_5_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_4_bottleneck_v2_add)
resnet_v2_152_block3_unit_5_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_5_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_5_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_5_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_5_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_5_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_5_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_5_bottleneck_v2_add = resnet_v2_152_block3_unit_4_bottleneck_v2_add + resnet_v2_152_block3_unit_5_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_6_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_6_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_5_bottleneck_v2_add)
resnet_v2_152_block3_unit_6_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_6_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_6_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_6_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_6_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_6_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_6_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_6_bottleneck_v2_add = resnet_v2_152_block3_unit_5_bottleneck_v2_add + resnet_v2_152_block3_unit_6_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_7_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_7_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_6_bottleneck_v2_add)
resnet_v2_152_block3_unit_7_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_7_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_7_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_7_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_7_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_7_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_7_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_7_bottleneck_v2_add = resnet_v2_152_block3_unit_6_bottleneck_v2_add + resnet_v2_152_block3_unit_7_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_8_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_8_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_7_bottleneck_v2_add)
resnet_v2_152_block3_unit_8_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_8_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_8_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_8_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_8_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_8_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_8_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_8_bottleneck_v2_add = resnet_v2_152_block3_unit_7_bottleneck_v2_add + resnet_v2_152_block3_unit_8_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_9_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_9_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_8_bottleneck_v2_add)
resnet_v2_152_block3_unit_9_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_9_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_9_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_9_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_9_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_9_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_9_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_9_bottleneck_v2_add = resnet_v2_152_block3_unit_8_bottleneck_v2_add + resnet_v2_152_block3_unit_9_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_10_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_10_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_9_bottleneck_v2_add)
resnet_v2_152_block3_unit_10_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_10_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_10_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_10_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_10_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_10_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_10_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_10_bottleneck_v2_add = resnet_v2_152_block3_unit_9_bottleneck_v2_add + resnet_v2_152_block3_unit_10_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_11_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_11_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_10_bottleneck_v2_add)
resnet_v2_152_block3_unit_11_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_11_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_11_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_11_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_11_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_11_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_11_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_11_bottleneck_v2_add = resnet_v2_152_block3_unit_10_bottleneck_v2_add + resnet_v2_152_block3_unit_11_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_12_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_12_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_11_bottleneck_v2_add)
resnet_v2_152_block3_unit_12_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_12_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_12_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_12_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_12_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_12_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_12_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_12_bottleneck_v2_add = resnet_v2_152_block3_unit_11_bottleneck_v2_add + resnet_v2_152_block3_unit_12_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_13_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_13_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_12_bottleneck_v2_add)
resnet_v2_152_block3_unit_13_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_13_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_13_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_13_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_13_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_13_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_13_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_13_bottleneck_v2_add = resnet_v2_152_block3_unit_12_bottleneck_v2_add + resnet_v2_152_block3_unit_13_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_14_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_14_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_13_bottleneck_v2_add)
resnet_v2_152_block3_unit_14_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_14_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_14_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_14_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_14_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_14_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_14_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_14_bottleneck_v2_add = resnet_v2_152_block3_unit_13_bottleneck_v2_add + resnet_v2_152_block3_unit_14_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_15_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_15_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_14_bottleneck_v2_add)
resnet_v2_152_block3_unit_15_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_15_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_15_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_15_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_15_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_15_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_15_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_15_bottleneck_v2_add = resnet_v2_152_block3_unit_14_bottleneck_v2_add + resnet_v2_152_block3_unit_15_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_16_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_16_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_15_bottleneck_v2_add)
resnet_v2_152_block3_unit_16_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_16_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_16_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_16_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_16_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_16_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_16_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_16_bottleneck_v2_add = resnet_v2_152_block3_unit_15_bottleneck_v2_add + resnet_v2_152_block3_unit_16_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_17_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_17_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_16_bottleneck_v2_add)
resnet_v2_152_block3_unit_17_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_17_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_17_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_17_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_17_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_17_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_17_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_17_bottleneck_v2_add = resnet_v2_152_block3_unit_16_bottleneck_v2_add + resnet_v2_152_block3_unit_17_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_18_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_18_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_17_bottleneck_v2_add)
resnet_v2_152_block3_unit_18_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_18_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_18_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_18_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_18_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_18_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_18_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_18_bottleneck_v2_add = resnet_v2_152_block3_unit_17_bottleneck_v2_add + resnet_v2_152_block3_unit_18_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_19_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_19_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_18_bottleneck_v2_add)
resnet_v2_152_block3_unit_19_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_19_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_19_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_19_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_19_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_19_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_19_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_19_bottleneck_v2_add = resnet_v2_152_block3_unit_18_bottleneck_v2_add + resnet_v2_152_block3_unit_19_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_20_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_20_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_19_bottleneck_v2_add)
resnet_v2_152_block3_unit_20_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_20_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_20_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_20_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_20_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_20_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_20_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_20_bottleneck_v2_add = resnet_v2_152_block3_unit_19_bottleneck_v2_add + resnet_v2_152_block3_unit_20_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_21_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_21_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_20_bottleneck_v2_add)
resnet_v2_152_block3_unit_21_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_21_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_21_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_21_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_21_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_21_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_21_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_21_bottleneck_v2_add = resnet_v2_152_block3_unit_20_bottleneck_v2_add + resnet_v2_152_block3_unit_21_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_22_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_22_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_21_bottleneck_v2_add)
resnet_v2_152_block3_unit_22_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_22_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_22_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_22_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_22_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_22_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_22_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_22_bottleneck_v2_add = resnet_v2_152_block3_unit_21_bottleneck_v2_add + resnet_v2_152_block3_unit_22_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_23_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_23_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_22_bottleneck_v2_add)
resnet_v2_152_block3_unit_23_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_23_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_23_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_23_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_23_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_23_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_23_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_23_bottleneck_v2_add = resnet_v2_152_block3_unit_22_bottleneck_v2_add + resnet_v2_152_block3_unit_23_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_24_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_24_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_23_bottleneck_v2_add)
resnet_v2_152_block3_unit_24_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_24_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_24_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_24_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_24_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_24_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_24_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_24_bottleneck_v2_add = resnet_v2_152_block3_unit_23_bottleneck_v2_add + resnet_v2_152_block3_unit_24_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_25_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_25_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_24_bottleneck_v2_add)
resnet_v2_152_block3_unit_25_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_25_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_25_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_25_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_25_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_25_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_25_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_25_bottleneck_v2_add = resnet_v2_152_block3_unit_24_bottleneck_v2_add + resnet_v2_152_block3_unit_25_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_26_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_26_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_25_bottleneck_v2_add)
resnet_v2_152_block3_unit_26_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_26_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_26_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_26_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_26_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_26_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_26_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_26_bottleneck_v2_add = resnet_v2_152_block3_unit_25_bottleneck_v2_add + resnet_v2_152_block3_unit_26_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_27_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_27_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_26_bottleneck_v2_add)
resnet_v2_152_block3_unit_27_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_27_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_27_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_27_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_27_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_27_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_27_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_27_bottleneck_v2_add = resnet_v2_152_block3_unit_26_bottleneck_v2_add + resnet_v2_152_block3_unit_27_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_28_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_28_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_27_bottleneck_v2_add)
resnet_v2_152_block3_unit_28_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_28_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_28_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_28_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_28_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_28_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_28_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_28_bottleneck_v2_add = resnet_v2_152_block3_unit_27_bottleneck_v2_add + resnet_v2_152_block3_unit_28_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_29_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_29_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_28_bottleneck_v2_add)
resnet_v2_152_block3_unit_29_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_29_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_29_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_29_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_29_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_29_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_29_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_29_bottleneck_v2_add = resnet_v2_152_block3_unit_28_bottleneck_v2_add + resnet_v2_152_block3_unit_29_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_30_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_30_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_29_bottleneck_v2_add)
resnet_v2_152_block3_unit_30_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_30_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_30_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_30_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_30_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_30_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_30_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_30_bottleneck_v2_add = resnet_v2_152_block3_unit_29_bottleneck_v2_add + resnet_v2_152_block3_unit_30_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_31_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_31_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_30_bottleneck_v2_add)
resnet_v2_152_block3_unit_31_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_31_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_31_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_31_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_31_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_31_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_31_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_31_bottleneck_v2_add = resnet_v2_152_block3_unit_30_bottleneck_v2_add + resnet_v2_152_block3_unit_31_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_32_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_32_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_31_bottleneck_v2_add)
resnet_v2_152_block3_unit_32_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_32_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_32_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_32_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_32_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_32_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_32_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_32_bottleneck_v2_add = resnet_v2_152_block3_unit_31_bottleneck_v2_add + resnet_v2_152_block3_unit_32_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_33_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_33_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_32_bottleneck_v2_add)
resnet_v2_152_block3_unit_33_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_33_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_33_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_33_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_33_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_33_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_33_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_33_bottleneck_v2_add = resnet_v2_152_block3_unit_32_bottleneck_v2_add + resnet_v2_152_block3_unit_33_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_34_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_34_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_33_bottleneck_v2_add)
resnet_v2_152_block3_unit_34_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_34_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_34_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_34_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_34_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_34_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_34_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_34_bottleneck_v2_add = resnet_v2_152_block3_unit_33_bottleneck_v2_add + resnet_v2_152_block3_unit_34_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block3_unit_35_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_35_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_34_bottleneck_v2_add)
resnet_v2_152_block3_unit_35_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_35_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_35_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block3_unit_35_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_35_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_35_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_35_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_35_bottleneck_v2_add = resnet_v2_152_block3_unit_34_bottleneck_v2_add + resnet_v2_152_block3_unit_35_bottleneck_v2_conv3_Conv2D
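        # Unit 36 is the transition unit that closes block 3: the shortcut is a
        # stride-2 1x1 max pool of the unit's input (spatial subsampling only),
        # and conv2 presumably carries the matching stride inside the module
        # defined earlier in this file.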
resnet_v2_152_block3_unit_36_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block3_unit_36_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_35_bottleneck_v2_add)
resnet_v2_152_block3_unit_36_bottleneck_v2_shortcut_MaxPool = F.max_pool2d(resnet_v2_152_block3_unit_35_bottleneck_v2_add, kernel_size=(1, 1), stride=(2, 2), padding=0, ceil_mode=False)
resnet_v2_152_block3_unit_36_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block3_unit_36_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block3_unit_36_bottleneck_v2_preact_Relu)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
        resnet_v2_152_block3_unit_36_bottleneck_v2_Pad = F.pad(resnet_v2_152_block3_unit_36_bottleneck_v2_conv1_Relu, (1, 1, 1, 1), mode='constant', value=0)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block3_unit_36_bottleneck_v2_Pad)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block3_unit_36_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block3_unit_36_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block3_unit_36_bottleneck_v2_conv2_Relu)
resnet_v2_152_block3_unit_36_bottleneck_v2_add = resnet_v2_152_block3_unit_36_bottleneck_v2_shortcut_MaxPool + resnet_v2_152_block3_unit_36_bottleneck_v2_conv3_Conv2D
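        # Block 4, unit 1: the first unit of a new block replaces the identity
        # skip with a learned 1x1 projection (shortcut_Conv2D) because the
        # channel width changes at the block boundary.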
resnet_v2_152_block4_unit_1_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block4_unit_1_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block3_unit_36_bottleneck_v2_add)
resnet_v2_152_block4_unit_1_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block4_unit_1_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block4_unit_1_bottleneck_v2_shortcut_Conv2D = self.resnet_v2_152_block4_unit_1_bottleneck_v2_shortcut_Conv2D(resnet_v2_152_block4_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block4_unit_1_bottleneck_v2_preact_Relu)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block4_unit_1_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_1_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block4_unit_1_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block4_unit_1_bottleneck_v2_conv2_Relu)
resnet_v2_152_block4_unit_1_bottleneck_v2_add = resnet_v2_152_block4_unit_1_bottleneck_v2_shortcut_Conv2D + resnet_v2_152_block4_unit_1_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block4_unit_2_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block4_unit_2_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block4_unit_1_bottleneck_v2_add)
resnet_v2_152_block4_unit_2_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block4_unit_2_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block4_unit_2_bottleneck_v2_preact_Relu)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block4_unit_2_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_2_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block4_unit_2_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block4_unit_2_bottleneck_v2_conv2_Relu)
resnet_v2_152_block4_unit_2_bottleneck_v2_add = resnet_v2_152_block4_unit_1_bottleneck_v2_add + resnet_v2_152_block4_unit_2_bottleneck_v2_conv3_Conv2D
resnet_v2_152_block4_unit_3_bottleneck_v2_preact_FusedBatchNorm = self.resnet_v2_152_block4_unit_3_bottleneck_v2_preact_FusedBatchNorm(resnet_v2_152_block4_unit_2_bottleneck_v2_add)
resnet_v2_152_block4_unit_3_bottleneck_v2_preact_Relu = F.relu(resnet_v2_152_block4_unit_3_bottleneck_v2_preact_FusedBatchNorm)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Conv2D = self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Conv2D(resnet_v2_152_block4_unit_3_bottleneck_v2_preact_Relu)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Conv2D)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Relu = F.relu(resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D_pad = F.pad(resnet_v2_152_block4_unit_3_bottleneck_v2_conv1_Relu, (1, 1, 1, 1))
resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D = self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D(resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D_pad)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm = self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm(resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Conv2D)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Relu = F.relu(resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_BatchNorm_FusedBatchNorm)
resnet_v2_152_block4_unit_3_bottleneck_v2_conv3_Conv2D = self.resnet_v2_152_block4_unit_3_bottleneck_v2_conv3_Conv2D(resnet_v2_152_block4_unit_3_bottleneck_v2_conv2_Relu)
resnet_v2_152_block4_unit_3_bottleneck_v2_add = resnet_v2_152_block4_unit_2_bottleneck_v2_add + resnet_v2_152_block4_unit_3_bottleneck_v2_conv3_Conv2D
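        # Network head: final "postnorm" BN + ReLU, global average pooling done
        # as two successive means over the spatial dims (width, then height),
        # a 1x1 logits convolution, and a squeeze to drop the singleton dims.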
resnet_v2_152_postnorm_FusedBatchNorm = self.resnet_v2_152_postnorm_FusedBatchNorm(resnet_v2_152_block4_unit_3_bottleneck_v2_add)
resnet_v2_152_postnorm_Relu = F.relu(resnet_v2_152_postnorm_FusedBatchNorm)
resnet_v2_152_pool5 = torch.mean(resnet_v2_152_postnorm_Relu, 3, True)
resnet_v2_152_pool5 = torch.mean(resnet_v2_152_pool5, 2, True)
resnet_v2_152_logits_Conv2D = self.resnet_v2_152_logits_Conv2D(resnet_v2_152_pool5)
MMdnn_Output = torch.squeeze(resnet_v2_152_logits_Conv2D)
return MMdnn_Output
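    # Weight-loading helpers. Both construct a fresh torch.nn layer and copy
    # pretrained tensors into it in place from the module-level __weights_dict
    # (populated elsewhere in this file).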
@staticmethod
def __batch_normalization(dim, name, **kwargs):
if dim == 1: layer = nn.BatchNorm1d(**kwargs)
elif dim == 2: layer = nn.BatchNorm2d(**kwargs)
elif dim == 3: layer = nn.BatchNorm3d(**kwargs)
else: raise NotImplementedError()
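        # Copy pretrained batch-norm statistics/affine params from the weights
        # dict: 'scale' and 'bias' are optional (default to 1 and 0), while
        # 'mean' and 'var' are required and raise KeyError if absent.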
if 'scale' in __weights_dict[name]:
layer.state_dict()['weight'].copy_(torch.from_numpy(__weights_dict[name]['scale']))
else:
layer.weight.data.fill_(1)
if 'bias' in __weights_dict[name]:
layer.state_dict()['bias'].copy_(torch.from_numpy(__weights_dict[name]['bias']))
else:
layer.bias.data.fill_(0)
layer.state_dict()['running_mean'].copy_(torch.from_numpy(__weights_dict[name]['mean']))
layer.state_dict()['running_var'].copy_(torch.from_numpy(__weights_dict[name]['var']))
return layer
@staticmethod
def __conv(dim, name, **kwargs):
if dim == 1: layer = nn.Conv1d(**kwargs)
elif dim == 2: layer = nn.Conv2d(**kwargs)
elif dim == 3: layer = nn.Conv3d(**kwargs)
else: raise NotImplementedError()
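        # Convolution weights are required; the bias is copied only when the
        # weights dict provides one (convolutions followed by batch norm are
        # typically exported without a bias).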
layer.state_dict()['weight'].copy_(torch.from_numpy(__weights_dict[name]['weights']))
if 'bias' in __weights_dict[name]:
layer.state_dict()['bias'].copy_(torch.from_numpy(__weights_dict[name]['bias']))
return layer
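# --- Illustrative usage sketch (not part of the generated model file) ---
# Assumes the enclosing class follows the usual MMdnn convention of being named
# `KitModel` and taking the converted weight-file path in its constructor; both
# the class name and the file name below are assumptions, not facts from this
# file.
#
#   import torch
#
#   model = KitModel('resnet_v2_152.npy')  # hypothetical weight file
#   model.eval()
#   with torch.no_grad():
#       x = torch.randn(1, 3, 299, 299)    # NCHW input; the size is an assumption
#       logits = model(x)                  # 1-D tensor after the final squeeze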
| 186.414712 | 255 | 0.864358 | 28,292 | 174,857 | 4.667574 | 0.005443 | 0.127401 | 0.175177 | 0.190913 | 0.994328 | 0.993313 | 0.990277 | 0.989126 | 0.987498 | 0.985726 | 0 | 0.148875 | 0.071327 | 174,857 | 937 | 256 | 186.613661 | 0.664347 | 0 | 0 | 0.013029 | 0 | 0 | 0.108969 | 0.10826 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005429 | false | 0 | 0.004343 | 0 | 0.016287 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c4caad734af8a5d25bafda41f334683e658821a3 | 89,983 | py | Python | splunk_sdk/search/v2/gen_models.py | ianlee4/splunk-cloud-sdk-python | d2870cd1e506d3844869d17becdcdf9d8d60a9a1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | splunk_sdk/search/v2/gen_models.py | ianlee4/splunk-cloud-sdk-python | d2870cd1e506d3844869d17becdcdf9d8d60a9a1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | splunk_sdk/search/v2/gen_models.py | ianlee4/splunk-cloud-sdk-python | d2870cd1e506d3844869d17becdcdf9d8d60a9a1 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # Copyright © 2021 Splunk, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"): you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# [http://www.apache.org/licenses/LICENSE-2.0]
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
############# This file is auto-generated. Do not edit! #############
"""
SDC Service: Splunk Search service
Use the Search service in Splunk Cloud Services to dispatch, review, and manage searches and search jobs. You can finalize or cancel jobs, retrieve search results, and request search-related configurations from the Metadata Catalog service in Splunk Cloud Services.
OpenAPI spec version: v2 (recommended default)
Generated by: https://openapi-generator.tech
"""
from datetime import datetime
from typing import List, Dict
from splunk_sdk.common.sscmodel import SSCModel
from splunk_sdk.base_client import dictify, inflate
from enum import Enum
class TypeEnum(str, Enum):
INFO = "INFO"
DEBUG = "DEBUG"
FATAL = "FATAL"
ERROR = "ERROR"
@staticmethod
def from_value(value: str):
if value == "INFO":
return TypeEnum.INFO
if value == "DEBUG":
return TypeEnum.DEBUG
if value == "FATAL":
return TypeEnum.FATAL
if value == "ERROR":
return TypeEnum.ERROR
class Message(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "Message":
instance = Message.__new__(Message)
instance._attrs = model
return instance
def __init__(self, text: "str" = None, type: "str" = None, **extra):
"""Message"""
self._attrs = dict()
if text is not None:
self._attrs["text"] = text
if type is not None:
self._attrs["type"] = type
for k, v in extra.items():
self._attrs[k] = v
@property
def text(self) -> "str":
""" Gets the text of this Message.
"""
return self._attrs.get("text")
@text.setter
def text(self, text: "str"):
"""Sets the text of this Message.
:param text: The text of this Message.
:type: str
"""
self._attrs["text"] = text
@property
def type(self) -> "TypeEnum":
""" Gets the type of this Message.
"""
return TypeEnum.from_value(self._attrs.get("type"))
@type.setter
def type(self, type: "str"):
"""Sets the type of this Message.
:param type: The type of this Message.
:type: str
"""
if isinstance(type, Enum):
self._attrs["type"] = type.value
else:
self._attrs["type"] = type # If you supply a string, we presume you know the service will take it.
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
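# A minimal sketch of how Message round-trips (illustrative only; mirrors the
# accessors defined above):
#
#   msg = Message(text="search completed", type=TypeEnum.INFO.value)
#   assert msg.type is TypeEnum.INFO
#   msg.to_dict()  # {'text': 'search completed', 'type': 'INFO'}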
class QueryParameters(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "QueryParameters":
instance = QueryParameters.__new__(QueryParameters)
instance._attrs = model
return instance
def __init__(self, earliest: "str" = '-24h@h', latest: "str" = 'now', relative_time_anchor: "datetime" = None, timezone: "object" = None, **extra):
"""QueryParameters"""
self._attrs = dict()
if earliest is not None:
self._attrs["earliest"] = earliest
if latest is not None:
self._attrs["latest"] = latest
if relative_time_anchor is not None:
self._attrs["relativeTimeAnchor"] = relative_time_anchor
if timezone is not None:
self._attrs["timezone"] = timezone
for k, v in extra.items():
self._attrs[k] = v
@property
def earliest(self) -> "str":
""" Gets the earliest of this QueryParameters.
        The earliest time, in absolute or relative format, to retrieve events. When specifying an absolute time, specify either UNIX time or UTC in seconds using the ISO-8601 (%FT%T.%Q) format. For example, 2021-01-25T13:15:30Z. GMT is the default timezone. You must specify GMT when you specify UTC. Any offset specified is ignored.
"""
return self._attrs.get("earliest")
@earliest.setter
def earliest(self, earliest: "str"):
"""Sets the earliest of this QueryParameters.
        The earliest time, in absolute or relative format, to retrieve events. When specifying an absolute time, specify either UNIX time or UTC in seconds using the ISO-8601 (%FT%T.%Q) format. For example, 2021-01-25T13:15:30Z. GMT is the default timezone. You must specify GMT when you specify UTC. Any offset specified is ignored.
:param earliest: The earliest of this QueryParameters.
:type: str
"""
self._attrs["earliest"] = earliest
@property
def latest(self) -> "str":
""" Gets the latest of this QueryParameters.
        The latest time, in absolute or relative format, to retrieve events. When specifying an absolute time, specify either UNIX time or UTC in seconds using the ISO-8601 (%FT%T.%Q) format. For example, 2021-01-25T13:15:30Z. GMT is the default timezone. You must specify GMT when you specify UTC. Any offset specified is ignored.
"""
return self._attrs.get("latest")
@latest.setter
def latest(self, latest: "str"):
"""Sets the latest of this QueryParameters.
        The latest time, in absolute or relative format, to retrieve events. When specifying an absolute time, specify either UNIX time or UTC in seconds using the ISO-8601 (%FT%T.%Q) format. For example, 2021-01-25T13:15:30Z. GMT is the default timezone. You must specify GMT when you specify UTC. Any offset specified is ignored.
:param latest: The latest of this QueryParameters.
:type: str
"""
self._attrs["latest"] = latest
@property
def relative_time_anchor(self) -> "datetime":
""" Gets the relative_time_anchor of this QueryParameters.
        Specify a time string to set the absolute time used for any relative time specifier in the search. Defaults to the current system time. You can specify a relative time modifier ('earliest' or 'latest') for this parameter. For example, if 'earliest' is set to '-d' and the 'relativeTimeAnchor' is set to '2021-01-05T13:15:30Z', then 'resolvedEarliest' is '2021-01-04T13:15:30Z'.
"""
return self._attrs.get("relativeTimeAnchor")
@relative_time_anchor.setter
def relative_time_anchor(self, relative_time_anchor: "datetime"):
"""Sets the relative_time_anchor of this QueryParameters.
Specify a time string to set the absolute time used for any relative time specifier in the search. Defaults to the current system time. You can specify a relative time modifier ('earliest' or 'latest') for this parameter. For example, if 'earliest' is set to '-d' and 'relativeTimeAnchor' is set to '2021-01-05T13:15:30Z', then 'resolvedEarliest' is '2021-01-04T13:15:30Z'.
:param relative_time_anchor: The relative_time_anchor of this QueryParameters.
:type: datetime
"""
self._attrs["relativeTimeAnchor"] = relative_time_anchor
@property
def timezone(self) -> "object":
""" Gets the timezone of this QueryParameters.
The timezone that relative time modifiers are based on. The timezone only applies to relative time literals for 'earliest' and 'latest'. If UNIX time or UTC format is used for 'earliest' and 'latest', this field is ignored. For the list of supported timezone formats, see https://docs.splunk.com/Documentation/Splunk/latest/Data/Applytimezoneoffsetstotimestamps#zoneinfo_.28TZ.29_database (type: string, default: \"GMT\").
"""
return self._attrs.get("timezone")
@timezone.setter
def timezone(self, timezone: "object"):
"""Sets the timezone of this QueryParameters.
The timezone that relative time modifiers are based on. The timezone only applies to relative time literals for 'earliest' and 'latest'. If UNIX time or UTC format is used for 'earliest' and 'latest', this field is ignored. For the list of supported timezone formats, see https://docs.splunk.com/Documentation/Splunk/latest/Data/Applytimezoneoffsetstotimestamps#zoneinfo_.28TZ.29_database (type: string, default: \"GMT\").
:param timezone: The timezone of this QueryParameters.
:type: object
"""
self._attrs["timezone"] = timezone
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
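# Illustrative sketch (not part of the generated API, hypothetical time bounds):
# building QueryParameters and serializing it. Snake_case constructor arguments
# are stored under the service's camelCase keys, and to_dict drops unset (None)
# attributes.
def _example_query_parameters():
    params = QueryParameters(earliest="-24h@h", latest="now", timezone="GMT")
    # relative_time_anchor was left unset, so it is absent from the payload.
    assert params.to_dict() == {"earliest": "-24h@h", "latest": "now", "timezone": "GMT"}
    return params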
class SearchStatus(str, Enum):
RUNNING = "running"
DONE = "done"
CANCELED = "canceled"
FAILED = "failed"
@staticmethod
def from_value(value: str):
if value == "running":
return SearchStatus.RUNNING
if value == "done":
return SearchStatus.DONE
if value == "canceled":
return SearchStatus.CANCELED
if value == "failed":
return SearchStatus.FAILED
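# Illustrative sketch: SearchStatus subclasses str, so members compare equal to
# the raw wire strings; from_value maps a service string back to the enum and
# implicitly returns None for unrecognized values.
def _example_search_status():
    status = SearchStatus.from_value("done")
    assert status is SearchStatus.DONE
    assert status == "done"                            # str subclass: plain comparison works
    assert SearchStatus.from_value("paused") is None   # "paused" is a hypothetical unknown value
    return status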
class DeleteSearchJob(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "DeleteSearchJob":
instance = DeleteSearchJob.__new__(DeleteSearchJob)
instance._attrs = model
return instance
def __init__(self, index: "str", module: "str", predicate: "str", allow_side_effects: "bool" = True, collect_event_summary: "bool" = False, collect_field_summary: "bool" = False, collect_time_buckets: "bool" = False, completion_time: "str" = None, dispatch_time: "str" = None, enable_preview: "bool" = False, extract_fields: "str" = 'none', max_time: "int" = 3600, messages: "List[Message]" = None, name: "str" = None, percent_complete: "int" = 0, preview_available: "str" = 'false', query: "str" = None, query_parameters: "QueryParameters" = None, required_freshness: "int" = 0, resolved_earliest: "str" = None, resolved_latest: "str" = None, results_available: "int" = 0, results_preview_available: "int" = 0, sid: "str" = None, status: "SearchStatus" = None, **extra):
"""DeleteSearchJob"""
self._attrs = dict()
if index is not None:
self._attrs["index"] = index
if module is not None:
self._attrs["module"] = module
if predicate is not None:
self._attrs["predicate"] = predicate
if allow_side_effects is not None:
self._attrs["allowSideEffects"] = allow_side_effects
if collect_event_summary is not None:
self._attrs["collectEventSummary"] = collect_event_summary
if collect_field_summary is not None:
self._attrs["collectFieldSummary"] = collect_field_summary
if collect_time_buckets is not None:
self._attrs["collectTimeBuckets"] = collect_time_buckets
if completion_time is not None:
self._attrs["completionTime"] = completion_time
if dispatch_time is not None:
self._attrs["dispatchTime"] = dispatch_time
if enable_preview is not None:
self._attrs["enablePreview"] = enable_preview
if extract_fields is not None:
self._attrs["extractFields"] = extract_fields
if max_time is not None:
self._attrs["maxTime"] = max_time
if messages is not None:
self._attrs["messages"] = messages
if name is not None:
self._attrs["name"] = name
if percent_complete is not None:
self._attrs["percentComplete"] = percent_complete
if preview_available is not None:
self._attrs["previewAvailable"] = preview_available
if query is not None:
self._attrs["query"] = query
if query_parameters is not None:
self._attrs["queryParameters"] = query_parameters.to_dict()
if required_freshness is not None:
self._attrs["requiredFreshness"] = required_freshness
if resolved_earliest is not None:
self._attrs["resolvedEarliest"] = resolved_earliest
if resolved_latest is not None:
self._attrs["resolvedLatest"] = resolved_latest
if results_available is not None:
self._attrs["resultsAvailable"] = results_available
if results_preview_available is not None:
self._attrs["resultsPreviewAvailable"] = results_preview_available
if sid is not None:
self._attrs["sid"] = sid
if status is not None:
self._attrs["status"] = status
for k, v in extra.items():
self._attrs[k] = v
@property
def index(self) -> "str":
""" Gets the index of this DeleteSearchJob.
The index to delete the events from.
"""
return self._attrs.get("index")
@index.setter
def index(self, index: "str"):
"""Sets the index of this DeleteSearchJob.
The index to delete the events from.
:param index: The index of this DeleteSearchJob.
:type: str
"""
if index is None:
raise ValueError("Invalid value for `index`, must not be `None`")
self._attrs["index"] = index
@property
def module(self) -> "str":
""" Gets the module of this DeleteSearchJob.
The module to run the delete search job in. The default module is used if the module field is empty.
"""
return self._attrs.get("module")
@module.setter
def module(self, module: "str"):
"""Sets the module of this DeleteSearchJob.
The module to run the delete search job in. The default module is used if the module field is empty.
:param module: The module of this DeleteSearchJob.
:type: str
"""
if module is None:
raise ValueError("Invalid value for `module`, must not be `None`")
self._attrs["module"] = module
@property
def predicate(self) -> "str":
""" Gets the predicate of this DeleteSearchJob.
The predicate expression that identifies the events to delete from the index. This expression must return true or false. To delete all events from the index, specify \"true\" instead of an expression.
"""
return self._attrs.get("predicate")
@predicate.setter
def predicate(self, predicate: "str"):
"""Sets the predicate of this DeleteSearchJob.
The predicate expression that identifies the events to delete from the index. This expression must return true or false. To delete all events from the index, specify \"true\" instead of an expression.
:param predicate: The predicate of this DeleteSearchJob.
:type: str
"""
if predicate is None:
raise ValueError("Invalid value for `predicate`, must not be `None`")
self._attrs["predicate"] = predicate
@property
def allow_side_effects(self) -> "bool":
""" Gets the allow_side_effects of this DeleteSearchJob.
Specifies whether the delete search job, which contains side effects with possible security risks, is allowed to run.
"""
return self._attrs.get("allowSideEffects")
@allow_side_effects.setter
def allow_side_effects(self, allow_side_effects: "bool"):
"""Sets the allow_side_effects of this DeleteSearchJob.
Specifies whether the delete search job, which contains side effects with possible security risks, is allowed to run.
:param allow_side_effects: The allow_side_effects of this DeleteSearchJob.
:type: bool
"""
self._attrs["allowSideEffects"] = allow_side_effects
@property
def collect_event_summary(self) -> "bool":
""" Gets the collect_event_summary of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
"""
return self._attrs.get("collectEventSummary")
@collect_event_summary.setter
def collect_event_summary(self, collect_event_summary: "bool"):
"""Sets the collect_event_summary of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
:param collect_event_summary: The collect_event_summary of this DeleteSearchJob.
:type: bool
"""
self._attrs["collectEventSummary"] = collect_event_summary
@property
def collect_field_summary(self) -> "bool":
""" Gets the collect_field_summary of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
"""
return self._attrs.get("collectFieldSummary")
@collect_field_summary.setter
def collect_field_summary(self, collect_field_summary: "bool"):
"""Sets the collect_field_summary of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
:param collect_field_summary: The collect_field_summary of this DeleteSearchJob.
:type: bool
"""
self._attrs["collectFieldSummary"] = collect_field_summary
@property
def collect_time_buckets(self) -> "bool":
""" Gets the collect_time_buckets of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
"""
return self._attrs.get("collectTimeBuckets")
@collect_time_buckets.setter
def collect_time_buckets(self, collect_time_buckets: "bool"):
"""Sets the collect_time_buckets of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
:param collect_time_buckets: The collect_time_buckets of this DeleteSearchJob.
:type: bool
"""
self._attrs["collectTimeBuckets"] = collect_time_buckets
@property
def completion_time(self) -> "str":
""" Gets the completion_time of this DeleteSearchJob.
The time, in GMT, that the search job is finished. Empty if the search job has not completed.
"""
return self._attrs.get("completionTime")
@completion_time.setter
def completion_time(self, completion_time: "str"):
"""Sets the completion_time of this DeleteSearchJob.
The time, in GMT, that the search job is finished. Empty if the search job has not completed.
:param completion_time: The completion_time of this DeleteSearchJob.
:type: str
"""
self._attrs["completionTime"] = completion_time
@property
def dispatch_time(self) -> "str":
""" Gets the dispatch_time of this DeleteSearchJob.
The time, in GMT, that the search job is dispatched.
"""
return self._attrs.get("dispatchTime")
@dispatch_time.setter
def dispatch_time(self, dispatch_time: "str"):
"""Sets the dispatch_time of this DeleteSearchJob.
The time, in GMT, that the search job is dispatched.
:param dispatch_time: The dispatch_time of this DeleteSearchJob.
:type: str
"""
self._attrs["dispatchTime"] = dispatch_time
@property
def enable_preview(self) -> "bool":
""" Gets the enable_preview of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
"""
return self._attrs.get("enablePreview")
@enable_preview.setter
def enable_preview(self, enable_preview: "bool"):
"""Sets the enable_preview of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
:param enable_preview: The enable_preview of this DeleteSearchJob.
:type: bool
"""
self._attrs["enablePreview"] = enable_preview
@property
def extract_fields(self) -> "str":
""" Gets the extract_fields of this DeleteSearchJob.
Specifies how the Search service should extract fields. Valid values include 'all', 'none', or 'indexed'. 'all' extracts all fields, 'indexed' extracts only indexed fields, and 'none' extracts only the default fields. This parameter overrides the value of the 'extractAllFields' parameter. Set to 'none' for better search performance.
"""
return self._attrs.get("extractFields")
@extract_fields.setter
def extract_fields(self, extract_fields: "str"):
"""Sets the extract_fields of this DeleteSearchJob.
Specifies how the Search service should extract fields. Valid values include 'all', 'none', or 'indexed'. 'all' extracts all fields, 'indexed' extracts only indexed fields, and 'none' extracts only the default fields. This parameter overrides the value of the 'extractAllFields' parameter. Set to 'none' for better search performance.
:param extract_fields: The extract_fields of this DeleteSearchJob.
:type: str
"""
self._attrs["extractFields"] = extract_fields
@property
def max_time(self) -> "int":
""" Gets the max_time of this DeleteSearchJob.
The amount of time, in seconds, to run the delete search job before finalizing the search. The maximum value is 3600 seconds (1 hour).
"""
return self._attrs.get("maxTime")
@max_time.setter
def max_time(self, max_time: "int"):
"""Sets the max_time of this DeleteSearchJob.
The amount of time, in seconds, to run the delete search job before finalizing the search. The maximum value is 3600 seconds (1 hour).
:param max_time: The max_time of this DeleteSearchJob.
:type: int
"""
self._attrs["maxTime"] = max_time
@property
def messages(self) -> "List[Message]":
""" Gets the messages of this DeleteSearchJob.
"""
return [Message._from_dict(i) for i in self._attrs.get("messages") or []]  # guard against an unset field
@messages.setter
def messages(self, messages: "List[Message]"):
"""Sets the messages of this DeleteSearchJob.
:param messages: The messages of this DeleteSearchJob.
:type: List[Message]
"""
self._attrs["messages"] = messages
@property
def name(self) -> "str":
""" Gets the name of this DeleteSearchJob.
The name of the search job.
"""
return self._attrs.get("name")
@name.setter
def name(self, name: "str"):
"""Sets the name of this DeleteSearchJob.
The name of the search job.
:param name: The name of this DeleteSearchJob.
:type: str
"""
self._attrs["name"] = name
@property
def percent_complete(self) -> "int":
""" Gets the percent_complete of this DeleteSearchJob.
An estimate, as a percentage, of how close the delete search job is to completion.
"""
return self._attrs.get("percentComplete")
@percent_complete.setter
def percent_complete(self, percent_complete: "int"):
"""Sets the percent_complete of this DeleteSearchJob.
An estimate, as a percentage, of how close the delete search job is to completion.
:param percent_complete: The percent_complete of this DeleteSearchJob.
:type: int
"""
self._attrs["percentComplete"] = percent_complete
@property
def preview_available(self) -> "str":
""" Gets the preview_available of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
"""
return self._attrs.get("previewAvailable")
@preview_available.setter
def preview_available(self, preview_available: "str"):
"""Sets the preview_available of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to false by default.
:param preview_available: The preview_available of this DeleteSearchJob.
:type: str
"""
self._attrs["previewAvailable"] = preview_available
@property
def query(self) -> "str":
""" Gets the query of this DeleteSearchJob.
The SPL search string that includes the index, module, and predicate that you specify.
"""
return self._attrs.get("query")
@query.setter
def query(self, query: "str"):
"""Sets the query of this DeleteSearchJob.
The SPL search string that includes the index, module, and predicate that you specify.
:param query: The query of this DeleteSearchJob.
:type: str
"""
self._attrs["query"] = query
@property
def query_parameters(self) -> "QueryParameters":
""" Gets the query_parameters of this DeleteSearchJob.
Represents parameters on the search job such as 'earliest' and 'latest'.
"""
return QueryParameters._from_dict(self._attrs.get("queryParameters") or {})  # guard against an unset field
@query_parameters.setter
def query_parameters(self, query_parameters: "QueryParameters"):
"""Sets the query_parameters of this DeleteSearchJob.
Represents parameters on the search job such as 'earliest' and 'latest'.
:param query_parameters: The query_parameters of this DeleteSearchJob.
:type: QueryParameters
"""
self._attrs["queryParameters"] = query_parameters.to_dict()
@property
def required_freshness(self) -> "int":
""" Gets the required_freshness of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to 0 by default.
"""
return self._attrs.get("requiredFreshness")
@required_freshness.setter
def required_freshness(self, required_freshness: "int"):
"""Sets the required_freshness of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to 0 by default.
:param required_freshness: The required_freshness of this DeleteSearchJob.
:type: int
"""
self._attrs["requiredFreshness"] = required_freshness
@property
def resolved_earliest(self) -> "str":
""" Gets the resolved_earliest of this DeleteSearchJob.
The earliest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'earliest' queryParameters.
"""
return self._attrs.get("resolvedEarliest")
@resolved_earliest.setter
def resolved_earliest(self, resolved_earliest: "str"):
"""Sets the resolved_earliest of this DeleteSearchJob.
The earliest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'earliest' queryParameters.
:param resolved_earliest: The resolved_earliest of this DeleteSearchJob.
:type: str
"""
self._attrs["resolvedEarliest"] = resolved_earliest
@property
def resolved_latest(self) -> "str":
""" Gets the resolved_latest of this DeleteSearchJob.
The latest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'latest' queryParameters.
"""
return self._attrs.get("resolvedLatest")
@resolved_latest.setter
def resolved_latest(self, resolved_latest: "str"):
"""Sets the resolved_latest of this DeleteSearchJob.
The latest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'latest' queryParameters.
:param resolved_latest: The resolved_latest of this DeleteSearchJob.
:type: str
"""
self._attrs["resolvedLatest"] = resolved_latest
@property
def results_available(self) -> "int":
""" Gets the results_available of this DeleteSearchJob.
The number of results that the delete search job has produced so far; these results identify the events to be deleted.
"""
return self._attrs.get("resultsAvailable")
@results_available.setter
def results_available(self, results_available: "int"):
"""Sets the results_available of this DeleteSearchJob.
The number of results that the delete search job has produced so far; these results identify the events to be deleted.
:param results_available: The results_available of this DeleteSearchJob.
:type: int
"""
self._attrs["resultsAvailable"] = results_available
@property
def results_preview_available(self) -> "int":
""" Gets the results_preview_available of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to 0 by default.
"""
return self._attrs.get("resultsPreviewAvailable")
@results_preview_available.setter
def results_preview_available(self, results_preview_available: "int"):
"""Sets the results_preview_available of this DeleteSearchJob.
This property does not apply to the delete search jobs endpoint and is set to 0 by default.
:param results_preview_available: The results_preview_available of this DeleteSearchJob.
:type: int
"""
self._attrs["resultsPreviewAvailable"] = results_preview_available
@property
def sid(self) -> "str":
""" Gets the sid of this DeleteSearchJob.
The ID assigned to the delete search job.
"""
return self._attrs.get("sid")
@sid.setter
def sid(self, sid: "str"):
"""Sets the sid of this DeleteSearchJob.
The ID assigned to the delete search job.
:param sid: The sid of this DeleteSearchJob.
:type: str
"""
self._attrs["sid"] = sid
@property
def status(self) -> "SearchStatus":
""" Gets the status of this DeleteSearchJob.
"""
return SearchStatus.from_value(self._attrs.get("status"))
@status.setter
def status(self, status: "SearchStatus"):
"""Sets the status of this DeleteSearchJob.
:param status: The status of this DeleteSearchJob.
:type: SearchStatus
"""
if isinstance(status, Enum):
self._attrs["status"] = status.value
else:
self._attrs["status"] = status # If you supply a string, we presume you know the service will take it.
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
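# Illustrative sketch with hypothetical index/module/predicate values: the three
# required constructor arguments identify the events to delete, and a nested
# QueryParameters object is flattened into the payload via its own to_dict.
def _example_delete_search_job():
    job = DeleteSearchJob(
        index="main",                 # hypothetical index name
        module="",                    # empty string selects the default module
        predicate='host="badhost"',   # hypothetical predicate; matching events are deleted
        query_parameters=QueryParameters(earliest="-7d@d", latest="now"),
    )
    body = job.to_dict()
    assert body["queryParameters"] == {"earliest": "-7d@d", "latest": "now"}
    assert body["predicate"] == 'host="badhost"'
    return job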
class SingleFieldSummary(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "SingleFieldSummary":
instance = SingleFieldSummary.__new__(SingleFieldSummary)
instance._attrs = model
return instance
def __init__(self, count: "int" = None, distinct_count: "int" = None, is_exact: "bool" = None, max: "str" = None, mean: "float" = None, min: "str" = None, modes: "List[SingleValueMode]" = None, numeric_count: "int" = None, relevant: "bool" = None, stddev: "float" = None, **extra):
"""SingleFieldSummary"""
self._attrs = dict()
if count is not None:
self._attrs["count"] = count
if distinct_count is not None:
self._attrs["distinctCount"] = distinct_count
if is_exact is not None:
self._attrs["isExact"] = is_exact
if max is not None:
self._attrs["max"] = max
if mean is not None:
self._attrs["mean"] = mean
if min is not None:
self._attrs["min"] = min
if modes is not None:
self._attrs["modes"] = modes
if numeric_count is not None:
self._attrs["numericCount"] = numeric_count
if relevant is not None:
self._attrs["relevant"] = relevant
if stddev is not None:
self._attrs["stddev"] = stddev
for k, v in extra.items():
self._attrs[k] = v
@property
def count(self) -> "int":
""" Gets the count of this SingleFieldSummary.
The total number of events that the field appears in.
"""
return self._attrs.get("count")
@count.setter
def count(self, count: "int"):
"""Sets the count of this SingleFieldSummary.
The total number of events that the field appears in.
:param count: The count of this SingleFieldSummary.
:type: int
"""
self._attrs["count"] = count
@property
def distinct_count(self) -> "int":
""" Gets the distinct_count of this SingleFieldSummary.
The total number of unique values in the field.
"""
return self._attrs.get("distinctCount")
@distinct_count.setter
def distinct_count(self, distinct_count: "int"):
"""Sets the distinct_count of this SingleFieldSummary.
The total number of unique values in the field.
:param distinct_count: The distinct_count of this SingleFieldSummary.
:type: int
"""
self._attrs["distinctCount"] = distinct_count
@property
def is_exact(self) -> "bool":
""" Gets the is_exact of this SingleFieldSummary.
Specifies if the 'distinctCount' is accurate. When the count exceeds the maximum count, an approximate count is computed instead and the 'isExact' property is FALSE.
"""
return self._attrs.get("isExact")
@is_exact.setter
def is_exact(self, is_exact: "bool"):
"""Sets the is_exact of this SingleFieldSummary.
Specifies if the 'distinctCount' is accurate. When the count exceeds the maximum count, an approximate count is computed instead and the 'isExact' property is FALSE.
:param is_exact: The is_exact of this SingleFieldSummary.
:type: bool
"""
self._attrs["isExact"] = is_exact
@property
def max(self) -> "str":
""" Gets the max of this SingleFieldSummary.
The maximum numeric value in the field.
"""
return self._attrs.get("max")
@max.setter
def max(self, max: "str"):
"""Sets the max of this SingleFieldSummary.
The maximum numeric value in the field.
:param max: The max of this SingleFieldSummary.
:type: str
"""
self._attrs["max"] = max
@property
def mean(self) -> "float":
""" Gets the mean of this SingleFieldSummary.
The mean (average) for the numeric value in the field.
"""
return self._attrs.get("mean")
@mean.setter
def mean(self, mean: "float"):
"""Sets the mean of this SingleFieldSummary.
The mean (average) for the numeric value in the field.
:param mean: The mean of this SingleFieldSummary.
:type: float
"""
self._attrs["mean"] = mean
@property
def min(self) -> "str":
""" Gets the min of this SingleFieldSummary.
The minimum numeric value in the field.
"""
return self._attrs.get("min")
@min.setter
def min(self, min: "str"):
"""Sets the min of this SingleFieldSummary.
The minimum numeric value in the field.
:param min: The min of this SingleFieldSummary.
:type: str
"""
self._attrs["min"] = min
@property
def modes(self) -> "List[SingleValueMode]":
""" Gets the modes of this SingleFieldSummary.
An array of the distinct values in the field, each with its occurrence count.
"""
return [SingleValueMode._from_dict(i) for i in self._attrs.get("modes") or []]  # guard against an unset field
@modes.setter
def modes(self, modes: "List[SingleValueMode]"):
"""Sets the modes of this SingleFieldSummary.
An array of the distinct values in the field, each with its occurrence count.
:param modes: The modes of this SingleFieldSummary.
:type: List[SingleValueMode]
"""
self._attrs["modes"] = modes
@property
def numeric_count(self) -> "int":
""" Gets the numeric_count of this SingleFieldSummary.
The count of the numeric values in the field.
"""
return self._attrs.get("numericCount")
@numeric_count.setter
def numeric_count(self, numeric_count: "int"):
"""Sets the numeric_count of this SingleFieldSummary.
The count of the numeric values in the field.
:param numeric_count: The numeric_count of this SingleFieldSummary.
:type: int
"""
self._attrs["numericCount"] = numeric_count
@property
def relevant(self) -> "bool":
""" Gets the relevant of this SingleFieldSummary.
Specifies if the field was added or changed by the search.
"""
return self._attrs.get("relevant")
@relevant.setter
def relevant(self, relevant: "bool"):
"""Sets the relevant of this SingleFieldSummary.
Specifies if the field was added or changed by the search.
:param relevant: The relevant of this SingleFieldSummary.
:type: bool
"""
self._attrs["relevant"] = relevant
@property
def stddev(self) -> "float":
""" Gets the stddev of this SingleFieldSummary.
The standard deviation for the numeric values in the field.
"""
return self._attrs.get("stddev")
@stddev.setter
def stddev(self, stddev: "float"):
"""Sets the stddev of this SingleFieldSummary.
The standard deviation for the numeric values in the field.
:param stddev: The stddev of this SingleFieldSummary.
:type: float
"""
self._attrs["stddev"] = stddev
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
class SingleValueMode(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "SingleValueMode":
instance = SingleValueMode.__new__(SingleValueMode)
instance._attrs = model
return instance
def __init__(self, count: "int" = None, is_exact: "bool" = None, value: "str" = None, **extra):
"""SingleValueMode"""
self._attrs = dict()
if count is not None:
self._attrs["count"] = count
if is_exact is not None:
self._attrs["isExact"] = is_exact
if value is not None:
self._attrs["value"] = value
for k, v in extra.items():
self._attrs[k] = v
@property
def count(self) -> "int":
""" Gets the count of this SingleValueMode.
The number of times the value appears in the field.
"""
return self._attrs.get("count")
@count.setter
def count(self, count: "int"):
"""Sets the count of this SingleValueMode.
The number of times the value appears in the field.
:param count: The count of this SingleValueMode.
:type: int
"""
self._attrs["count"] = count
@property
def is_exact(self) -> "bool":
""" Gets the is_exact of this SingleValueMode.
Specifies if the count is accurate. When the count exceeds the maximum count, an approximate count is computed instead and the 'isExact' property is FALSE.
"""
return self._attrs.get("isExact")
@is_exact.setter
def is_exact(self, is_exact: "bool"):
"""Sets the is_exact of this SingleValueMode.
Specifies if the count is accurate. When the count exceeds the maximum count, an approximate count is computed instead and the 'isExact' property is FALSE.
:param is_exact: The is_exact of this SingleValueMode.
:type: bool
"""
self._attrs["isExact"] = is_exact
@property
def value(self) -> "str":
""" Gets the value of this SingleValueMode.
The value in the field.
"""
return self._attrs.get("value")
@value.setter
def value(self, value: "str"):
"""Sets the value of this SingleValueMode.
The value in the field.
:param value: The value of this SingleValueMode.
:type: str
"""
self._attrs["value"] = value
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
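# Illustrative sketch (hypothetical payload): deserializing a per-field summary.
# _from_dict stores the raw dict, and the modes property rehydrates each entry
# as a SingleValueMode on access.
def _example_field_summary():
    payload = {
        "count": 120,
        "distinctCount": 3,
        "isExact": True,
        "modes": [{"count": 80, "isExact": True, "value": "GET"}],
    }
    summary = SingleFieldSummary._from_dict(payload)
    top_mode = summary.modes[0]       # a SingleValueMode instance
    assert summary.distinct_count == 3 and top_mode.value == "GET"
    return summary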
class FieldsSummary(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "FieldsSummary":
instance = FieldsSummary.__new__(FieldsSummary)
instance._attrs = model
return instance
def __init__(self, duration: "float" = None, earliest_time: "str" = None, event_count: "int" = None, fields: "Dict[str, SingleFieldSummary]" = None, latest_time: "str" = None, **extra):
"""FieldsSummary"""
self._attrs = dict()
if duration is not None:
self._attrs["duration"] = duration
if earliest_time is not None:
self._attrs["earliestTime"] = earliest_time
if event_count is not None:
self._attrs["eventCount"] = event_count
if fields is not None:
self._attrs["fields"] = fields
if latest_time is not None:
self._attrs["latestTime"] = latest_time
for k, v in extra.items():
self._attrs[k] = v
@property
def duration(self) -> "float":
""" Gets the duration of this FieldsSummary.
The amount of time, in seconds, that a time bucket spans from the earliest to the latest time.
"""
return self._attrs.get("duration")
@duration.setter
def duration(self, duration: "float"):
"""Sets the duration of this FieldsSummary.
The amount of time, in seconds, that a time bucket spans from the earliest to the latest time.
:param duration: The duration of this FieldsSummary.
:type: float
"""
self._attrs["duration"] = duration
@property
def earliest_time(self) -> "str":
""" Gets the earliest_time of this FieldsSummary.
The earliest timestamp, in UTC format, of the events to process.
"""
return self._attrs.get("earliestTime")
@earliest_time.setter
def earliest_time(self, earliest_time: "str"):
"""Sets the earliest_time of this FieldsSummary.
The earliest timestamp, in UTC format, of the events to process.
:param earliest_time: The earliest_time of this FieldsSummary.
:type: str
"""
self._attrs["earliestTime"] = earliest_time
@property
def event_count(self) -> "int":
""" Gets the event_count of this FieldsSummary.
The total number of events for all fields returned in the time range (earliestTime and latestTime) specified.
"""
return self._attrs.get("eventCount")
@event_count.setter
def event_count(self, event_count: "int"):
"""Sets the event_count of this FieldsSummary.
The total number of events for all fields returned in the time range (earliestTime and latestTime) specified.
:param event_count: The event_count of this FieldsSummary.
:type: int
"""
self._attrs["eventCount"] = event_count
@property
def fields(self) -> "Dict[str, SingleFieldSummary]":
""" Gets the fields of this FieldsSummary.
A map of field names to their summaries for the fields in the specified time range.
"""
return self._attrs.get("fields")
@fields.setter
def fields(self, fields: "Dict[str, SingleFieldSummary]"):
"""Sets the fields of this FieldsSummary.
A map of field names to their summaries for the fields in the specified time range.
:param fields: The fields of this FieldsSummary.
:type: Dict[str, SingleFieldSummary]
"""
self._attrs["fields"] = fields
@property
def latest_time(self) -> "str":
""" Gets the latest_time of this FieldsSummary.
The latest timestamp, in UTC format, of the events to process.
"""
return self._attrs.get("latestTime")
@latest_time.setter
def latest_time(self, latest_time: "str"):
"""Sets the latest_time of this FieldsSummary.
The latest timestamp, in UTC format, of the events to process.
:param latest_time: The latest_time of this FieldsSummary.
:type: str
"""
self._attrs["latestTime"] = latest_time
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
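# Illustrative sketch (hypothetical payload): note that, as generated, the
# fields property returns the raw dict keyed by field name, so each entry must
# be rehydrated explicitly with SingleFieldSummary._from_dict.
def _example_fields_summary():
    payload = {
        "earliestTime": "2021-01-25T13:15:30Z",
        "latestTime": "2021-01-25T14:15:30Z",
        "eventCount": 42,
        "fields": {"status": {"count": 42, "distinctCount": 5}},
    }
    summary = FieldsSummary._from_dict(payload)
    status_field = SingleFieldSummary._from_dict(summary.fields["status"])
    assert summary.event_count == 42 and status_field.count == 42
    return summary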
class ListPreviewResultsResponseFields(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "ListPreviewResultsResponseFields":
instance = ListPreviewResultsResponseFields.__new__(ListPreviewResultsResponseFields)
instance._attrs = model
return instance
def __init__(self, name: "str", data_source: "str" = None, groupby_rank: "str" = None, split_field: "str" = None, split_value: "str" = None, splitby_special: "str" = None, type_special: "str" = None, **extra):
"""ListPreviewResultsResponseFields"""
self._attrs = dict()
if name is not None:
self._attrs["name"] = name
if data_source is not None:
self._attrs["dataSource"] = data_source
if groupby_rank is not None:
self._attrs["groupbyRank"] = groupby_rank
if split_field is not None:
self._attrs["splitField"] = split_field
if split_value is not None:
self._attrs["splitValue"] = split_value
if splitby_special is not None:
self._attrs["splitbySpecial"] = splitby_special
if type_special is not None:
self._attrs["typeSpecial"] = type_special
for k, v in extra.items():
self._attrs[k] = v
@property
def name(self) -> "str":
""" Gets the name of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("name")
@name.setter
def name(self, name: "str"):
"""Sets the name of this ListPreviewResultsResponseFields.
:param name: The name of this ListPreviewResultsResponseFields.
:type: str
"""
if name is None:
raise ValueError("Invalid value for `name`, must not be `None`")
self._attrs["name"] = name
@property
def data_source(self) -> "str":
""" Gets the data_source of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("dataSource")
@data_source.setter
def data_source(self, data_source: "str"):
"""Sets the data_source of this ListPreviewResultsResponseFields.
:param data_source: The data_source of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["dataSource"] = data_source
@property
def groupby_rank(self) -> "str":
""" Gets the groupby_rank of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("groupbyRank")
@groupby_rank.setter
def groupby_rank(self, groupby_rank: "str"):
"""Sets the groupby_rank of this ListPreviewResultsResponseFields.
:param groupby_rank: The groupby_rank of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["groupbyRank"] = groupby_rank
@property
def split_field(self) -> "str":
""" Gets the split_field of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("splitField")
@split_field.setter
def split_field(self, split_field: "str"):
"""Sets the split_field of this ListPreviewResultsResponseFields.
:param split_field: The split_field of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["splitField"] = split_field
@property
def split_value(self) -> "str":
""" Gets the split_value of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("splitValue")
@split_value.setter
def split_value(self, split_value: "str"):
"""Sets the split_value of this ListPreviewResultsResponseFields.
:param split_value: The split_value of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["splitValue"] = split_value
@property
def splitby_special(self) -> "str":
""" Gets the splitby_special of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("splitbySpecial")
@splitby_special.setter
def splitby_special(self, splitby_special: "str"):
"""Sets the splitby_special of this ListPreviewResultsResponseFields.
:param splitby_special: The splitby_special of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["splitbySpecial"] = splitby_special
@property
def type_special(self) -> "str":
""" Gets the type_special of this ListPreviewResultsResponseFields.
"""
return self._attrs.get("typeSpecial")
@type_special.setter
def type_special(self, type_special: "str"):
"""Sets the type_special of this ListPreviewResultsResponseFields.
:param type_special: The type_special of this ListPreviewResultsResponseFields.
:type: str
"""
self._attrs["typeSpecial"] = type_special
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
class ListPreviewResultsResponse(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "ListPreviewResultsResponse":
instance = ListPreviewResultsResponse.__new__(ListPreviewResultsResponse)
instance._attrs = model
return instance
def __init__(self, is_preview_stable: "bool", results: "List[object]", fields: "List[ListPreviewResultsResponseFields]" = None, messages: "List[Message]" = None, next_link: "str" = None, wait: "str" = None, **extra):
"""ListPreviewResultsResponse"""
self._attrs = dict()
if is_preview_stable is not None:
self._attrs["isPreviewStable"] = is_preview_stable
if results is not None:
self._attrs["results"] = results
if fields is not None:
self._attrs["fields"] = fields
if messages is not None:
self._attrs["messages"] = messages
if next_link is not None:
self._attrs["nextLink"] = next_link
if wait is not None:
self._attrs["wait"] = wait
for k, v in extra.items():
self._attrs[k] = v
@property
def is_preview_stable(self) -> "bool":
""" Gets the is_preview_stable of this ListPreviewResultsResponse.
"""
return self._attrs.get("isPreviewStable")
@is_preview_stable.setter
def is_preview_stable(self, is_preview_stable: "bool"):
"""Sets the is_preview_stable of this ListPreviewResultsResponse.
:param is_preview_stable: The is_preview_stable of this ListPreviewResultsResponse.
:type: bool
"""
if is_preview_stable is None:
raise ValueError("Invalid value for `is_preview_stable`, must not be `None`")
self._attrs["isPreviewStable"] = is_preview_stable
@property
def results(self) -> "List[object]":
""" Gets the results of this ListPreviewResultsResponse.
"""
return self._attrs.get("results")
@results.setter
def results(self, results: "List[object]"):
"""Sets the results of this ListPreviewResultsResponse.
:param results: The results of this ListPreviewResultsResponse.
:type: List[object]
"""
if results is None:
raise ValueError("Invalid value for `results`, must not be `None`")
self._attrs["results"] = results
@property
def fields(self) -> "List[ListPreviewResultsResponseFields]":
""" Gets the fields of this ListPreviewResultsResponse.
"""
return [ListPreviewResultsResponseFields._from_dict(i) for i in self._attrs.get("fields") or []]  # guard against an unset field
@fields.setter
def fields(self, fields: "List[ListPreviewResultsResponseFields]"):
"""Sets the fields of this ListPreviewResultsResponse.
:param fields: The fields of this ListPreviewResultsResponse.
:type: List[ListPreviewResultsResponseFields]
"""
self._attrs["fields"] = fields
@property
def messages(self) -> "List[Message]":
""" Gets the messages of this ListPreviewResultsResponse.
"""
return [Message._from_dict(i) for i in self._attrs.get("messages") or []]  # guard against an unset field
@messages.setter
def messages(self, messages: "List[Message]"):
"""Sets the messages of this ListPreviewResultsResponse.
:param messages: The messages of this ListPreviewResultsResponse.
:type: List[Message]
"""
self._attrs["messages"] = messages
@property
def next_link(self) -> "str":
""" Gets the next_link of this ListPreviewResultsResponse.
"""
return self._attrs.get("nextLink")
@next_link.setter
def next_link(self, next_link: "str"):
"""Sets the next_link of this ListPreviewResultsResponse.
:param next_link: The next_link of this ListPreviewResultsResponse.
:type: str
"""
self._attrs["nextLink"] = next_link
@property
def wait(self) -> "str":
""" Gets the wait of this ListPreviewResultsResponse.
"""
return self._attrs.get("wait")
@wait.setter
def wait(self, wait: "str"):
"""Sets the wait of this ListPreviewResultsResponse.
:param wait: The wait of this ListPreviewResultsResponse.
:type: str
"""
self._attrs["wait"] = wait
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
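# Illustrative sketch (hypothetical payload): results stays a raw list of dicts,
# while the fields property rehydrates its entries into
# ListPreviewResultsResponseFields objects on access.
def _example_preview_results():
    payload = {
        "isPreviewStable": False,
        "results": [{"host": "web-01", "status": "200"}],
        "fields": [{"name": "host"}, {"name": "status"}],
    }
    response = ListPreviewResultsResponse._from_dict(payload)
    assert response.is_preview_stable is False
    assert [f.name for f in response.fields] == ["host", "status"]
    return response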
class ListSearchResultsResponse(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "ListSearchResultsResponse":
instance = ListSearchResultsResponse.__new__(ListSearchResultsResponse)
instance._attrs = model
return instance
def __init__(self, results: "List[object]", fields: "List[ListPreviewResultsResponseFields]" = None, messages: "List[Message]" = None, next_link: "str" = None, wait: "str" = None, **extra):
"""ListSearchResultsResponse"""
self._attrs = dict()
if results is not None:
self._attrs["results"] = results
if fields is not None:
self._attrs["fields"] = fields
if messages is not None:
self._attrs["messages"] = messages
if next_link is not None:
self._attrs["nextLink"] = next_link
if wait is not None:
self._attrs["wait"] = wait
for k, v in extra.items():
self._attrs[k] = v
@property
def results(self) -> "List[object]":
""" Gets the results of this ListSearchResultsResponse.
"""
return self._attrs.get("results")
@results.setter
def results(self, results: "List[object]"):
"""Sets the results of this ListSearchResultsResponse.
:param results: The results of this ListSearchResultsResponse.
:type: List[object]
"""
if results is None:
raise ValueError("Invalid value for `results`, must not be `None`")
self._attrs["results"] = results
@property
def fields(self) -> "List[ListPreviewResultsResponseFields]":
""" Gets the fields of this ListSearchResultsResponse.
"""
return [ListPreviewResultsResponseFields._from_dict(i) for i in self._attrs.get("fields") or []]  # guard against an unset field
@fields.setter
def fields(self, fields: "List[ListPreviewResultsResponseFields]"):
"""Sets the fields of this ListSearchResultsResponse.
:param fields: The fields of this ListSearchResultsResponse.
:type: List[ListPreviewResultsResponseFields]
"""
self._attrs["fields"] = fields
@property
def messages(self) -> "List[Message]":
""" Gets the messages of this ListSearchResultsResponse.
"""
return [Message._from_dict(i) for i in self._attrs.get("messages") or []]  # guard against an unset field
@messages.setter
def messages(self, messages: "List[Message]"):
"""Sets the messages of this ListSearchResultsResponse.
:param messages: The messages of this ListSearchResultsResponse.
:type: List[Message]
"""
self._attrs["messages"] = messages
@property
def next_link(self) -> "str":
""" Gets the next_link of this ListSearchResultsResponse.
"""
return self._attrs.get("nextLink")
@next_link.setter
def next_link(self, next_link: "str"):
"""Sets the next_link of this ListSearchResultsResponse.
:param next_link: The next_link of this ListSearchResultsResponse.
:type: str
"""
self._attrs["nextLink"] = next_link
@property
def wait(self) -> "str":
""" Gets the wait of this ListSearchResultsResponse.
"""
return self._attrs.get("wait")
@wait.setter
def wait(self, wait: "str"):
"""Sets the wait of this ListSearchResultsResponse.
:param wait: The wait of this ListSearchResultsResponse.
:type: str
"""
self._attrs["wait"] = wait
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
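# Illustrative sketch (hypothetical payload and path): a page of final search
# results; when nextLink is present it points at the next page, so a caller
# would keep fetching until it is absent.
def _example_search_results_page():
    payload = {
        "results": [{"host": "web-01"}, {"host": "web-02"}],
        "nextLink": "/search/v3alpha1/jobs/SID123/results?offset=2",  # hypothetical URL
    }
    response = ListSearchResultsResponse._from_dict(payload)
    assert len(response.results) == 2
    assert response.next_link.endswith("offset=2")
    return response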
class SearchJob(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "SearchJob":
instance = SearchJob.__new__(SearchJob)
instance._attrs = model
return instance
def __init__(self, query: "str", allow_side_effects: "bool" = False, collect_event_summary: "bool" = False, collect_field_summary: "bool" = False, collect_time_buckets: "bool" = False, completion_time: "str" = None, dispatch_time: "str" = None, enable_preview: "bool" = False, extract_all_fields: "bool" = False, extract_fields: "str" = 'none', max_time: "int" = 3600, messages: "List[Message]" = None, module: "str" = '', name: "str" = None, percent_complete: "int" = 0, preview_available: "str" = 'false', query_parameters: "QueryParameters" = None, required_freshness: "int" = 0, resolved_earliest: "str" = None, resolved_latest: "str" = None, results_available: "int" = 0, results_preview_available: "int" = 0, sid: "str" = None, status: "SearchStatus" = None, **extra):
"""SearchJob"""
self._attrs = dict()
if query is not None:
self._attrs["query"] = query
if allow_side_effects is not None:
self._attrs["allowSideEffects"] = allow_side_effects
if collect_event_summary is not None:
self._attrs["collectEventSummary"] = collect_event_summary
if collect_field_summary is not None:
self._attrs["collectFieldSummary"] = collect_field_summary
if collect_time_buckets is not None:
self._attrs["collectTimeBuckets"] = collect_time_buckets
if completion_time is not None:
self._attrs["completionTime"] = completion_time
if dispatch_time is not None:
self._attrs["dispatchTime"] = dispatch_time
if enable_preview is not None:
self._attrs["enablePreview"] = enable_preview
if extract_all_fields is not None:
self._attrs["extractAllFields"] = extract_all_fields
if extract_fields is not None:
self._attrs["extractFields"] = extract_fields
if max_time is not None:
self._attrs["maxTime"] = max_time
if messages is not None:
self._attrs["messages"] = messages
if module is not None:
self._attrs["module"] = module
if name is not None:
self._attrs["name"] = name
if percent_complete is not None:
self._attrs["percentComplete"] = percent_complete
if preview_available is not None:
self._attrs["previewAvailable"] = preview_available
if query_parameters is not None:
self._attrs["queryParameters"] = query_parameters.to_dict()
if required_freshness is not None:
self._attrs["requiredFreshness"] = required_freshness
if resolved_earliest is not None:
self._attrs["resolvedEarliest"] = resolved_earliest
if resolved_latest is not None:
self._attrs["resolvedLatest"] = resolved_latest
if results_available is not None:
self._attrs["resultsAvailable"] = results_available
if results_preview_available is not None:
self._attrs["resultsPreviewAvailable"] = results_preview_available
if sid is not None:
self._attrs["sid"] = sid
if status is not None:
self._attrs["status"] = status
for k, v in extra.items():
self._attrs[k] = v
@property
def query(self) -> "str":
""" Gets the query of this SearchJob.
The SPL search string.
"""
return self._attrs.get("query")
@query.setter
def query(self, query: "str"):
"""Sets the query of this SearchJob.
The SPL search string.
:param query: The query of this SearchJob.
:type: str
"""
if query is None:
raise ValueError("Invalid value for `query`, must not be `None`")
self._attrs["query"] = query
@property
def allow_side_effects(self) -> "bool":
""" Gets the allow_side_effects of this SearchJob.
Specifies whether a search that contains commands with side effects (with possible security risks) is allowed to run.
"""
return self._attrs.get("allowSideEffects")
@allow_side_effects.setter
def allow_side_effects(self, allow_side_effects: "bool"):
"""Sets the allow_side_effects of this SearchJob.
Specifies whether a search that contains commands with side effects (with possible security risks) is allowed to run.
:param allow_side_effects: The allow_side_effects of this SearchJob.
:type: bool
"""
self._attrs["allowSideEffects"] = allow_side_effects
@property
def collect_event_summary(self) -> "bool":
""" Gets the collect_event_summary of this SearchJob.
Specifies whether a search is allowed to collect event summary information during the run time.
"""
return self._attrs.get("collectEventSummary")
@collect_event_summary.setter
def collect_event_summary(self, collect_event_summary: "bool"):
"""Sets the collect_event_summary of this SearchJob.
Specifies whether a search is allowed to collect event summary information during the run time.
:param collect_event_summary: The collect_event_summary of this SearchJob.
:type: bool
"""
self._attrs["collectEventSummary"] = collect_event_summary
@property
def collect_field_summary(self) -> "bool":
""" Gets the collect_field_summary of this SearchJob.
Specifies whether a search is allowed to collect field summary information during the run time.
"""
return self._attrs.get("collectFieldSummary")
@collect_field_summary.setter
def collect_field_summary(self, collect_field_summary: "bool"):
"""Sets the collect_field_summary of this SearchJob.
Specifies whether a search is allowed to collect field summary information during the run time.
:param collect_field_summary: The collect_field_summary of this SearchJob.
:type: bool
"""
self._attrs["collectFieldSummary"] = collect_field_summary
@property
def collect_time_buckets(self) -> "bool":
""" Gets the collect_time_buckets of this SearchJob.
Specifies whether a search is allowed to collect timeline bucket summary information during the run time.
"""
return self._attrs.get("collectTimeBuckets")
@collect_time_buckets.setter
def collect_time_buckets(self, collect_time_buckets: "bool"):
"""Sets the collect_time_buckets of this SearchJob.
Specifies whether a search is allowed to collect timeline bucket summary information during the run time.
:param collect_time_buckets: The collect_time_buckets of this SearchJob.
:type: bool
"""
self._attrs["collectTimeBuckets"] = collect_time_buckets
@property
def completion_time(self) -> "str":
""" Gets the completion_time of this SearchJob.
The time, in GMT, that the search job is finished. Empty if the search job has not completed.
"""
return self._attrs.get("completionTime")
@completion_time.setter
def completion_time(self, completion_time: "str"):
"""Sets the completion_time of this SearchJob.
The time, in GMT, that the search job is finished. Empty if the search job has not completed.
:param completion_time: The completion_time of this SearchJob.
:type: str
"""
self._attrs["completionTime"] = completion_time
@property
def dispatch_time(self) -> "str":
""" Gets the dispatch_time of this SearchJob.
The time, in GMT, that the search job is dispatched.
"""
return self._attrs.get("dispatchTime")
@dispatch_time.setter
def dispatch_time(self, dispatch_time: "str"):
"""Sets the dispatch_time of this SearchJob.
The time, in GMT, that the search job is dispatched.
:param dispatch_time: The dispatch_time of this SearchJob.
:type: str
"""
self._attrs["dispatchTime"] = dispatch_time
@property
def enable_preview(self) -> "bool":
""" Gets the enable_preview of this SearchJob.
Specifies whether a search is allowed to collect preview results during the run time.
"""
return self._attrs.get("enablePreview")
@enable_preview.setter
def enable_preview(self, enable_preview: "bool"):
"""Sets the enable_preview of this SearchJob.
Specifies whether a search is allowed to collect preview results during the run time.
:param enable_preview: The enable_preview of this SearchJob.
:type: bool
"""
self._attrs["enablePreview"] = enable_preview
@property
def extract_all_fields(self) -> "bool":
""" Gets the extract_all_fields of this SearchJob.
Specifies whether the Search service should extract all of the available fields in the data, including fields not mentioned in the SPL, for the search job. Set to 'false' for better search performance. The 'extractAllFields' parameter is deprecated as of version v3alpha1. Although this parameter continues to function, it might be removed in a future version. Use the 'extractFields' parameter instead.
"""
return self._attrs.get("extractAllFields")
@extract_all_fields.setter
def extract_all_fields(self, extract_all_fields: "bool"):
"""Sets the extract_all_fields of this SearchJob.
Specifies whether the Search service should extract all of the available fields in the data, including fields not mentioned in the SPL, for the search job. Set to 'false' for better search performance. The 'extractAllFields' parameter is deprecated as of version v3alpha1. Although this parameter continues to function, it might be removed in a future version. Use the 'extractFields' parameter instead.
:param extract_all_fields: The extract_all_fields of this SearchJob.
:type: bool
"""
self._attrs["extractAllFields"] = extract_all_fields
@property
def extract_fields(self) -> "str":
""" Gets the extract_fields of this SearchJob.
Specifies how the Search service should extract fields. Valid values include 'all', 'none', or 'indexed'. Use 'all' to extract all fields. Use 'indexed' to extract only indexed fields. Use 'none' to extract only the default fields.
"""
return self._attrs.get("extractFields")
@extract_fields.setter
def extract_fields(self, extract_fields: "str"):
"""Sets the extract_fields of this SearchJob.
Specifies how the Search service should extract fields. Valid values include 'all', 'none', or 'indexed'. Use 'all' to extract all fields. Use 'indexed' to extract only indexed fields. Use 'none' to extract only the default fields.
:param extract_fields: The extract_fields of this SearchJob.
:type: str
"""
self._attrs["extractFields"] = extract_fields
@property
def max_time(self) -> "int":
""" Gets the max_time of this SearchJob.
The number of seconds to run the search before finalizing the search. The maximum value is 3600 seconds (1 hour).
"""
return self._attrs.get("maxTime")
@max_time.setter
def max_time(self, max_time: "int"):
"""Sets the max_time of this SearchJob.
The number of seconds to run the search before finalizing the search. The maximum value is 3600 seconds (1 hour).
:param max_time: The max_time of this SearchJob.
:type: int
"""
self._attrs["maxTime"] = max_time
@property
def messages(self) -> "List[Message]":
""" Gets the messages of this SearchJob.
"""
return [Message._from_dict(i) for i in self._attrs.get("messages") or []]  # guard against an unset field
@messages.setter
def messages(self, messages: "List[Message]"):
"""Sets the messages of this SearchJob.
:param messages: The messages of this SearchJob.
:type: List[Message]
"""
self._attrs["messages"] = messages
@property
def module(self) -> "str":
""" Gets the module of this SearchJob.
The module to run the search in. The default module is used if a module is not specified.
"""
return self._attrs.get("module")
@module.setter
def module(self, module: "str"):
"""Sets the module of this SearchJob.
The module to run the search in. The default module is used if a module is not specified.
:param module: The module of this SearchJob.
:type: str
"""
self._attrs["module"] = module
@property
def name(self) -> "str":
""" Gets the name of this SearchJob.
The name of the search job.
"""
return self._attrs.get("name")
@name.setter
def name(self, name: "str"):
"""Sets the name of this SearchJob.
The name of the search job.
:param name: The name of this SearchJob.
:type: str
"""
self._attrs["name"] = name
@property
def percent_complete(self) -> "int":
""" Gets the percent_complete of this SearchJob.
An estimate, as a percentage, of how close the job is to completion.
"""
return self._attrs.get("percentComplete")
@percent_complete.setter
def percent_complete(self, percent_complete: "int"):
"""Sets the percent_complete of this SearchJob.
An estimate, as a percentage, of how close the job is to completion.
:param percent_complete: The percent_complete of this SearchJob.
:type: int
"""
self._attrs["percentComplete"] = percent_complete
@property
def preview_available(self) -> "str":
""" Gets the preview_available of this SearchJob.
Specifies if preview results for the search job are available. The valid status values are 'unknown', 'true', and 'false'. You must set the 'enablePreview=true' parameter to return preview search results.
"""
return self._attrs.get("previewAvailable")
@preview_available.setter
def preview_available(self, preview_available: "str"):
"""Sets the preview_available of this SearchJob.
Specifies if preview results for the search job are available. The valid status values are 'unknown', 'true', and 'false'. You must set the 'enablePreview=true' parameter to return preview search results.
:param preview_available: The preview_available of this SearchJob.
:type: str
"""
self._attrs["previewAvailable"] = preview_available
@property
def query_parameters(self) -> "QueryParameters":
""" Gets the query_parameters of this SearchJob.
Represents parameters on the search job such as 'earliest' and 'latest'.
"""
return QueryParameters._from_dict(self._attrs.get("queryParameters") or {})  # guard against an unset field
@query_parameters.setter
def query_parameters(self, query_parameters: "QueryParameters"):
"""Sets the query_parameters of this SearchJob.
Represents parameters on the search job such as 'earliest' and 'latest'.
:param query_parameters: The query_parameters of this SearchJob.
:type: QueryParameters
"""
self._attrs["queryParameters"] = query_parameters.to_dict()
@property
def required_freshness(self) -> "int":
""" Gets the required_freshness of this SearchJob.
Specifies a maximum time interval, in seconds, between identical existing searches. The 'requiredFreshness' parameter is used to determine if an existing search with the same query and the same time boundaries can be reused, instead of running the same search again. Freshness is applied to the 'resolvedEarliest' and 'resolvedLatest' parameters. If an existing search has exactly the same criteria as this search and the 'resolvedEarliest' and 'resolvedLatest' values are within the freshness interval, the existing search metadata is returned instead of initiating a new search job. By default, the 'requiredFreshness' parameter is set to 0, which means that the platform does not attempt to use an existing search. The maximum value for the 'requiredFreshness' parameter is 259200 seconds (72 hours).
"""
return self._attrs.get("requiredFreshness")
@required_freshness.setter
def required_freshness(self, required_freshness: "int"):
"""Sets the required_freshness of this SearchJob.
Specifies a maximum time interval, in seconds, between identical existing searches. The 'requiredFreshness' parameter is used to determine if an existing search with the same query and the same time boundaries can be reused, instead of running the same search again. Freshness is applied to the 'resolvedEarliest' and 'resolvedLatest' parameters. If an existing search has exactly the same criteria as this search and the 'resolvedEarliest' and 'resolvedLatest' values are within the freshness interval, the existing search metadata is returned instead of initiating a new search job. By default, the 'requiredFreshness' parameter is set to 0, which means that the platform does not attempt to use an existing search. The maximum value for the 'requiredFreshness' parameter is 259200 seconds (72 hours).
:param required_freshness: The required_freshness of this SearchJob.
:type: int
"""
self._attrs["requiredFreshness"] = required_freshness
@property
def resolved_earliest(self) -> "str":
""" Gets the resolved_earliest of this SearchJob.
The earliest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'earliest' queryParameters.
"""
return self._attrs.get("resolvedEarliest")
@resolved_earliest.setter
def resolved_earliest(self, resolved_earliest: "str"):
"""Sets the resolved_earliest of this SearchJob.
The earliest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'earliest' queryParameters.
:param resolved_earliest: The resolved_earliest of this SearchJob.
:type: str
"""
self._attrs["resolvedEarliest"] = resolved_earliest
@property
def resolved_latest(self) -> "str":
""" Gets the resolved_latest of this SearchJob.
        The latest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'latest' queryParameters.
"""
return self._attrs.get("resolvedLatest")
@resolved_latest.setter
def resolved_latest(self, resolved_latest: "str"):
"""Sets the resolved_latest of this SearchJob.
        The latest time specified as an absolute value in GMT. The time is computed based on the values you specify for the 'timezone' and 'latest' queryParameters.
:param resolved_latest: The resolved_latest of this SearchJob.
:type: str
"""
self._attrs["resolvedLatest"] = resolved_latest
@property
def results_available(self) -> "int":
""" Gets the results_available of this SearchJob.
The number of results produced so far for the search job.
"""
return self._attrs.get("resultsAvailable")
@results_available.setter
def results_available(self, results_available: "int"):
"""Sets the results_available of this SearchJob.
The number of results produced so far for the search job.
:param results_available: The results_available of this SearchJob.
:type: int
"""
self._attrs["resultsAvailable"] = results_available
@property
def results_preview_available(self) -> "int":
""" Gets the results_preview_available of this SearchJob.
The number of the preview search results for the job with the specified search ID (SID). You must set the 'enablePreview=true' parameter to return preview search results.
"""
return self._attrs.get("resultsPreviewAvailable")
@results_preview_available.setter
def results_preview_available(self, results_preview_available: "int"):
"""Sets the results_preview_available of this SearchJob.
The number of the preview search results for the job with the specified search ID (SID). You must set the 'enablePreview=true' parameter to return preview search results.
:param results_preview_available: The results_preview_available of this SearchJob.
:type: int
"""
self._attrs["resultsPreviewAvailable"] = results_preview_available
@property
def sid(self) -> "str":
""" Gets the sid of this SearchJob.
The ID assigned to the search job.
"""
return self._attrs.get("sid")
@sid.setter
def sid(self, sid: "str"):
"""Sets the sid of this SearchJob.
The ID assigned to the search job.
:param sid: The sid of this SearchJob.
:type: str
"""
self._attrs["sid"] = sid
@property
def status(self) -> "SearchStatus":
""" Gets the status of this SearchJob.
"""
return SearchStatus.from_value(self._attrs.get("status"))
@status.setter
def status(self, status: "SearchStatus"):
"""Sets the status of this SearchJob.
:param status: The status of this SearchJob.
:type: SearchStatus
"""
if isinstance(status, Enum):
self._attrs["status"] = status.value
else:
self._attrs["status"] = status # If you supply a string, we presume you know the service will take it.
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
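# Hedged sketch (illustration only, not part of the generated client): the
# 'requiredFreshness' reuse rule documented on SearchJob above. Assuming the
# resolved time boundaries are compared as epoch seconds, an existing search
# is reusable when both boundaries sit within the freshness interval.
def _freshness_allows_reuse(existing_earliest, new_earliest,
                            existing_latest, new_latest,
                            required_freshness_seconds):
    # A value of 0 (the default) disables reuse; 259200 (72 hours) is the max.
    if required_freshness_seconds <= 0:
        return False
    return (abs(existing_earliest - new_earliest) <= required_freshness_seconds
            and abs(existing_latest - new_latest) <= required_freshness_seconds)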
class SingleTimeBucket(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "SingleTimeBucket":
instance = SingleTimeBucket.__new__(SingleTimeBucket)
instance._attrs = model
return instance
def __init__(self, available_count: "int" = None, duration: "float" = None, earliest_time: "float" = None, earliest_time_strf_time: "str" = None, is_finalized: "bool" = None, total_count: "int" = None, **extra):
"""SingleTimeBucket"""
self._attrs = dict()
if available_count is not None:
self._attrs["availableCount"] = available_count
if duration is not None:
self._attrs["duration"] = duration
if earliest_time is not None:
self._attrs["earliestTime"] = earliest_time
if earliest_time_strf_time is not None:
self._attrs["earliestTimeStrfTime"] = earliest_time_strf_time
if is_finalized is not None:
self._attrs["isFinalized"] = is_finalized
if total_count is not None:
self._attrs["totalCount"] = total_count
for k, v in extra.items():
self._attrs[k] = v
@property
def available_count(self) -> "int":
""" Gets the available_count of this SingleTimeBucket.
Count of available events. Not all events in a bucket are retrievable. Typically this count is capped at 10000.
"""
return self._attrs.get("availableCount")
@available_count.setter
def available_count(self, available_count: "int"):
"""Sets the available_count of this SingleTimeBucket.
Count of available events. Not all events in a bucket are retrievable. Typically this count is capped at 10000.
:param available_count: The available_count of this SingleTimeBucket.
:type: int
"""
self._attrs["availableCount"] = available_count
@property
def duration(self) -> "float":
""" Gets the duration of this SingleTimeBucket.
"""
return self._attrs.get("duration")
@duration.setter
def duration(self, duration: "float"):
"""Sets the duration of this SingleTimeBucket.
:param duration: The duration of this SingleTimeBucket.
:type: float
"""
self._attrs["duration"] = duration
@property
def earliest_time(self) -> "float":
""" Gets the earliest_time of this SingleTimeBucket.
The timestamp of the earliest event in the current bucket, in UNIX format. This is the same time as 'earliestTimeStrfTime' in UNIX format.
"""
return self._attrs.get("earliestTime")
@earliest_time.setter
def earliest_time(self, earliest_time: "float"):
"""Sets the earliest_time of this SingleTimeBucket.
The timestamp of the earliest event in the current bucket, in UNIX format. This is the same time as 'earliestTimeStrfTime' in UNIX format.
:param earliest_time: The earliest_time of this SingleTimeBucket.
:type: float
"""
self._attrs["earliestTime"] = earliest_time
@property
def earliest_time_strf_time(self) -> "str":
""" Gets the earliest_time_strf_time of this SingleTimeBucket.
The timestamp of the earliest event in the current bucket, in UTC format with seconds. For example 2021-01-25T13:15:30Z, which follows the ISO-8601 (%FT%T.%Q) format.
"""
return self._attrs.get("earliestTimeStrfTime")
@earliest_time_strf_time.setter
def earliest_time_strf_time(self, earliest_time_strf_time: "str"):
"""Sets the earliest_time_strf_time of this SingleTimeBucket.
The timestamp of the earliest event in the current bucket, in UTC format with seconds. For example 2021-01-25T13:15:30Z, which follows the ISO-8601 (%FT%T.%Q) format.
:param earliest_time_strf_time: The earliest_time_strf_time of this SingleTimeBucket.
:type: str
"""
self._attrs["earliestTimeStrfTime"] = earliest_time_strf_time
@property
def is_finalized(self) -> "bool":
""" Gets the is_finalized of this SingleTimeBucket.
Specifies if all of the events in the current bucket have been finalized.
"""
return self._attrs.get("isFinalized")
@is_finalized.setter
def is_finalized(self, is_finalized: "bool"):
"""Sets the is_finalized of this SingleTimeBucket.
Specifies if all of the events in the current bucket have been finalized.
:param is_finalized: The is_finalized of this SingleTimeBucket.
:type: bool
"""
self._attrs["isFinalized"] = is_finalized
@property
def total_count(self) -> "int":
""" Gets the total_count of this SingleTimeBucket.
The total count of the events in the current bucket.
"""
return self._attrs.get("totalCount")
@total_count.setter
def total_count(self, total_count: "int"):
"""Sets the total_count of this SingleTimeBucket.
The total count of the events in the current bucket.
:param total_count: The total_count of this SingleTimeBucket.
:type: int
"""
self._attrs["totalCount"] = total_count
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
class TimeBucketsSummary(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "TimeBucketsSummary":
instance = TimeBucketsSummary.__new__(TimeBucketsSummary)
instance._attrs = model
return instance
def __init__(self, is_time_cursored: "bool" = None, buckets: "List[SingleTimeBucket]" = None, cursor_time: "float" = None, event_count: "int" = None, **extra):
"""TimeBucketsSummary"""
self._attrs = dict()
if is_time_cursored is not None:
self._attrs["IsTimeCursored"] = is_time_cursored
if buckets is not None:
self._attrs["buckets"] = buckets
if cursor_time is not None:
self._attrs["cursorTime"] = cursor_time
if event_count is not None:
self._attrs["eventCount"] = event_count
for k, v in extra.items():
self._attrs[k] = v
@property
def is_time_cursored(self) -> "bool":
""" Gets the is_time_cursored of this TimeBucketsSummary.
Specifies if the events are returned in time order.
"""
return self._attrs.get("IsTimeCursored")
@is_time_cursored.setter
def is_time_cursored(self, is_time_cursored: "bool"):
"""Sets the is_time_cursored of this TimeBucketsSummary.
Specifies if the events are returned in time order.
:param is_time_cursored: The is_time_cursored of this TimeBucketsSummary.
:type: bool
"""
self._attrs["IsTimeCursored"] = is_time_cursored
@property
def buckets(self) -> "List[SingleTimeBucket]":
""" Gets the buckets of this TimeBucketsSummary.
"""
        # Guard against an unset payload: .get() may return None here, which
        # would otherwise raise a TypeError when iterated.
        return [SingleTimeBucket._from_dict(i) for i in (self._attrs.get("buckets") or [])]
@buckets.setter
def buckets(self, buckets: "List[SingleTimeBucket]"):
"""Sets the buckets of this TimeBucketsSummary.
:param buckets: The buckets of this TimeBucketsSummary.
:type: List[SingleTimeBucket]
"""
self._attrs["buckets"] = buckets
@property
def cursor_time(self) -> "float":
""" Gets the cursor_time of this TimeBucketsSummary.
Identifies where the cursor is in processing the events. The 'cursorTime' is a timestamp specified in UNIX time.
"""
return self._attrs.get("cursorTime")
@cursor_time.setter
def cursor_time(self, cursor_time: "float"):
"""Sets the cursor_time of this TimeBucketsSummary.
Identifies where the cursor is in processing the events. The 'cursorTime' is a timestamp specified in UNIX time.
:param cursor_time: The cursor_time of this TimeBucketsSummary.
:type: float
"""
self._attrs["cursorTime"] = cursor_time
@property
def event_count(self) -> "int":
""" Gets the event_count of this TimeBucketsSummary.
The number of events processed at the 'cursorTime'.
"""
return self._attrs.get("eventCount")
@event_count.setter
def event_count(self, event_count: "int"):
"""Sets the event_count of this TimeBucketsSummary.
The number of events processed at the 'cursorTime'.
:param event_count: The event_count of this TimeBucketsSummary.
:type: int
"""
self._attrs["eventCount"] = event_count
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
class StatusEnum(str, Enum):
CANCELED = "canceled"
FINALIZED = "finalized"
@staticmethod
def from_value(value: str):
if value == "canceled":
return StatusEnum.CANCELED
if value == "finalized":
return StatusEnum.FINALIZED
class UpdateJob(SSCModel):
@staticmethod
def _from_dict(model: dict) -> "UpdateJob":
instance = UpdateJob.__new__(UpdateJob)
instance._attrs = model
return instance
def __init__(self, status: "str", **extra):
"""UpdateJob"""
self._attrs = dict()
if status is not None:
self._attrs["status"] = status
for k, v in extra.items():
self._attrs[k] = v
@property
def status(self) -> "StatusEnum":
""" Gets the status of this UpdateJob.
        Modify the status of an existing search job using PATCH. The only status values you can PATCH are 'canceled' and 'finalized'. You can PATCH the 'canceled' status only to a search job that is running. 'finalized' terminates the search job, and its status is then set to 'failed'.
"""
return StatusEnum.from_value(self._attrs.get("status"))
@status.setter
def status(self, status: "str"):
"""Sets the status of this UpdateJob.
        Modify the status of an existing search job using PATCH. The only status values you can PATCH are 'canceled' and 'finalized'. You can PATCH the 'canceled' status only to a search job that is running. 'finalized' terminates the search job, and its status is then set to 'failed'.
:param status: The status of this UpdateJob.
:type: str
"""
if status is None:
raise ValueError("Invalid value for `status`, must not be `None`")
if isinstance(status, Enum):
self._attrs["status"] = status.value
else:
self._attrs["status"] = status # If you supply a string, we presume you know the service will take it.
def to_dict(self):
return {k: v for (k, v) in self._attrs.items() if v is not None}
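# Hedged usage sketch (illustration only): the models above share one pattern.
# Attributes live in a plain dict keyed by the wire names, `_from_dict` wraps
# a response payload as-is, and `to_dict` drops unset (None) fields before
# serialization. The round-trip below assumes SearchJob exposes the same
# `_from_dict` helper as the other models in this module.
if __name__ == "__main__":
    update = UpdateJob(status="canceled")
    assert update.to_dict() == {"status": "canceled"}
    job = SearchJob._from_dict({"sid": "abc123"})
    assert job.sid == "abc123"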
| 38.355925 | 809 | 0.653023 | 10,935 | 89,983 | 5.240329 | 0.042067 | 0.054814 | 0.018062 | 0.02314 | 0.871228 | 0.817758 | 0.778912 | 0.728269 | 0.693943 | 0.686683 | 0 | 0.00381 | 0.256215 | 89,983 | 2,345 | 810 | 38.372281 | 0.852364 | 0.425091 | 0 | 0.735182 | 0 | 0 | 0.125269 | 0.012559 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235182 | false | 0 | 0.00478 | 0.012428 | 0.39675 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c4dabc8834219be688f4ea3c3ae4db2a0897d837 | 202 | py | Python | l1c1_test.py | jchidley/foobar | 6f3bf700203c29285b4bd3533bc39671604b516d | [
"0BSD"
] | null | null | null | l1c1_test.py | jchidley/foobar | 6f3bf700203c29285b4bd3533bc39671604b516d | [
"0BSD"
] | null | null | null | l1c1_test.py | jchidley/foobar | 6f3bf700203c29285b4bd3533bc39671604b516d | [
"0BSD"
] | null | null | null | # https://foobar.withgoogle.com
# Level 1, Challenge 1
import l1c1_solution
def test_0():
assert l1c1_solution.solution(0) == "23571"
def test_3():
assert l1c1_solution.solution(3) == "71113"
| 20.2 | 47 | 0.707921 | 29 | 202 | 4.758621 | 0.586207 | 0.26087 | 0.26087 | 0.376812 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128655 | 0.153465 | 202 | 9 | 48 | 22.444444 | 0.678363 | 0.247525 | 0 | 0 | 0 | 0 | 0.067114 | 0 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.4 | true | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
6f9d265306c48a5675d63fe4bf0b29ab4c1e9e5a | 13,786 | py | Python | src/oci/bastion/bastion_client_composite_operations.py | LaudateCorpus1/oci-python-sdk | b0d3ce629d5113df4d8b83b7a6502b2c5bfa3015 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | src/oci/bastion/bastion_client_composite_operations.py | LaudateCorpus1/oci-python-sdk | b0d3ce629d5113df4d8b83b7a6502b2c5bfa3015 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | src/oci/bastion/bastion_client_composite_operations.py | LaudateCorpus1/oci-python-sdk | b0d3ce629d5113df4d8b83b7a6502b2c5bfa3015 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | # coding: utf-8
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
import oci # noqa: F401
from oci.util import WAIT_RESOURCE_NOT_FOUND # noqa: F401
class BastionClientCompositeOperations(object):
"""
This class provides a wrapper around :py:class:`~oci.bastion.BastionClient` and offers convenience methods
for operations that would otherwise need to be chained together. For example, instead of performing an action
on a resource (e.g. launching an instance, creating a load balancer) and then using a waiter to wait for the resource
to enter a given state, you can call a single method in this class to accomplish the same functionality
"""
def __init__(self, client, **kwargs):
"""
Creates a new BastionClientCompositeOperations object
:param BastionClient client:
The service client which will be wrapped by this object
"""
self.client = client
def create_bastion_and_wait_for_state(self, create_bastion_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.create_bastion` and waits for the :py:class:`~oci.bastion.models.WorkRequest`
to enter the given state(s).
:param oci.bastion.models.CreateBastionDetails create_bastion_details: (required)
Details for the new bastion.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.WorkRequest.status`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.create_bastion`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.create_bastion(create_bastion_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.headers['opc-work-request-id']
try:
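            # Poll the work request until its (lowercased) status matches one
            # of the requested states; waiter_kwargs tunes the retry interval
            # and the overall timeout.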
waiter_result = oci.wait_until(
self.client,
self.client.get_work_request(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'status') and getattr(r.data, 'status').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def create_session_and_wait_for_state(self, create_session_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.create_session` and waits for the :py:class:`~oci.bastion.models.WorkRequest`
to enter the given state(s).
:param oci.bastion.models.CreateSessionDetails create_session_details: (required)
Details for the new session.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.WorkRequest.status`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.create_session`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.create_session(create_session_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.headers['opc-work-request-id']
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_work_request(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'status') and getattr(r.data, 'status').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def delete_bastion_and_wait_for_state(self, bastion_id, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.delete_bastion` and waits for the :py:class:`~oci.bastion.models.WorkRequest`
to enter the given state(s).
:param str bastion_id: (required)
The unique identifier (OCID) of the bastion.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.WorkRequest.status`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.delete_bastion`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = None
try:
operation_result = self.client.delete_bastion(bastion_id, **operation_kwargs)
except oci.exceptions.ServiceError as e:
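            # A 404 means the bastion is already gone, which is the state a
            # delete waiter would otherwise wait for, so the sentinel is
            # returned instead of raising.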
if e.status == 404:
return WAIT_RESOURCE_NOT_FOUND
else:
raise e
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.headers['opc-work-request-id']
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_work_request(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'status') and getattr(r.data, 'status').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def delete_session_and_wait_for_state(self, session_id, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.delete_session` and waits for the :py:class:`~oci.bastion.models.WorkRequest`
to enter the given state(s).
:param str session_id: (required)
The unique identifier (OCID) of the session.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.WorkRequest.status`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.delete_session`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = None
try:
operation_result = self.client.delete_session(session_id, **operation_kwargs)
except oci.exceptions.ServiceError as e:
if e.status == 404:
return WAIT_RESOURCE_NOT_FOUND
else:
raise e
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.headers['opc-work-request-id']
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_work_request(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'status') and getattr(r.data, 'status').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def update_bastion_and_wait_for_state(self, bastion_id, update_bastion_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.update_bastion` and waits for the :py:class:`~oci.bastion.models.WorkRequest`
to enter the given state(s).
:param str bastion_id: (required)
The unique identifier (OCID) of the bastion.
:param oci.bastion.models.UpdateBastionDetails update_bastion_details: (required)
The bastion information to be updated.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.WorkRequest.status`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.update_bastion`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.update_bastion(bastion_id, update_bastion_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.headers['opc-work-request-id']
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_work_request(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'status') and getattr(r.data, 'status').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def update_session_and_wait_for_state(self, session_id, update_session_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.bastion.BastionClient.update_session` and waits for the :py:class:`~oci.bastion.models.Session` acted upon
to enter the given state(s).
:param str session_id: (required)
The unique identifier (OCID) of the session.
:param oci.bastion.models.UpdateSessionDetails update_session_details: (required)
The session information to be updated.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.bastion.models.Session.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.bastion.BastionClient.update_session`
:param dict waiter_kwargs:
            A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_wait_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.update_session(session_id, update_session_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_session(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
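# Hedged usage sketch (illustration only; the OCIDs below are placeholders):
# a composite operation collapses the usual create-then-poll sequence into a
# single call that returns once the work request reaches a requested state.
if __name__ == "__main__":
    config = oci.config.from_file()  # assumes a standard ~/.oci/config
    client = oci.bastion.BastionClient(config)
    composite = BastionClientCompositeOperations(client)
    details = oci.bastion.models.CreateBastionDetails(
        bastion_type="STANDARD",
        compartment_id="ocid1.compartment.oc1..placeholder",
        target_subnet_id="ocid1.subnet.oc1..placeholder",
    )
    result = composite.create_bastion_and_wait_for_state(
        details, wait_for_states=["SUCCEEDED", "FAILED"])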
| 50.130909 | 245 | 0.678297 | 1,795 | 13,786 | 4.999443 | 0.114206 | 0.042902 | 0.052151 | 0.021395 | 0.863383 | 0.854246 | 0.840205 | 0.840205 | 0.82505 | 0.82505 | 0 | 0.002587 | 0.243073 | 13,786 | 274 | 246 | 50.313869 | 0.857403 | 0.46199 | 0 | 0.785124 | 0 | 0 | 0.027628 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.057851 | false | 0 | 0.016529 | 0 | 0.198347 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b510511c3ecb4fb50acf2c6d53564adc8140c697 | 120 | py | Python | jina/peapods/__init__.py | Rohitpandit021/jina | f3db4d5e480375d8dc3bceda814ac1963dee76d7 | [
"Apache-2.0"
] | 15,179 | 2020-04-28T10:23:56.000Z | 2022-03-31T14:35:25.000Z | jina/peapods/__init__.py | Rohitpandit021/jina | f3db4d5e480375d8dc3bceda814ac1963dee76d7 | [
"Apache-2.0"
] | 3,912 | 2020-04-28T13:01:29.000Z | 2022-03-31T14:36:46.000Z | jina/peapods/__init__.py | Rohitpandit021/jina | f3db4d5e480375d8dc3bceda814ac1963dee76d7 | [
"Apache-2.0"
] | 1,955 | 2020-04-28T10:50:49.000Z | 2022-03-31T12:28:34.000Z | from .peas import BasePea as Pea
from .pods import BasePod
from .pods import Pod
from .pods.compound import CompoundPod
| 24 | 38 | 0.808333 | 19 | 120 | 5.105263 | 0.578947 | 0.247423 | 0.28866 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 120 | 4 | 39 | 30 | 0.95098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d209d13b6f01ba9f578cad01317a4fc4a3e0ae43 | 72 | py | Python | Practice/Python/TextWrap.py | avantikasharma/HackerRank-Solutions | a980859ac352688853fcbcf3c7ec6d95685f99ea | [
"MIT"
] | 1 | 2018-07-08T15:44:15.000Z | 2018-07-08T15:44:15.000Z | Practice/Python/TextWrap.py | avantikasharma/HackerRank-Solutions | a980859ac352688853fcbcf3c7ec6d95685f99ea | [
"MIT"
] | null | null | null | Practice/Python/TextWrap.py | avantikasharma/HackerRank-Solutions | a980859ac352688853fcbcf3c7ec6d95685f99ea | [
"MIT"
] | 2 | 2018-08-10T06:49:34.000Z | 2020-10-01T04:50:59.000Z | def wrap(string, max_width):
return textwrap.fill(string,max_width)
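# Hedged driver sketch (assumption: the usual HackerRank stdin harness, which
# is not part of this stored solution file): read the string and the width,
# then print the wrapped result.
if __name__ == '__main__':
    string, max_width = input(), int(input())
    print(wrap(string, max_width))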
| 24 | 42 | 0.763889 | 11 | 72 | 4.818182 | 0.727273 | 0.339623 | 0.528302 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 72 | 2 | 43 | 36 | 0.84127 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
d21798645c04075e46d8e69369ba8498b7bdabb3 | 47,028 | py | Python | tests/apps_test.py | uk-gov-mirror/UKHomeOffice.dq-tf-apps | 21696ec3ea336397d354cb63d2a4c561f066231f | [
"MIT"
] | null | null | null | tests/apps_test.py | uk-gov-mirror/UKHomeOffice.dq-tf-apps | 21696ec3ea336397d354cb63d2a4c561f066231f | [
"MIT"
] | null | null | null | tests/apps_test.py | uk-gov-mirror/UKHomeOffice.dq-tf-apps | 21696ec3ea336397d354cb63d2a4c561f066231f | [
"MIT"
] | null | null | null | # pylint: disable=missing-docstring, line-too-long, protected-access, E1101, C0202, E0602, W0109
import unittest
from runner import Runner
class TestE2E(unittest.TestCase):
@classmethod
def setUpClass(self):
self.snippet = """
provider "aws" {
region = "eu-west-2"
skip_credentials_validation = true
skip_get_ec2_platforms = true
}
module "apps" {
source = "./mymodule"
providers = {
aws = aws
}
cidr_block = "10.1.0.0/16"
public_subnet_cidr_block = "10.1.0.0/24"
ad_subnet_cidr_block = "10.1.0.0/24"
az = "eu-west-2a"
az2 = "eu-west-2b"
adminpassword = "1234"
ad_aws_ssm_document_name = "1234"
ad_writer_instance_profile_name = "1234"
naming_suffix = "preprod-dq"
namespace = "preprod"
haproxy_private_ip = "1.2.3.3"
haproxy_private_ip2 = "1.2.3.4"
s3_httpd_config_bucket = "s3-bucket-name"
s3_httpd_config_bucket_key = "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
haproxy_config_bucket = "s3-bucket-name"
haproxy_config_bucket_key = "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
s3_bucket_name = {
archive_log = "abcd"
archive_data = "abcd"
working_data = "abcd"
landing_data = "abcd"
airports_archive = "abcd"
airports_working = "abcd"
airports_internal = "abcd"
oag_archive = "abcd"
oag_internal = "abcd"
oag_transform = "abcd"
acl_archive = "abcd"
acl_internal = "abcd"
reference_data_archive = "abcd"
reference_data_internal = "abcd"
consolidated_schedule = "abcd"
api_archive = "abcd"
api_internal = "abcd"
api_record_level_scoring = "abcd"
gait_internal = "abcd"
cross_record_scored = "abcd"
reporting_internal_working = "abcd"
carrier_portal_working = "abcd"
mds_extract = "abcd"
raw_file_index_internal = "abcd"
fms_working = "abcd"
drt_working = "abcd"
athena_log = "abcd"
ops_pipeline = "abcd"
nats_archive = "abcd"
nats_internal = "abcd"
cdlz_bitd_input = "abcd"
api_arrivals = "abcd"
accuracy_score = "abcd"
api_cdlz_msk = "abcd"
drt_export = "abcd"
api_rls_xrs_reconciliation = "abcd"
dq_fs_archive = "abcd"
dq_fs_internal = "abcd"
dq_aws_config = "abcd"
dq_asn_archive = "abcd"
dq_asn_internal = "abcd"
dq_snsgb_archive = "abcd"
dq_snsgb_internal = "abcd"
aftc_sc_msk = "abcd"
dq_asn_marine_archive = "abcd"
dq_asn_marine_internal = "abcd"
dq_rm_archive = "abcd"
dq_rm_internal = "abcd"
dq_data_generator = "abcd"
}
s3_bucket_acl = {
archive_log = "private"
archive_data = "private"
working_data = "private"
landing_data = "private"
airports_archive = "private"
airports_working = "private"
airports_internal = "private"
oag_archive = "private"
oag_internal = "private"
oag_transform = "private"
acl_archive = "private"
acl_internal = "private"
reference_data_archive = "private"
reference_data_internal = "private"
consolidated_schedule = "private"
api_archive = "private"
api_internal = "private"
api_record_level_scoring = "private"
gait_internal = "private"
cross_record_scored = "private"
reporting_internal_working = "private"
carrier_portal_working = "private"
mds_extract = "private"
raw_file_index_internal = "private"
fms_working = "private"
drt_working = "private"
athena_log = "private"
ops_pipeline = "private"
nats_archive = "private"
nats_internal = "private"
cdlz_bitd_input = "private"
api_arrivals = "private"
accuracy_score = "private"
api_cdlz_msk = "private"
drt_export = "private"
api_rls_xrs_reconciliation = "private"
dq_fs_archive = "private"
dq_fs_internal = "private"
dq_aws_config = "private"
dq_asn_archive = "private"
dq_asn_internal = "private"
dq_snsgb_archive = "private"
dq_snsgb_internal = "private"
aftc_sc_msk = "private"
dq_asn_marine_archive = "private"
dq_asn_marine_internal = "private"
dq_rm_archive = "private"
dq_rm_internal = "private"
dq_data_generator = "private"
}
route_table_cidr_blocks = {
peering_cidr = "10.3.0.0/16"
ops_cidr = "10.2.0.0/24"
}
vpc_peering_connection_ids = {
peering_to_peering = "1234"
peering_to_ops = "1234"
}
ad_sg_cidr_ingress = [
"1.2.0.0/16",
"1.2.0.0/16",
"1.2.0.0/16"
]
}
"""
self.runner = Runner(self.snippet)
self.result = self.runner.result
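    # Hedged note (assumption about the local `runner` helper, not verified
    # here): Runner presumably writes the snippet to a working directory,
    # runs `terraform init` and `terraform plan`, and parses the plan so that
    # get_value(resource_address, attribute) returns the planned value
    # asserted by the tests below.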
def test_apps_vpc_cidr_block(self):
self.assertEqual(self.runner.get_value("module.apps.aws_vpc.appsvpc", "cidr_block"), "10.1.0.0/16")
def test_apps_public_cidr(self):
self.assertEqual(self.runner.get_value("module.apps.aws_subnet.public_subnet", "cidr_block"), "10.1.0.0/24")
def test_az_public_subnet(self):
self.assertEqual(self.runner.get_value("module.apps.aws_subnet.public_subnet", "availability_zone"), "eu-west-2a")
def test_name_suffix_ari(self):
self.assertEqual(self.runner.get_value("module.apps.aws_internet_gateway.AppsRouteToInternet", "tags"), {"Name": "igw-apps-preprod-dq"})
def test_name_suffix_appsvpc(self):
self.assertEqual(self.runner.get_value("module.apps.aws_vpc.appsvpc", "tags"), {"Name": "vpc-apps-preprod-dq"})
def test_name_suffix_public_subnet(self):
self.assertEqual(self.runner.get_value("module.apps.aws_subnet.public_subnet", "tags"), {"Name": "public-subnet-apps-preprod-dq"})
def test_name_suffix_ad_subnet(self):
self.assertEqual(self.runner.get_value("module.apps.aws_subnet.ad_subnet", "tags"), {"Name": "ad-subnet-apps-preprod-dq"})
def test_name_suffix_route_table(self):
self.assertEqual(self.runner.get_value("module.apps.aws_route_table.apps_route_table", "tags"), {"Name": "route-table-apps-preprod-dq"})
def test_name_suffix_public_route(self):
self.assertEqual(self.runner.get_value("module.apps.aws_route_table.apps_public_route_table", "tags"), {"Name": "public-route-table-apps-preprod-dq"})
def test_name_suffix_appsnatgw(self):
self.assertEqual(self.runner.get_value("module.apps.aws_nat_gateway.appsnatgw", "tags"), {"Name": "natgw-apps-preprod-dq"})
def test_name_suffix_archive_log(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.log_archive_bucket", "tags"), {"Name": "s3-log-archive-bucket-apps-preprod-dq"})
def test_name_suffix_data_archive_log(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.data_archive_bucket", "tags"), {"Name": "s3-data-archive-bucket-apps-preprod-dq"})
def test_name_suffix_data_working(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.data_working_bucket", "tags"), {"Name": "s3-data-working-bucket-apps-preprod-dq"})
def test_name_suffix_airports_archive(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.airports_archive_bucket", "tags"), {"Name": "s3-dq-airports-archive-apps-preprod-dq"})
def test_name_suffix_airports_internal(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.airports_internal_bucket", "tags"), {"Name": "s3-dq-airports-internal-apps-preprod-dq"})
def test_name_suffix_airports_working(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.airports_working_bucket", "tags"), {"Name": "s3-dq-airports-working-apps-preprod-dq"})
def test_name_suffix_carrier_portal_working(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.carrier_portal_working_bucket", "tags"), {"Name": "s3-dq-carrier-portal-working-apps-preprod-dq"})
def test_name_suffix_nats_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.nats", "name"), "iam-group-nats-apps-preprod-dq")
def test_name_suffix_nats_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.nats", "name"), "iam-group-membership-nats-apps-preprod-dq")
def test_name_suffix_nats_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.nats", "name"), "group-policy-nats-apps-preprod-dq")
def test_name_suffix_nats_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.nats", "name"), "iam-user-nats-apps-preprod-dq")
def test_name_suffix_rds_deploy_iam_lambda_rds(self):
self.assertEqual(self.runner.get_value("module.apps.module.rds_deploy.aws_iam_role.lambda_rds[0]", "tags"), {"Name": "iam-lambda-rds-deploy-apps-preprod-dq"})
def test_name_suffix_rds_deploy_lambda_function(self):
self.assertEqual(self.runner.get_value("module.apps.module.rds_deploy.aws_lambda_function.lambda_rds[0]", "tags"), {"Name": "lambda-rds-deploy-apps-preprod-dq"})
def test_name_suffix_rds_deploy_cloudwatch_log_group(self):
self.assertEqual(self.runner.get_value("module.apps.module.rds_deploy.aws_cloudwatch_log_group.lambda_rds[0]", "tags"), {"Name": "log-lambda-rds-deploy-apps-preprod-dq"})
def test_name_suffix_oag_archive(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.oag_archive_bucket", "tags"), {"Name": "s3-dq-oag-archive-apps-preprod-dq"})
def test_name_suffix_oag_internal(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.oag_internal_bucket", "tags"), {"Name": "s3-dq-oag-internal-apps-preprod-dq"})
def test_name_suffix_oag_transform(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.oag_transform_bucket", "tags"), {"Name": "s3-dq-oag-transform-apps-preprod-dq"})
def test_name_oag_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.oag", "name"), "iam-group-oag-apps-preprod-dq")
def test_name_oag_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.oag", "name"), "iam-group-membership-oag-apps-preprod-dq")
def test_name_oag_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.oag", "name"), "group-policy-oag-apps-preprod-dq")
def test_name_oag_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.oag", "name"), "iam-user-oag-apps-preprod-dq")
def test_name_acl_archive_bucket(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.acl_archive_bucket", "tags"), {"Name": "s3-dq-acl-archive-apps-preprod-dq"})
def test_name_acl_internal_bucket(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.acl_internal_bucket", "tags"), {"Name": "s3-dq-acl-internal-apps-preprod-dq"})
def test_name_acl_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.acl", "name"), "iam-group-acl-apps-preprod-dq")
def test_name_acl_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.acl", "name"), "iam-group-membership-acl-apps-preprod-dq")
def test_name_acl_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.acl", "name"), "group-policy-acl-apps-preprod-dq")
def test_name_acl_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.acl", "name"), "iam-user-acl-apps-preprod-dq")
def test_name_suffix_oag_input_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_iam_role.lambda_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_cloudwatch_log_group.lambda_trigger[0]", "tags"), {"Name": "log-lambda-trigger-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_lambda_oag_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_lambda_function.lambda_oag[0]", "tags"), {"Name": "lambda-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_input_pipeline_log_lambda_oag(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_input_pipeline.aws_cloudwatch_log_group.lambda_oag[0]", "tags"), {"Name": "log-lambda-oag-input-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_lambda_function.lambda_athena[0]", "tags"), {"Name": "lambda-athena-oag-transform-apps-preprod-dq"})
def test_name_suffix_oag_transform_pipeline_log_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.oag_transform_pipeline.aws_cloudwatch_log_group.lambda_log_group_athena[0]", "tags"), {"Name": "lambda-log-group-athena-oag-transform-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_lambda_acl_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-acl-input-apps-preprod-dq"})
def test_name_suffix_acl_input_pipeline_log_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.acl_input_pipeline.aws_cloudwatch_log_group.lambda_log_group_athena[0]", "tags"), {"Name": "lambda-log-group-athena-acl-input-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_lambda_function.lambda_athena[0]", "tags"), {"Name": "lambda-athena-reference-data-apps-preprod-dq"})
def test_name_suffix_reference_data_pipeline_log_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.reference_data_pipeline.aws_cloudwatch_log_group.lambda_log_group_athena[0]", "tags"), {"Name": "lambda-log-group-athena-reference-data-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_lambda_acl_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_consolidated_schedule_pipeline_log_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.consolidated_schedule_pipeline.aws_cloudwatch_log_group.lambda_log_group_athena[0]", "tags"), {"Name": "lambda-log-group-athena-consolidated-schedule-apps-preprod-dq"})
def test_name_suffix_cdlz_iam_lambda(self):
self.assertEqual(self.runner.get_value("module.apps.module.cdlz.aws_iam_role.lambda_acquire", "tags"), {"Name": "iam-lambda-cdlz-apps-preprod-dq"})
def test_name_suffix_cdlz_ssm_lambda(self):
self.assertEqual(
self.runner.get_value("module.apps.module.cdlz.aws_ssm_parameter.lambda_enabled[0]", "tags"), {"Name": "ssm-lambda-enabled-cdlz-apps-preprod-dq"})
def test_name_suffix_cdlz_lambda(self):
self.assertEqual(self.runner.get_value("module.apps.module.cdlz.aws_lambda_function.lambda_acquire[0]", "tags"), {"Name": "lambda-cdlz-apps-preprod-dq"})
def test_name_suffix_cdlz_log_lambda(self):
self.assertEqual(
self.runner.get_value("module.apps.module.cdlz.aws_cloudwatch_log_group.lambda_acquire[0]", "tags"), {"Name": "log-lambda-cdlz-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-api-input-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-api-input-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-api-input-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-input-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-api-input-apps-preprod-dq"})
def test_name_suffix_api_input_pipeline_lambda_acl_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_input_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-input-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_api_record_level_score_pipeline_lambda_acl_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_record_level_score_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-record-level-score-apps-preprod-dq"})
def test_name_suffix_gait_pipeline_step_function_exec(self):
self.assertEqual(self.runner.get_value("module.apps.module.gait_pipeline.aws_iam_role.step_function_exec[0]", "tags"), {"Name": "step-function-exec-gait-apps-preprod-dq"})
def test_name_suffix_gait_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.gait_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-gait-apps-preprod-dq"})
def test_name_suffix_gait_pipeline_lambda_gait(self):
self.assertEqual(self.runner.get_value("module.apps.module.gait_pipeline.aws_lambda_function.lambda_gait[0]", "tags"), {"Name": "lambda-gait-apps-preprod-dq"})
def test_name_suffix_gait_pipeline_log_lambda_gait(self):
self.assertEqual(self.runner.get_value("module.apps.module.gait_pipeline.aws_cloudwatch_log_group.lambda_log_group_gait[0]", "tags"), {"Name": "lambda-log-group-gait-apps-preprod-dq"})
def test_name_suffix_fms_postgres(self):
self.assertEqual(self.runner.get_value("module.apps.module.fms.aws_db_instance.postgres", "tags"), {"Name": "postgres-fms-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_cross_record_scored_pipeline_lambda_acl_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.api_cross_record_scored_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-api-cross-record-scored-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_iam_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_iam_role.lambda_role_trigger[0]", "tags"), {"Name": "iam-lambda-trigger-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_ssm_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_ssm_parameter.lambda_trigger_enabled[0]", "tags"), {"Name": "ssm-lambda-trigger-enabled-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_sfn_state_machine(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_sfn_state_machine.sfn_state_machine[0]", "tags"), {"Name": "sfn-state-machine-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_lambda_function.lambda_trigger[0]", "tags"), {"Name": "lambda-trigger-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_log_lambda_trigger(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_cloudwatch_log_group.lambda_log_group_trigger[0]", "tags"), {"Name": "lambda-log-group-trigger-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_lambda_function.lambda_athena[0]", "tags"), {"Name": "lambda-athena-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_log_lambda_athena(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_cloudwatch_log_group.lambda_log_group_athena[0]", "tags"), {"Name": "lambda-log-group-athena-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_iam_lambda_rds(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_iam_role.lambda_rds[0]", "tags"), {"Name": "iam-lambda-rds-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_lambda_rds(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_lambda_function.lambda_rds[0]", "tags"), {"Name": "lambda-rds-internal-reporting-apps-preprod-dq"})
def test_name_suffix_internal_reporting_pipeline_log_lambda_rds(self):
self.assertEqual(self.runner.get_value("module.apps.module.internal_reporting_pipeline.aws_cloudwatch_log_group.lambda_rds[0]", "tags"), {"Name": "log-lambda-rds-internal-reporting-apps-preprod-dq"})
def test_name_suffix_dq_pipeline_ops_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.dq_pipeline_ops_group", "name"), "dq-pipeline-ops-preprod")
def test_name_suffix_dq_pipeline_ops_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_policy.dq_pipeline_ops_policy", "name"), "dq-pipeline-ops-policy-preprod")
def test_name_suffix_mds_extractor_lambda_mds_extractor(self):
self.assertEqual(self.runner.get_value("module.apps.module.mds_extractor.aws_lambda_function.lambda_mds_extractor[0]", "tags"), {"Name": "lambda-mds-extractor-apps-preprod-dq"})
def test_name_suffix_mds_extractor_lambda_role_mds_extractor(self):
self.assertEqual(self.runner.get_value("module.apps.module.mds_extractor.aws_iam_role.lambda_role_mds_extractor[0]", "tags"), {"Name": "lambda-role-mds-extractor-apps-preprod-dq"})
def test_name_suffix_mds_extractor_lambda_log_group_mds_extractor(self):
self.assertEqual(self.runner.get_value("module.apps.module.mds_extractor.aws_cloudwatch_log_group.lambda_log_group_mds_extractor[0]", "tags"), {"Name": "lambda-log-group-mds-extractor-apps-preprod-dq"})
def test_name_suffix_athena_log(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.athena_log_bucket", "tags"), {"Name": "s3-dq-athena-log-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_iam_lambda_reconcile(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_lambda_function.lambda_reconcile[0]", "tags"), {"Name": "lambda-reconcile-ops-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_iam_lambda_role_reconcile(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_iam_role.lambda_role_reconcile[0]", "tags"), {"Name": "lambda-role-reconcile-ops-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_lambda_log_group_reconcile(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_cloudwatch_log_group.lambda_log_group_reconcile[0]", "tags"), {"Name": "lambda-log-group-reconcile-ops-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_iam_lambda_cleaner(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_lambda_function.lambda_cleaner[0]", "tags"), {"Name": "lambda-cleaner-ops-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_iam_lambda_role_cleaner(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_iam_role.lambda_role_cleaner[0]", "tags"), {"Name": "lambda-role-cleaner-ops-apps-preprod-dq"})
def test_name_suffix_ops_pipeline_lambda_log_group_cleaner(self):
self.assertEqual(self.runner.get_value("module.apps.module.ops_pipeline.aws_cloudwatch_log_group.lambda_log_group_cleaner[0]", "tags"), {"Name": "lambda-log-group-cleaner-ops-apps-preprod-dq"})
def test_name_crt_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.crt", "name"), "iam-group-crt-apps-preprod-dq")
def test_name_crt_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.crt", "name"), "iam-group-membership-crt-apps-preprod-dq")
def test_name_crt_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.crt", "name"), "iam-group-policy-crt-apps-preprod-dq")
def test_name_crt_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.crt", "name"), "iam-user-crt-apps-preprod-dq")
def test_name_crt_ssm_iam_user_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.crt_id", "name"), "kubernetes-crt-user-id-apps-preprod-dq")
def test_name_crt_ssm_iam_user_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.crt_key", "name"), "kubernetes-crt-user-key-apps-preprod-dq")
def test_name_athena_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.athena", "name"), "iam-group-athena-apps-preprod-dq")
def test_name_athena_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.athena", "name"), "iam-group-membership-athena-apps-preprod-dq")
def test_name_athena_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.athena", "name"), "iam-group-policy-athena-apps-preprod-dq")
def test_name_athena_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.athena", "name"), "iam-user-athena-apps-preprod-dq")
def test_name_ssm_athena_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_id", "name"), "kubernetes-athena-user-id-app-apps-preprod-dq")
def test_name_ssm_athena_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_key", "name"), "kubernetes-athena-user-key-app-apps-preprod-dq")
def test_name_nats_history_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.nats_history", "name"), "iam-group-nats-history-apps-preprod-dq")
def test_name_nats_history_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.nats_history", "name"), "iam-group-membership-nats-history-apps-preprod-dq")
def test_name_nats_history_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.nats_history", "name"), "iam-group-policy-nats-history-apps-preprod-dq")
def test_name_nats_history_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.nats_history", "name"), "iam-user-nats-history-apps-preprod-dq")
def test_name_ssm_nats_history_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.nats_history_id", "name"), "nats-history-user-id-apps-preprod-dq")
def test_name_ssm_nats_history_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.nats_history_key", "name"), "nats-history-user-key-apps-preprod-dq")
def test_name_rds_maintenance_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.rds_maintenance", "name"), "iam-group-rds-maintenance-apps-preprod-dq")
def test_name_rds_maintenance_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.rds_maintenance", "name"), "iam-group-membership-rds-maintenance-apps-preprod-dq")
def test_name_rds_maintenance_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.lambda_policy_rds_maintenance", "name"), "iam-group-policy-rds-maintenance-apps-preprod-dq")
def test_name_rds_maintenance_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.rds_maintenance", "name"), "iam-user-rds-maintenance-apps-preprod-dq")
def test_name_ssm_rds_maintenance_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.rds_maintenance_id", "name"), "kubernetes-rds-maintenance-user-id-apps-preprod-dq")
def test_name_ssm_rds_maintenance_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.rds_maintenance_key", "name"), "kubernetes-rds-maintenance-user-key-apps-preprod-dq")
def test_name_athena_maintenance_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.athena_maintenance", "name"), "iam-group-athena-maintenance-apps-preprod-dq")
def test_name_athena_maintenance_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.athena_maintenance", "name"), "iam-group-membership-athena-maintenance-apps-preprod-dq")
def test_name_athena_maintenance_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.athena_maintenance", "name"), "iam-group-policy-athena-maintenance-apps-preprod-dq")
def test_name_athena_maintenance_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.athena_maintenance", "name"), "iam-user-athena-maintenance-apps-preprod-dq")
def test_name_ssm_athena_maintenance_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_maintenance_id", "name"), "kubernetes-athena-maintenance-user-id-apps-preprod-dq")
def test_name_ssm_athena_maintenance_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_maintenance_key", "name"), "kubernetes-athena-maintenance-user-key-apps-preprod-dq")
def test_name_jira_backup_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.data_archive_bucket", "name"), "data_archive_bucket")
def test_name_jira_backup_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.data_archive_bucket", "name"), "data_archive_bucket_user")
def test_name_ssm_jira_backup_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.jira_id", "name"), "kubernetes-jira-backup-user-id-apps-preprod-dq")
def test_name_ssm_jira_backup_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.jira_key", "name"), "kubernetes-jira-backup-user-key-apps-preprod-dq")
def test_name_suffix_cdlz_bitd_input(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.cdlz_bitd_input", "tags"), {"Name": "s3-dq-cdlz-bitd-input-apps-preprod-dq"})
def test_name_suffix_api_arrivals(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.api_arrivals_bucket", "tags"), {"Name": "s3-dq-api-arrivals-apps-preprod-dq"})
def test_name_suffix_accuracy_score(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.accuracy_score_bucket", "tags"), {"Name": "s3-dq-accuracy-score-apps-preprod-dq"})
def test_name_suffix_api_cdlz_msk(self):
self.assertEqual(self.runner.get_value("module.apps.aws_s3_bucket.api_cdlz_msk_bucket", "tags"), {'Name': "s3-dq-api-cdlz-msk-apps-preprod-dq"})
def test_api_cdlz_msk_bucket_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.api_cdlz_msk_bucket", "name"), "api_cdlz_msk_bucket")
def test_api_cdlz_msk_bucket_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.api_cdlz_msk_bucket", "name"), "api_cdlz_msk_bucket_user")
def test_name_athena_tableau_iam_group(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group.athena_tableau", "name"), "iam-group-athena-tableau-apps-preprod-dq")
def test_name_athena_tableau_iam_group_membership(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_membership.athena_tableau", "name"), "iam-group-membership-athena-tableau-apps-preprod-dq")
def test_name_athena_tableau_iam_group_policy(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_group_policy.athena_tableau", "name"), "iam-group-policy-athena-tableau-apps-preprod-dq")
def test_name_athena_tableau_iam_user(self):
self.assertEqual(self.runner.get_value("module.apps.aws_iam_user.athena_tableau", "name"), "iam-user-athena-tableau-apps-preprod-dq")
def test_name_ssm_athena_tableau_id(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_tableau_id", "name"), "tableau-athena-user-id-apps-preprod-dq")
def test_name_ssm_athena_tableau_key(self):
self.assertEqual(self.runner.get_value("module.apps.aws_ssm_parameter.athena_tableau_key", "name"), "tableau-athena-user-key-apps-preprod-dq")
if __name__ == '__main__':
unittest.main()
| 70.295964 | 241 | 0.716105 | 6,336 | 47,028 | 4.961963 | 0.032828 | 0.053755 | 0.100926 | 0.122173 | 0.871624 | 0.838704 | 0.821973 | 0.807182 | 0.785203 | 0.750724 | 0 | 0.007751 | 0.157778 | 47,028 | 668 | 242 | 70.401198 | 0.786003 | 0.001999 | 0 | 0.02863 | 0 | 0.01227 | 0.550946 | 0.38347 | 0 | 0 | 0 | 0 | 0.341513 | 1 | 0.343558 | false | 0.002045 | 0.00409 | 0 | 0.349693 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
966797d57389a9d808b7c3fe8c88d929f607676f | 5,069 | py | Python | Plugins/Aspose-Slides-Java-for-Jython/asposeslides/WorkingWithCharts/ChartProperties.py | tienph91/Aspose.Slides-for-Java | 874a8245c6e1ae227393644d9bd35d5e65e07d52 | [
"MIT"
] | 27 | 2016-10-25T13:19:25.000Z | 2022-03-03T04:13:53.000Z | Plugins/Aspose-Slides-Java-for-Jython/asposeslides/WorkingWithCharts/ChartProperties.py | tienph91/Aspose.Slides-for-Java | 874a8245c6e1ae227393644d9bd35d5e65e07d52 | [
"MIT"
] | 9 | 2017-03-14T13:02:17.000Z | 2021-11-25T13:22:20.000Z | Plugins/Aspose-Slides-Java-for-Jython/asposeslides/WorkingWithCharts/ChartProperties.py | tienph91/Aspose.Slides-for-Java | 874a8245c6e1ae227393644d9bd35d5e65e07d52 | [
"MIT"
] | 38 | 2016-04-07T16:37:29.000Z | 2022-01-17T06:35:14.000Z | from asposeslides import Settings
from com.aspose.slides import Presentation
from com.aspose.slides import ChartType
from com.aspose.slides import SaveFormat
class ChartProperties:
def __init__(self):
# Setting the RotationX, RotationY and DepthPercents properties of 3D Chart.
self.set_rotation_and_depth()
# Setting the GapWidth property of Chart Series
self.set_gapwidth()
def set_rotation_and_depth(self):
dataDir = Settings.dataDir + 'WorkingWithCharts/ChartProperties'
pres = Presentation()
# Access first slide
sld = pres.getSlides().get_Item(0)
# Add chart with default data
chart = sld.getShapes().addChart(ChartType.StackedColumn3D, 0, 0, 500, 500)
# Getting the chart data worksheet
fact = chart.getChartData().getChartDataWorkbook()
# Delete default generated series and categories
chart.getChartData().getSeries().clear()
chart.getChartData().getCategories().clear()
# Adding series
chart.getChartData().getSeries().add(fact.getCell(0, 0, 1, "Series 1"), chart.getType())
chart.getChartData().getSeries().add(fact.getCell(0, 0, 2, "Series 2"), chart.getType())
# Adding categories
chart.getChartData().getCategories().add(fact.getCell(0, 1, 0, "Category 1"))
chart.getChartData().getCategories().add(fact.getCell(0, 2, 0, "Category 2"))
chart.getChartData().getCategories().add(fact.getCell(0, 3, 0, "Category 3"))
# Set Rotation3D properties
chart.getRotation3D().setRightAngleAxes(True)
chart.getRotation3D().setRotationX(40)
chart.getRotation3D().setRotationY(270)
chart.getRotation3D().setDepthPercents(150)
# Take first chart series
series = chart.getChartData().getSeries().get_Item(0)
# Populating series data
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 1, 1, 20))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 2, 1, 50))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 3, 1, 30))
# Take second chart series
series = chart.getChartData().getSeries().get_Item(1)
# Populating series data
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 1, 2, 30))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 2, 2, 10))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 3, 2, 60))
# Saving the presentation
pres.save(dataDir + "3Drotation.pptx", SaveFormat.Pptx)
print "Done with rotation, please check the output file."
def set_gapwidth(self):
dataDir = Settings.dataDir + 'WorkingWithCharts/ChartProperties'
pres = Presentation()
# Access first slide
sld = pres.getSlides().get_Item(0)
# Add chart with default data
chart = sld.getShapes().addChart(ChartType.StackedColumn3D, 0, 0, 500, 500)
# Getting the chart data worksheet
fact = chart.getChartData().getChartDataWorkbook()
# Delete default generated series and categories
chart.getChartData().getSeries().clear()
chart.getChartData().getCategories().clear()
# Adding series
chart.getChartData().getSeries().add(fact.getCell(0, 0, 1, "Series 1"), chart.getType())
chart.getChartData().getSeries().add(fact.getCell(0, 0, 2, "Series 2"), chart.getType())
# Adding categories
chart.getChartData().getCategories().add(fact.getCell(0, 1, 0, "Category 1"))
chart.getChartData().getCategories().add(fact.getCell(0, 2, 0, "Category 2"))
chart.getChartData().getCategories().add(fact.getCell(0, 3, 0, "Category 3"))
# Take first chart series
series = chart.getChartData().getSeries().get_Item(0)
# Populating series data
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 1, 1, 20))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 2, 1, 50))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 3, 1, 30))
# Take second chart series
series = chart.getChartData().getSeries().get_Item(1)
# Populating series data
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 1, 2, 30))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 2, 2, 10))
series.getDataPoints().addDataPointForBarSeries(fact.getCell(0, 3, 2, 60))
# Set GapWidth value
series.getParentSeriesGroup().setGapWidth(75)
# Saving the presentation
pres.save(dataDir + "SetGapWidth.pptx", SaveFormat.Pptx)
print "Set Gapwidth property of chart series, please check the output file."
if __name__ == '__main__':
ChartProperties() | 39.601563 | 96 | 0.660091 | 543 | 5,069 | 6.106814 | 0.197053 | 0.072979 | 0.079614 | 0.170084 | 0.84228 | 0.773824 | 0.773824 | 0.773824 | 0.773824 | 0.740048 | 0 | 0.035796 | 0.222924 | 5,069 | 128 | 97 | 39.601563 | 0.806042 | 0.142237 | 0 | 0.676923 | 0 | 0 | 0.07404 | 0.015271 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.061538 | null | null | 0.030769 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
969f6d15ad8dad2f816922b1efb6727efafbf1e2 | 757,420 | py | Python | openapi_client/api/audits_api.py | hi-artem/twistlock-py | 9888e905f5b9d3cc00f9b84244588c0992f8e4f4 | [
"RSA-MD"
] | null | null | null | openapi_client/api/audits_api.py | hi-artem/twistlock-py | 9888e905f5b9d3cc00f9b84244588c0992f8e4f4 | [
"RSA-MD"
] | null | null | null | openapi_client/api/audits_api.py | hi-artem/twistlock-py | 9888e905f5b9d3cc00f9b84244588c0992f8e4f4 | [
"RSA-MD"
] | null | null | null | # coding: utf-8
"""
Prisma Cloud Compute API
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
The version of the OpenAPI document: 21.04.439
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from openapi_client.api_client import ApiClient
from openapi_client.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class AuditsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
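# Construction sketch (assumption): callers usually pass an ApiClient built
# from a Configuration object; Configuration is the standard openapi-generator
# companion class, and the host value below is a placeholder, not taken from
# this file.
#
#   from openapi_client import ApiClient, Configuration
#   config = Configuration()
#   config.host = "https://console.example.com"
#   audits = AuditsApi(ApiClient(config))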
def api_v1_audits_access_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_access_download_get # noqa: E501
DownloadAccessAudits downloads the access audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_access_download_get(async_req=True)
>>> result = thread.get()
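A plain synchronous call omits async_req; the filter values here are
illustrative assumptions rather than values taken from this specification:

>>> api.api_v1_audits_access_download_get(type='docker', limit=100)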
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Type is the audit type.
:type type: str
:param rule_name: RuleNames are the rule names to filter by.
:type rule_name: list[str]
:param api: APIs are the APIs to filter by.
:type api: list[str]
:param hostname: Hosts are hosts to filter by.
:type hostname: list[str]
:param user: Users are users to filter by.
:type user: list[str]
:param allow: Allow indicates whether allowed requests should be shown.
:type allow: str
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_access_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_access_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_access_download_get # noqa: E501
DownloadAccessAudits downloads the access audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_access_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Type is the audit type.
:type type: str
:param rule_name: RuleNames are the rule names to filter by.
:type rule_name: list[str]
:param api: APIs are the APIs to filter by.
:type api: list[str]
:param hostname: Hosts are hosts to filter by.
:type hostname: list[str]
:param user: Users are users to filter by.
:type user: list[str]
:param allow: Allow indicates whether allowed requests should be shown.
:type allow: str
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'type',
'rule_name',
'api',
'hostname',
'user',
'allow',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_access_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'api' in local_var_params and local_var_params['api'] is not None: # noqa: E501
query_params.append(('api', local_var_params['api'])) # noqa: E501
collection_formats['api'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'allow' in local_var_params and local_var_params['allow'] is not None: # noqa: E501
query_params.append(('allow', local_var_params['allow'])) # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/access/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_access_get(self, **kwargs): # noqa: E501
"""api_v1_audits_access_get # noqa: E501
AccessAudits returns all access audits for the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_access_get(async_req=True)
>>> result = thread.get()
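Synchronous usage sketch (the parameter values are illustrative
assumptions):

>>> audits = api.api_v1_audits_access_get(limit=50, reverse=True)
>>> for audit in audits:
...     print(audit)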
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Type is the audit type.
:type type: str
:param rule_name: RuleNames are the rule names to filter by.
:type rule_name: list[str]
:param api: APIs are the APIs to filter by.
:type api: list[str]
:param hostname: Hosts are hosts to filter by.
:type hostname: list[str]
:param user: Users are users to filter by.
:type user: list[str]
:param allow: Allow indicates whether allowed requests should be shown.
:type allow: str
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_access_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_access_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_access_get # noqa: E501
AccessAudits returns all access audits for the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_access_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Type is the audit type.
:type type: str
:param rule_name: RuleNames are the rule names to filter by.
:type rule_name: list[str]
:param api: APIs are the APIs to filter by.
:type api: list[str]
:param hostname: Hosts are hosts to filter by.
:type hostname: list[str]
:param user: Users are users to filter by.
:type user: list[str]
:param allow: Allow indicates whether allowed requests should be shown.
:type allow: str
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'type',
'rule_name',
'api',
'hostname',
'user',
'allow',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_access_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'api' in local_var_params and local_var_params['api'] is not None: # noqa: E501
query_params.append(('api', local_var_params['api'])) # noqa: E501
collection_formats['api'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'allow' in local_var_params and local_var_params['allow'] is not None: # noqa: E501
query_params.append(('allow', local_var_params['allow'])) # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/access', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_admission_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_admission_download_get # noqa: E501
DownloadAdmissionAudits downloads the admission audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_admission_download_get(async_req=True)
>>> result = thread.get()
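Synchronous usage sketch; the seven-day window below is an illustrative
assumption:

>>> from datetime import datetime, timedelta
>>> api.api_v1_audits_admission_download_get(_from=datetime.utcnow() - timedelta(days=7))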
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param namespace: Namespaces is the list of namespaces to use for filtering.
:type namespace: list[str]
:param operation: Operations is the list of operations to use for filtering.
:type operation: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_admission_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_admission_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_admission_download_get # noqa: E501
DownloadAdmissionAudits downloads the admission audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_admission_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param namespace: Namespaces is the list of namespaces to use for filtering.
:type namespace: list[str]
:param operation: Operations is the list of operations to use for filtering.
:type operation: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'namespace',
'operation',
'cluster',
'attack_techniques'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_admission_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'operation' in local_var_params and local_var_params['operation'] is not None: # noqa: E501
query_params.append(('operation', local_var_params['operation'])) # noqa: E501
collection_formats['operation'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/admission/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_admission_get(self, **kwargs): # noqa: E501
"""api_v1_audits_admission_get # noqa: E501
AdmissionAudits returns all admission audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_admission_get(async_req=True)
>>> result = thread.get()
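Synchronous usage sketch (the namespace and cluster values are
illustrative assumptions):

>>> audits = api.api_v1_audits_admission_get(namespace=['kube-system'], cluster=['prod'])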
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param namespace: Namespaces is the list of namespaces to use for filtering.
:type namespace: list[str]
:param operation: Operations is the list of operations to use for filtering.
:type operation: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[AdmissionAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_admission_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_admission_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_admission_get # noqa: E501
AdmissionAudits returns all admission audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_admission_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param namespace: Namespaces is the list of namespaces to use for filtering.
:type namespace: list[str]
:param operation: Operations is the list of operations to use for filtering.
:type operation: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[AdmissionAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'namespace',
'operation',
'cluster',
'attack_techniques'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_admission_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'operation' in local_var_params and local_var_params['operation'] is not None: # noqa: E501
query_params.append(('operation', local_var_params['operation'])) # noqa: E501
collection_formats['operation'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[AdmissionAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/admission', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
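# A note on the 'multi' collection_formats set above: each element of a list
# parameter is serialized as its own repeated query key rather than one joined
# value. A minimal sketch of the resulting request line (assumes an `api`
# instance of this class and the generator's standard urlencoding; the values
# are illustrative):
#
#     api.api_v1_audits_admission_get(cluster=["prod", "staging"], limit=50)
#     # -> GET /api/v1/audits/admission?cluster=prod&cluster=staging&limit=50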
def api_v1_audits_firewall_app_app_embedded_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_download_get # noqa: E501
DownloadAppEmbeddedAppFirewallAudits downloads the embedded defender firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_app_embedded_download_get_with_http_info(**kwargs) # noqa: E501
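# Usage sketch for this download wrapper: passing _preload_content=False keeps
# the body as a raw urllib3 response so it can be written straight to disk.
# Assumptions: `api` is a constructed instance of this class and the endpoint
# returns a CSV stream, as the generator's download endpoints typically do;
# neither is confirmed by this file.
#
#     resp = api.api_v1_audits_firewall_app_app_embedded_download_get(
#         _preload_content=False)
#     with open("app_embedded_audits.csv", "wb") as fh:
#         fh.write(resp.data)  # .data returns the undecoded response body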
def api_v1_audits_firewall_app_app_embedded_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_download_get # noqa: E501
DownloadAppEmbeddedAppFirewallAudits downloads the embedded defender firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_app_embedded_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/app-embedded/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
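# The _request_timeout documented above accepts two shapes; a sketch with
# illustrative values only (assumes an `api` instance of this class):
#
#     api.api_v1_audits_firewall_app_app_embedded_download_get(
#         _request_timeout=30)          # single number: total timeout, seconds
#     api.api_v1_audits_firewall_app_app_embedded_download_get(
#         _request_timeout=(3.05, 27))  # (connection, read) timeout pair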
def api_v1_audits_firewall_app_app_embedded_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_get # noqa: E501
AppEmbeddedAppFirewallAudits returns all embedded defender firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedAppFirewallAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_app_embedded_get_with_http_info(**kwargs) # noqa: E501
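# The async_req=True path shown in the docstring returns a thread-pool handle
# whose .get() blocks until the response arrives; a sketch fanning out one
# query per cluster (assumes an `api` instance; cluster names are
# illustrative):
#
#     pending = [
#         api.api_v1_audits_firewall_app_app_embedded_get(
#             cluster=[name], async_req=True)
#         for name in ("prod", "staging")
#     ]
#     results = [thread.get() for thread in pending]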
def api_v1_audits_firewall_app_app_embedded_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_get # noqa: E501
AppEmbeddedAppFirewallAudits returns all embedded defender firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedAppFirewallAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_app_embedded_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedAppFirewallAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/app-embedded', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
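# The *_with_http_info variant returns the (data, status_code, headers) triple
# from its :rtype: line instead of just the payload; a consumption sketch
# (assumes an `api` instance; the Total-Count header name is an assumption,
# not confirmed by this file):
#
#     audits, status, headers = (
#         api.api_v1_audits_firewall_app_app_embedded_get_with_http_info(
#             limit=100))
#     if status == 200:
#         total = headers.get("Total-Count")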
def api_v1_audits_firewall_app_app_embedded_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_timeslice_get # noqa: E501
AppEmbeddedAppFirewallAuditTimeslice returns embedded firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_app_embedded_timeslice_get_with_http_info(**kwargs) # noqa: E501
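# Timeslice queries return one bucket per interval across the requested window;
# a sketch pairing `buckets` with the `_from`/`to` filters (assumes an `api`
# instance and the generator's usual ISO-8601 datetime serialization):
#
#     from datetime import datetime, timedelta, timezone
#     now = datetime.now(timezone.utc)
#     slices = api.api_v1_audits_firewall_app_app_embedded_timeslice_get(
#         _from=now - timedelta(hours=24), to=now, buckets=24)
#     # -> list[TypesAuditTimeslice], roughly one bucket per hour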
def api_v1_audits_firewall_app_app_embedded_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_app_embedded_timeslice_get # noqa: E501
AppEmbeddedAppFirewallAuditTimeslice returns embedded firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_app_embedded_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_app_embedded_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/app-embedded/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
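# Usage sketch (illustrative, not part of the generated client): fetching
# app-embedded firewall audit buckets via the public wrapper for the
# '/api/v1/audits/firewall/app/app-embedded/timeslice' endpoint above.
# `api` is an assumed authenticated client instance (as in the docstring
# `>>>` examples); the method name follows the path-derived naming used
# throughout this module, and the filter values are hypothetical.
#
#     import datetime
#     slices = api.api_v1_audits_firewall_app_app_embedded_timeslice_get(
#         _from=datetime.datetime(2023, 1, 1),
#         to=datetime.datetime(2023, 1, 2),
#         buckets=24)  # list[TypesAuditTimeslice], one bucket per hour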
def api_v1_audits_firewall_app_container_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_download_get # noqa: E501
DownloadContainerAppFirewallAudits downloads the container firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_container_download_get_with_http_info(**kwargs) # noqa: E501
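# Usage sketch (illustrative): unrecognized keyword arguments are rejected
# by the validation loop in the *_with_http_info method below, so a typo in
# a filter name fails fast instead of being silently dropped:
#
#     try:
#         api.api_v1_audits_firewall_app_container_download_get(hostnme=['x'])
#     except ApiTypeError:
#         pass  # "Got an unexpected keyword argument 'hostnme' to method ..."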
def api_v1_audits_firewall_app_container_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_download_get # noqa: E501
DownloadContainerAppFirewallAudits downloads the container firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_container_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/container/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
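# Usage sketch (illustrative): streaming the audit download instead of
# decoding it. Per the docstring above, `_preload_content=False` returns the
# urllib3.HTTPResponse unread; `api` is an assumed authenticated client
# instance, and the output filename (and its format) are assumptions.
#
#     resp = api.api_v1_audits_firewall_app_container_download_get(
#         _preload_content=False)
#     with open('container_firewall_audits.csv', 'wb') as fh:
#         fh.write(resp.data)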
def api_v1_audits_firewall_app_container_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_get # noqa: E501
ContainerAppFirewallAudits returns all container firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedAppFirewallAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_container_get_with_http_info(**kwargs) # noqa: E501
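# Usage sketch (illustrative): the wrapper above forces
# `_return_http_data_only=True` and yields just the deserialized body; call
# the *_with_http_info variant below directly when the status code and
# headers are also needed. `api` is an assumed client instance.
#
#     audits, status, headers = \
#         api.api_v1_audits_firewall_app_container_get_with_http_info(limit=10)
#     assert status == 200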
def api_v1_audits_firewall_app_container_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_get # noqa: E501
ContainerAppFirewallAudits returns all container firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedAppFirewallAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_container_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedAppFirewallAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/container', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
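# Usage sketch (illustrative): list-valued filters are sent with the 'multi'
# collection format (one query pair per element) under their camelCase names,
# as built above; scalar params like `effect` become a single pair. The
# filter values are hypothetical, and `api` is an assumed client instance.
#
#     audits = api.api_v1_audits_firewall_app_container_get(
#         hostname=['node-1', 'node-2'],  # -> hostname=node-1&hostname=node-2
#         attack_techniques=['T1190'],    # -> attackTechniques=T1190
#         effect='block')                 # -> effect=block
#
#     thread = api.api_v1_audits_firewall_app_container_get(async_req=True)
#     audits = thread.get()  # per the docstring, async_req returns a thread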
def api_v1_audits_firewall_app_container_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_timeslice_get # noqa: E501
ContainerAppFirewallAuditTimeslice returns container firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_container_timeslice_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_firewall_app_container_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_container_timeslice_get # noqa: E501
ContainerAppFirewallAuditTimeslice returns container firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_container_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_container_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/container/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
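# Usage sketch (illustrative, not generated code): fetching hourly audit
# buckets from the container timeslice endpoint above. `api` is assumed to be
# an instance of this API class; the plain getter name follows the generator's
# wrapper/with_http_info naming convention.
#
#   import datetime
#   now = datetime.datetime.utcnow()
#   slices = api.api_v1_audits_firewall_app_container_timeslice_get(
#       _from=now - datetime.timedelta(hours=24),
#       to=now,
#       buckets=24,                       # one bucket per hour
#       type=['sqli', 'xss'])             # hypothetical audit-type values
#   # Each element is a TypesAuditTimeslice covering one slice of the window.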
def api_v1_audits_firewall_app_host_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_download_get # noqa: E501
DownloadHostAppFirewallAudits downloads the host firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_host_download_get_with_http_info(**kwargs) # noqa: E501
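# Usage sketch (illustrative): this download endpoint declares no response
# type (its response_types_map is empty), so the raw payload is most easily
# obtained by disabling preloading, which returns the urllib3.HTTPResponse
# documented above. The file name and CSV assumption are hypothetical.
#
#   resp = api.api_v1_audits_firewall_app_host_download_get(
#       hostname=['prod-host-1'],         # hypothetical host filter
#       _preload_content=False)
#   with open('host_firewall_audits.csv', 'wb') as fh:
#       fh.write(resp.data)               # raw response body, undecoded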
def api_v1_audits_firewall_app_host_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_download_get # noqa: E501
DownloadHostAppFirewallAudits downloads the host firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without the
HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_host_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/host/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
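# Usage sketch (illustrative): _request_timeout accepts either a single
# number (total timeout) or a (connection, read) tuple, exactly as the
# docstring above describes.
#
#   api.api_v1_audits_firewall_app_host_download_get_with_http_info(
#       _request_timeout=(3.05, 27))      # 3.05s to connect, 27s to read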
def api_v1_audits_firewall_app_host_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_get # noqa: E501
HostAppFirewallAudits returns all host firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedAppFirewallAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_host_get_with_http_info(**kwargs) # noqa: E501
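# Usage sketch (illustrative): every list-typed filter on this endpoint is
# registered with the 'multi' collection format, so each list element becomes
# a repeated query parameter (e.g. ?country=US&country=DE). Filter values
# here are hypothetical.
#
#   audits = api.api_v1_audits_firewall_app_host_get(
#       country=['US', 'DE'],
#       method=['POST'],
#       limit=50)
#   for audit in audits:                  # list[SharedAppFirewallAudit]
#       print(audit)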
def api_v1_audits_firewall_app_host_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_get # noqa: E501
HostAppFirewallAudits returns all host firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without the
HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedAppFirewallAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_host_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedAppFirewallAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/host', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
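# Usage sketch (illustrative): the *_with_http_info variant returns the
# deserialized body together with the HTTP status code and headers, matching
# the tuple :rtype: above. The 'Total-Count' header name is a hypothetical
# example of response metadata.
#
#   data, status, headers = \
#       api.api_v1_audits_firewall_app_host_get_with_http_info(limit=10)
#   if status == 200:
#       print(len(data), headers.get('Total-Count'))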
def api_v1_audits_firewall_app_host_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_timeslice_get # noqa: E501
HostAppFirewallAuditTimeslice returns host firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_host_timeslice_get_with_http_info(**kwargs) # noqa: E501
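# Usage sketch (illustrative): as the docstring's doctest shows, passing
# async_req=True returns a thread-like object; get() blocks until the
# timeslice response is available.
#
#   thread = api.api_v1_audits_firewall_app_host_timeslice_get(
#       buckets=12, async_req=True)
#   timeslices = thread.get()             # list[TypesAuditTimeslice]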
def api_v1_audits_firewall_app_host_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_host_timeslice_get # noqa: E501
HostAppFirewallAuditTimeslice returns host firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_host_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app-embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without the
HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_host_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/host/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
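# The sync/async calling convention below is shared by every endpoint wrapper in
# this class: a plain call blocks and returns the deserialized result, while
# async_req=True returns a thread-like handle whose .get() yields the same value.
# A minimal sketch, assuming this class is instantiated as `api` on a configured
# ApiClient (instantiation details are not shown in this file):
#
#     thread = api.api_v1_audits_firewall_app_serverless_get(async_req=True, limit=10)
#     audits = thread.get()  # blocks here instead of at call time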
def api_v1_audits_firewall_app_serverless_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_download_get # noqa: E501
DownloadServerlessAppFirewallAudits downloads the serverless firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_serverless_download_get_with_http_info(**kwargs) # noqa: E501
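# Usage sketch for the download variant: it declares no response type
# (response_types_map is empty in the *_with_http_info body below), so passing
# _preload_content=False yields the raw urllib3.HTTPResponse and the payload can
# be written to disk. The file name and the assumption that the payload is CSV
# are illustrative, not confirmed by this client:
#
#     resp = api.api_v1_audits_firewall_app_serverless_download_get(
#         effect='block', _preload_content=False)
#     with open('serverless-audits.csv', 'wb') as out:
#         out.write(resp.data)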
def api_v1_audits_firewall_app_serverless_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_download_get # noqa: E501
DownloadServerlessAppFirewallAudits downloads the serverless firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_serverless_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/serverless/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
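# Note on list-valued filters: every parameter registered above with
# collection_formats[...] = 'multi' is serialized by the ApiClient as a repeated
# query key, so country=['US', 'DE'] becomes ...?country=US&country=DE (the
# exact encoding is delegated to api_client.call_api; this is the conventional
# openapi-generator behavior for the 'multi' collection format).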
def api_v1_audits_firewall_app_serverless_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_get # noqa: E501
ServerlessAppFirewallAudits returns all serverless firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedAppFirewallAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_serverless_get_with_http_info(**kwargs) # noqa: E501
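# Minimal query sketch (the seven-day window and filter values are illustrative
# assumptions; `api` is a configured instance of this class):
#
#     from datetime import datetime, timedelta
#     audits = api.api_v1_audits_firewall_app_serverless_get(
#         _from=datetime.utcnow() - timedelta(days=7),
#         effect='block',
#         country=['US', 'DE'],  # sent as repeated query keys, see note above
#         limit=50)
#     # audits is a list[SharedAppFirewallAudit] per the docstring above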
def api_v1_audits_firewall_app_serverless_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_get # noqa: E501
ServerlessAppFirewallAudits returns all serverless firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedAppFirewallAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_serverless_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedAppFirewallAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/serverless', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
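# When the status code or headers matter, call the *_with_http_info variant
# directly; unless _return_http_data_only is set it returns a three-tuple, per
# its :rtype: above. A sketch (assuming `api` as before):
#
#     data, status, headers = \
#         api.api_v1_audits_firewall_app_serverless_get_with_http_info(limit=10)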
def api_v1_audits_firewall_app_serverless_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_timeslice_get # noqa: E501
ServerlessAppFirewallAuditTimeslice returns serverless firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_app_serverless_timeslice_get_with_http_info(**kwargs) # noqa: E501
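# Bucketed-counts sketch: the timeslice endpoint groups audits over the query
# timeframe into `buckets` slices (one bucket per hour below is an assumed
# choice, not a server default; `api` is an assumption as above):
#
#     from datetime import datetime, timedelta
#     slices = api.api_v1_audits_firewall_app_serverless_timeslice_get(
#         _from=datetime.utcnow() - timedelta(hours=24), buckets=24)
#     # slices is a list[TypesAuditTimeslice] per the docstring above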
def api_v1_audits_firewall_app_serverless_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_app_serverless_timeslice_get # noqa: E501
ServerlessAppFirewallAuditTimeslice returns serverless firewall audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_app_serverless_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param image_name: Images is the image names filter.
:type image_name: list[str]
:param container_name: Containers is the container names filter.
:type container_name: list[str]
:param hostname: Hosts is the hostnames filter.
:type hostname: list[str]
:param rule_name: RuleNames is the rule names filter.
:type rule_name: list[str]
:param type: Types is the firewall audit type filter.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect.
:type effect: str
:param rule_app_id: RuleAppIDs is the rule app IDs filter.
:type rule_app_id: list[str]
:param function: FunctionName is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param ns: Namespaces is the list of namespaces to use for filtering.
:type ns: list[str]
:param app_id: AppIDs is the app embedded appID filter.
:type app_id: list[str]
:param subnet: Subnets is the source IPs filter.
:type subnet: list[str]
:param connecting_ips: ConnectingIPs is the connecting IPs filter.
:type connecting_ips: list[str]
:param country: Countries is the source IP country filter.
:type country: list[str]
:param user_agent_header: UserAgents is the user agent header filter.
:type user_agent_header: list[str]
:param url: URLs is the URL filter.
:type url: list[str]
:param request_host: RequestHosts is the request host filter.
:type request_host: list[str]
:param url_path: Paths is the URL path filter.
:type url_path: list[str]
:param url_query: Queries is the URL query filter.
:type url_query: list[str]
:param method: Methods is the request method filter.
:type method: list[str]
:param request_header_names: RequestHeaderNames is the request header names filter.
:type request_header_names: list[str]
:param os: OS is the OS filter.
:type os: list[str]
:param msg: Messages is the audit message text filter.
:type msg: list[str]
:param cluster: Cluster is the audit cluster filter.
:type cluster: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param protection: Protections is the firewall audit protection type filter.
:type protection: list[str]
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'image_name',
'container_name',
'hostname',
'rule_name',
'type',
'effect',
'rule_app_id',
'function',
'region',
'runtime',
'ns',
'app_id',
'subnet',
'connecting_ips',
'country',
'user_agent_header',
'url',
'request_host',
'url_path',
'url_query',
'method',
'request_header_names',
'os',
'msg',
'cluster',
'attack_techniques',
'aggregate',
'protection',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_app_serverless_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container_name' in local_var_params and local_var_params['container_name'] is not None: # noqa: E501
query_params.append(('containerName', local_var_params['container_name'])) # noqa: E501
collection_formats['containerName'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
if 'rule_app_id' in local_var_params and local_var_params['rule_app_id'] is not None: # noqa: E501
query_params.append(('ruleAppID', local_var_params['rule_app_id'])) # noqa: E501
collection_formats['ruleAppID'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'ns' in local_var_params and local_var_params['ns'] is not None: # noqa: E501
query_params.append(('ns', local_var_params['ns'])) # noqa: E501
collection_formats['ns'] = 'multi' # noqa: E501
if 'app_id' in local_var_params and local_var_params['app_id'] is not None: # noqa: E501
query_params.append(('appID', local_var_params['app_id'])) # noqa: E501
collection_formats['appID'] = 'multi' # noqa: E501
if 'subnet' in local_var_params and local_var_params['subnet'] is not None: # noqa: E501
query_params.append(('subnet', local_var_params['subnet'])) # noqa: E501
collection_formats['subnet'] = 'multi' # noqa: E501
if 'connecting_ips' in local_var_params and local_var_params['connecting_ips'] is not None: # noqa: E501
query_params.append(('connectingIPs', local_var_params['connecting_ips'])) # noqa: E501
collection_formats['connectingIPs'] = 'multi' # noqa: E501
if 'country' in local_var_params and local_var_params['country'] is not None: # noqa: E501
query_params.append(('country', local_var_params['country'])) # noqa: E501
collection_formats['country'] = 'multi' # noqa: E501
if 'user_agent_header' in local_var_params and local_var_params['user_agent_header'] is not None: # noqa: E501
query_params.append(('userAgentHeader', local_var_params['user_agent_header'])) # noqa: E501
collection_formats['userAgentHeader'] = 'multi' # noqa: E501
if 'url' in local_var_params and local_var_params['url'] is not None: # noqa: E501
query_params.append(('url', local_var_params['url'])) # noqa: E501
collection_formats['url'] = 'multi' # noqa: E501
if 'request_host' in local_var_params and local_var_params['request_host'] is not None: # noqa: E501
query_params.append(('requestHost', local_var_params['request_host'])) # noqa: E501
collection_formats['requestHost'] = 'multi' # noqa: E501
if 'url_path' in local_var_params and local_var_params['url_path'] is not None: # noqa: E501
query_params.append(('urlPath', local_var_params['url_path'])) # noqa: E501
collection_formats['urlPath'] = 'multi' # noqa: E501
if 'url_query' in local_var_params and local_var_params['url_query'] is not None: # noqa: E501
query_params.append(('urlQuery', local_var_params['url_query'])) # noqa: E501
collection_formats['urlQuery'] = 'multi' # noqa: E501
if 'method' in local_var_params and local_var_params['method'] is not None: # noqa: E501
query_params.append(('method', local_var_params['method'])) # noqa: E501
collection_formats['method'] = 'multi' # noqa: E501
if 'request_header_names' in local_var_params and local_var_params['request_header_names'] is not None: # noqa: E501
query_params.append(('requestHeaderNames', local_var_params['request_header_names'])) # noqa: E501
collection_formats['requestHeaderNames'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'protection' in local_var_params and local_var_params['protection'] is not None: # noqa: E501
query_params.append(('protection', local_var_params['protection'])) # noqa: E501
collection_formats['protection'] = 'multi' # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/app/serverless/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
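# Usage sketch (hedged): assumes `api` is an instance of this generated API
# class, already wired to a configured ApiClient. The plain wrapper name
# below follows the generator's path-based naming convention; its definition
# sits above this excerpt, so treat the name and the filter values as
# assumptions.
#
#   from datetime import datetime, timedelta
#
#   slices = api.api_v1_audits_firewall_app_serverless_timeslice_get(
#       _from=datetime.utcnow() - timedelta(days=7),
#       to=datetime.utcnow(),
#       attack_techniques=['exploitation'],  # hypothetical technique value
#       buckets=24)                          # hypothetical bucket count
#   # -> list[TypesAuditTimeslice], since the plain wrapper forces
#   #    _return_http_data_only=True.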
def api_v1_audits_firewall_network_container_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_container_download_get # noqa: E501
DownloadContainerNetworkFirewallAudits downloads the container network firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_container_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_image_name: SrcImages is the source images filter.
:type src_image_name: list[str]
:param dst_image_name: DstImages is the destination images filter.
:type dst_image_name: list[str]
:param block: Block is the block/audit filter.
:type block: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_network_container_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_firewall_network_container_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_container_download_get # noqa: E501
DownloadContainerNetworkFirewallAudits downloads the container network firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_container_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_image_name: SrcImages is the source images filter.
:type src_image_name: list[str]
:param dst_image_name: DstImages is the destination images filter.
:type dst_image_name: list[str]
:param block: Block is the block/audit filter.
:type block: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'src_image_name',
'dst_image_name',
'block'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_network_container_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'src_image_name' in local_var_params and local_var_params['src_image_name'] is not None: # noqa: E501
query_params.append(('srcImageName', local_var_params['src_image_name'])) # noqa: E501
collection_formats['srcImageName'] = 'multi' # noqa: E501
if 'dst_image_name' in local_var_params and local_var_params['dst_image_name'] is not None: # noqa: E501
query_params.append(('dstImageName', local_var_params['dst_image_name'])) # noqa: E501
collection_formats['dstImageName'] = 'multi' # noqa: E501
if 'block' in local_var_params and local_var_params['block'] is not None: # noqa: E501
query_params.append(('block', local_var_params['block'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/network/container/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
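# Download sketch (hedged): download endpoints declare an empty
# response_types_map, so nothing is deserialized; a common pattern is to
# disable preloading and persist the raw bytes yourself. `api` is assumed to
# be an instance of this class, and the .csv extension is an assumption
# about the export format.
#
#   raw = api.api_v1_audits_firewall_network_container_download_get(
#       block='true',              # block/audit filter value is an assumption
#       _preload_content=False)    # returns the raw urllib3-style response
#   with open('container-fw-audits.csv', 'wb') as fh:
#       fh.write(raw.data)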
def api_v1_audits_firewall_network_container_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_container_get # noqa: E501
ContainerNetworkFirewallAudits returns all container network firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_container_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_image_name: SrcImages is the source images filter.
:type src_image_name: list[str]
:param dst_image_name: DstImages is the destination images filter.
:type dst_image_name: list[str]
:param block: Block is the block/audit filter.
:type block: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedContainerNetworkFirewallProfileAudits]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_network_container_get_with_http_info(**kwargs) # noqa: E501
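# Status-code sketch (hedged): this wrapper forces _return_http_data_only=True
# and yields only the decoded body; call the _with_http_info variant defined
# next when the HTTP status and headers are also needed.
#
#   data, status, headers = \
#       api.api_v1_audits_firewall_network_container_get_with_http_info()
#   assert status == 200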
def api_v1_audits_firewall_network_container_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_container_get # noqa: E501
ContainerNetworkFirewallAudits returns all container network firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_container_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_image_name: SrcImages is the source images filter.
:type src_image_name: list[str]
:param dst_image_name: DstImages is the destination images filter.
:type dst_image_name: list[str]
:param block: Block is the block/audit filter.
:type block: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedContainerNetworkFirewallProfileAudits], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'src_image_name',
'dst_image_name',
'block'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_network_container_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'src_image_name' in local_var_params and local_var_params['src_image_name'] is not None: # noqa: E501
query_params.append(('srcImageName', local_var_params['src_image_name'])) # noqa: E501
collection_formats['srcImageName'] = 'multi' # noqa: E501
if 'dst_image_name' in local_var_params and local_var_params['dst_image_name'] is not None: # noqa: E501
query_params.append(('dstImageName', local_var_params['dst_image_name'])) # noqa: E501
collection_formats['dstImageName'] = 'multi' # noqa: E501
if 'block' in local_var_params and local_var_params['block'] is not None: # noqa: E501
query_params.append(('block', local_var_params['block'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedContainerNetworkFirewallProfileAudits]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/network/container', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
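# Async sketch (hedged): per the docstring, async_req=True returns a
# thread-like handle whose .get() blocks for the result; `api` is assumed
# constructed elsewhere.
#
#   thread = api.api_v1_audits_firewall_network_container_get(
#       src_image_name=['nginx:latest'],  # serialized as repeated srcImageName
#       async_req=True)                   # ('multi' collection format)
#   audits = thread.get()  # list[SharedContainerNetworkFirewallProfileAudits]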
def api_v1_audits_firewall_network_host_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_host_download_get # noqa: E501
DownloadHostNetworkFirewallAudits downloads the host network firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_host_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_hostnames: SrcHostnames is the source hostnames filter.
:type src_hostnames: list[str]
:param dst_hostnames: DstHostnames is the destination hostnames filter.
:type dst_hostnames: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_network_host_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_firewall_network_host_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_host_download_get # noqa: E501
DownloadHostNetworkFirewallAudits downloads the host network firewall audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_host_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_hostnames: SrcHostnames is the source hostnames filter.
:type src_hostnames: list[str]
:param dst_hostnames: DstHostnames is the destination hostnames filter.
:type dst_hostnames: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'src_hostnames',
'dst_hostnames'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_network_host_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'src_hostnames' in local_var_params and local_var_params['src_hostnames'] is not None: # noqa: E501
query_params.append(('srcHostnames', local_var_params['src_hostnames'])) # noqa: E501
collection_formats['srcHostnames'] = 'multi' # noqa: E501
if 'dst_hostnames' in local_var_params and local_var_params['dst_hostnames'] is not None: # noqa: E501
query_params.append(('dstHostnames', local_var_params['dst_hostnames'])) # noqa: E501
collection_formats['dstHostnames'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/firewall/network/host/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
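# Timeout sketch (hedged): _request_timeout takes either a single number
# (total timeout) or a (connection, read) tuple, as documented above; the
# hostname value is hypothetical.
#
#   api.api_v1_audits_firewall_network_host_download_get(
#       src_hostnames=['web-01'],       # hypothetical hostname
#       _request_timeout=(3.05, 30),    # 3.05 s to connect, 30 s to read
#       _preload_content=False)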
def api_v1_audits_firewall_network_host_get(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_host_get # noqa: E501
HostNetworkFirewallAudits returns all host network firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_host_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_hostnames: SrcHostnames is the source hostnames filter.
:type src_hostnames: list[str]
:param dst_hostnames: DstHostnames is the destination hostnames filter.
:type dst_hostnames: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedHostNetworkFirewallProfileAudits]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_firewall_network_host_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_firewall_network_host_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_firewall_network_host_get # noqa: E501
HostNetworkFirewallAudits returns all host network firewall audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_firewall_network_host_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audits.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audits.
:type to: datetime
:param src_hostnames: SrcHostnames is the source hostnames filter.
:type src_hostnames: list[str]
:param dst_hostnames: DstHostnames is the destination hostnames filter.
:type dst_hostnames: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedHostNetworkFirewallProfileAudits], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'src_hostnames',
'dst_hostnames'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_firewall_network_host_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'src_hostnames' in local_var_params and local_var_params['src_hostnames'] is not None: # noqa: E501
query_params.append(('srcHostnames', local_var_params['src_hostnames'])) # noqa: E501
collection_formats['srcHostnames'] = 'multi' # noqa: E501
if 'dst_hostnames' in local_var_params and local_var_params['dst_hostnames'] is not None: # noqa: E501
query_params.append(('dstHostnames', local_var_params['dst_hostnames'])) # noqa: E501
collection_formats['dstHostnames'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedHostNetworkFirewallProfileAudits]",
}
return self.api_client.call_api(
'/api/v1/audits/firewall/network/host', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
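# Time-window sketch (hedged): the reserved word `from` is exposed as the
# `_from` keyword but is sent on the wire as the plain `from` query
# parameter, while `to` passes through unchanged; the 'time' sort key is an
# assumption.
#
#   from datetime import datetime
#
#   audits = api.api_v1_audits_firewall_network_host_get(
#       _from=datetime(2023, 1, 1),
#       to=datetime(2023, 1, 31),
#       sort='time', reverse=True)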
def api_v1_audits_incidents_acknowledge_id_patch(self, id, **kwargs): # noqa: E501
"""api_v1_audits_incidents_acknowledge_id_patch # noqa: E501
SetIncidentAcknowledge sets the given incident's acknowledgement status # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_acknowledge_id_patch(id, async_req=True)
>>> result = thread.get()
:param id: (required)
:type id: str
:param shared_incident:
:type shared_incident: SharedIncident
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_incidents_acknowledge_id_patch_with_http_info(id, **kwargs) # noqa: E501
def api_v1_audits_incidents_acknowledge_id_patch_with_http_info(self, id, **kwargs): # noqa: E501
"""api_v1_audits_incidents_acknowledge_id_patch # noqa: E501
SetIncidentAcknowledge sets the given incident's acknowledgement status # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_acknowledge_id_patch_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: (required)
:type id: str
:param shared_incident:
:type shared_incident: SharedIncident
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id',
'shared_incident'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_incidents_acknowledge_id_patch" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `api_v1_audits_incidents_acknowledge_id_patch`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'shared_incident' in local_var_params:
body_params = local_var_params['shared_incident']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/incidents/acknowledge/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
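# Acknowledge sketch (hedged): the PATCH body is a SharedIncident model; the
# `acknowledged` field name below is an assumption, since the model
# definition is not part of this file.
#
#   incident = SharedIncident(acknowledged=True)  # hypothetical field name
#   api.api_v1_audits_incidents_acknowledge_id_patch(
#       'incident-id',                # hypothetical incident ID
#       shared_incident=incident)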
def api_v1_audits_incidents_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_incidents_download_get # noqa: E501
DownloadIncidents downloads incidents according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: Filters results from a start datetime.
:type _from: datetime
:param to: Filters results up to an end datetime.
:type to: datetime
:param hostname: Filters results by hostname where the incident occurred.
:type hostname: list[str]
:param category: Filters results by incident category.
:type category: list[str]
:param type: Filters results by incident type.
:type type: list[str]
:param profile_id: Filters results by runtime profile ID.
:type profile_id: list[str]
:param acknowledged: Filters results by incidents that have been acknowledged.
:type acknowledged: str
:param region: Filters results by region (for functions).
:type region: list[str]
:param cluster: Filters results by cluster name.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_incidents_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_incidents_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_incidents_download_get # noqa: E501
DownloadIncidents downloads incidents according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: Filters results from a start datetime.
:type _from: datetime
:param to: Filters results up to an end datetime.
:type to: datetime
:param hostname: Filters results by hostname where the incident occurred.
:type hostname: list[str]
:param category: Filters results by incident category.
:type category: list[str]
:param type: Filters results by incident type.
:type type: list[str]
:param profile_id: Filters results by runtime profile ID.
:type profile_id: list[str]
:param acknowledged: Filters results by incidents that have been acknowledged.
:type acknowledged: str
:param region: Filters results by region (for functions).
:type region: list[str]
:param cluster: Filters results by cluster name.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'category',
'type',
'profile_id',
'acknowledged',
'region',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_incidents_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'category' in local_var_params and local_var_params['category'] is not None: # noqa: E501
query_params.append(('category', local_var_params['category'])) # noqa: E501
collection_formats['category'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if 'acknowledged' in local_var_params and local_var_params['acknowledged'] is not None: # noqa: E501
query_params.append(('acknowledged', local_var_params['acknowledged'])) # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/incidents/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
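# Validation sketch (hedged): unknown keyword arguments are rejected
# client-side with ApiTypeError before any HTTP request is issued, as the
# loop over local_var_params['kwargs'] above shows.
#
#   try:
#       api.api_v1_audits_incidents_download_get(bogus=1)
#   except ApiTypeError as exc:
#       print(exc)  # Got an unexpected keyword argument 'bogus' ...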
def api_v1_audits_incidents_get(self, **kwargs): # noqa: E501
"""api_v1_audits_incidents_get # noqa: E501
Incidents returns all incidents according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: Filters results from a start datetime.
:type _from: datetime
:param to: Filters results up to an end datetime.
:type to: datetime
:param hostname: Filters results by hostname where the incident occurred.
:type hostname: list[str]
:param category: Filters results by incident category.
:type category: list[str]
:param type: Filters results by incident type.
:type type: list[str]
:param profile_id: Filters results by runtime profile ID.
:type profile_id: list[str]
:param acknowledged: Filters results by incidents that have been acknowledged.
:type acknowledged: str
:param region: Filters results by region (for functions).
:type region: list[str]
:param cluster: Filters results by cluster name.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedIncident]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_incidents_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_incidents_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_incidents_get # noqa: E501
Incidents returns all incidents according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_incidents_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: Filters results from a start datetime.
:type _from: datetime
:param to: Filters results up to an end datetime.
:type to: datetime
:param hostname: Filters results by hostname where the incident occurred.
:type hostname: list[str]
:param category: Filters results by incident category.
:type category: list[str]
:param type: Filters results by incident type.
:type type: list[str]
:param profile_id: Filters results by runtime profile ID.
:type profile_id: list[str]
:param acknowledged: Filters results by incidents that have been acknowledged.
:type acknowledged: str
:param region: Filters results by region (for functions).
:type region: list[str]
:param cluster: Filters results by cluster name.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: If False, the urllib3.HTTPResponse object is
returned without reading/decoding the response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. A single
number is treated as the total request timeout;
a pair (tuple) sets the (connection, read)
timeouts separately.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedIncident], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'category',
'type',
'profile_id',
'acknowledged',
'region',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_incidents_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'category' in local_var_params and local_var_params['category'] is not None: # noqa: E501
query_params.append(('category', local_var_params['category'])) # noqa: E501
collection_formats['category'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if 'acknowledged' in local_var_params and local_var_params['acknowledged'] is not None: # noqa: E501
query_params.append(('acknowledged', local_var_params['acknowledged'])) # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedIncident]",
}
return self.api_client.call_api(
'/api/v1/audits/incidents', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
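# Usage sketch (illustrative, not part of the generated client): assumes
# `api` is an instance of this API class built from a configured ApiClient.
#
#   incidents = api.api_v1_audits_incidents_get(limit=25, acknowledged=False)
#   for incident in incidents:
#       print(incident)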
def api_v1_audits_kubernetes_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_kubernetes_download_get # noqa: E501
DownloadKubernetesAudits downloads the Kubernetes audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_kubernetes_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param user: Users is the list of users to filter by.
:type user: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_kubernetes_download_get_with_http_info(**kwargs) # noqa: E501
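# Download sketch (illustrative): per the docstring, _preload_content=False
# makes the call return the raw urllib3.HTTPResponse, so the audit payload
# can be written to disk; `api` and the filename are assumptions.
#
#   resp = api.api_v1_audits_kubernetes_download_get(_preload_content=False)
#   with open("kubernetes_audits.out", "wb") as fh:
#       fh.write(resp.read())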
def api_v1_audits_kubernetes_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_kubernetes_download_get # noqa: E501
DownloadKubernetesAudits downloads the Kubernetes audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_kubernetes_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param user: Users is the list of users to filter by.
:type user: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'user',
'attack_techniques'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_kubernetes_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/kubernetes/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_kubernetes_get(self, **kwargs): # noqa: E501
"""api_v1_audits_kubernetes_get # noqa: E501
KubernetesAudits returns a list of Kubernetes audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_kubernetes_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param user: Users is the list of users to filter by.
:type user: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedKubernetesAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_kubernetes_get_with_http_info(**kwargs) # noqa: E501
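# Async sketch (illustrative), mirroring the doctest above: async_req=True
# returns a thread handle and thread.get() blocks for the result.
#
#   thread = api.api_v1_audits_kubernetes_get(async_req=True, limit=100)
#   audits = thread.get()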
def api_v1_audits_kubernetes_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_kubernetes_get # noqa: E501
KubernetesAudits returns a list of Kubernetes audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_kubernetes_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the activity.
:type _from: datetime
:param to: To is an optional maximum time constraint for the activity.
:type to: datetime
:param user: Users is the list of users to filter by.
:type user: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedKubernetesAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'user',
'attack_techniques'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_kubernetes_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedKubernetesAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/kubernetes', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
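# Sketch of the *_with_http_info variant (illustrative): per its :rtype:,
# it returns a (data, status_code, headers) tuple rather than the bare list.
#
#   audits, status, headers = api.api_v1_audits_kubernetes_get_with_http_info(limit=10)
#   assert status == 200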
def api_v1_audits_mgmt_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_download_get # noqa: E501
DownloadMgmtAudits downloads the management audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_mgmt_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_mgmt_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_download_get # noqa: E501
DownloadMgmtAudits downloads the management audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'type',
'username'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_mgmt_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'username' in local_var_params and local_var_params['username'] is not None: # noqa: E501
query_params.append(('username', local_var_params['username'])) # noqa: E501
collection_formats['username'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/mgmt/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
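# Time-window sketch (illustrative): `_from` and `to` take datetime objects
# per the docstring; a hypothetical last-24-hours window is shown.
#
#   from datetime import datetime, timedelta
#   now = datetime.utcnow()
#   api.api_v1_audits_mgmt_download_get(
#       _from=now - timedelta(hours=24), to=now, _preload_content=False)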
def api_v1_audits_mgmt_filters_get(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_filters_get # noqa: E501
MgmtAuditFilters returns container management audit filters according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_filters_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: TypesMgmtAuditFilters
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_mgmt_filters_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_mgmt_filters_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_filters_get # noqa: E501
MgmtAuditFilters returns container management audit filters according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_filters_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(TypesMgmtAuditFilters, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'type',
'username'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_mgmt_filters_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'username' in local_var_params and local_var_params['username'] is not None: # noqa: E501
query_params.append(('username', local_var_params['username'])) # noqa: E501
collection_formats['username'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "TypesMgmtAuditFilters",
}
return self.api_client.call_api(
'/api/v1/audits/mgmt/filters', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
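# Filters sketch (illustrative): the endpoint returns a TypesMgmtAuditFilters
# object describing the filter values available for management audits.
#
#   filters = api.api_v1_audits_mgmt_filters_get()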
def api_v1_audits_mgmt_get(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_get # noqa: E501
MgmtAudits returns all management audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedMgmtAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_mgmt_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_mgmt_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_mgmt_get # noqa: E501
MgmtAudits returns all management audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_mgmt_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param type: Types is the audit type filter.
:type type: list[str]
:param username: Usernames is the username filter.
:type username: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedMgmtAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'type',
'username'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_mgmt_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'username' in local_var_params and local_var_params['username'] is not None: # noqa: E501
query_params.append(('username', local_var_params['username'])) # noqa: E501
collection_formats['username'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedMgmtAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/mgmt', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
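# Multi-value filter sketch (illustrative, hypothetical values): list-typed
# parameters such as `type` and `username` use collection format 'multi'
# (see the query construction above), so each element becomes its own
# repeated query parameter in the request URL.
#
#   audits = api.api_v1_audits_mgmt_get(type=["login", "create"],
#                                       username=["alice"])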
def api_v1_audits_runtime_app_embedded_delete(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_delete # noqa: E501
DeleteAppEmbeddedRuntimeAudits deletes all embedded defender runtime audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_delete(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_app_embedded_delete_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_app_embedded_delete_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_delete # noqa: E501
DeleteAppEmbeddedRuntimeAudits deletes all embedded defender runtime audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_delete_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_app_embedded_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/app-embedded', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
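# Deletion sketch (illustrative): the DELETE endpoint takes no filter
# parameters and returns no body (:rtype: None).
#
#   api.api_v1_audits_runtime_app_embedded_delete()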
def api_v1_audits_runtime_app_embedded_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_download_get # noqa: E501
DownloadAppEmbeddedRuntimeAudits downloads the embedded defender audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_app_embedded_download_get_with_http_info(**kwargs) # noqa: E501
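# Filtered download sketch (illustrative, hypothetical values): several of
# the documented filters combined; the raw response is returned because
# _preload_content=False.
#
#   resp = api.api_v1_audits_runtime_app_embedded_download_get(
#       cluster=["prod"], effect=["block"], _preload_content=False)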
def api_v1_audits_runtime_app_embedded_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_download_get # noqa: E501
DownloadAppEmbeddedRuntimeAudits downloads the embedded defender audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE ATT&CK techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_app_embedded_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
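# The spec defines no per-endpoint auth scheme; credentials are expected
# to come from the ApiClient configuration (or a _request_auth override).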
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/app-embedded/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_app_embedded_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_get # noqa: E501
AppEmbeddedRuntimeAudits returns all embedded defender audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_get(async_req=True)
>>> result = thread.get()
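A minimal synchronous sketch (assuming ``api`` is an already-configured
instance of this API class; the filter values are illustrative):
>>> audits = api.api_v1_audits_runtime_app_embedded_get(
...     namespace=['prod'], limit=10)  # -> list[SharedRuntimeAudit]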
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedRuntimeAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_app_embedded_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_app_embedded_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_app_embedded_get # noqa: E501
AppEmbeddedRuntimeAudits returns all embedded defender audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_app_embedded_get_with_http_info(async_req=True)
>>> result = thread.get()
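A hedged sketch of consuming the full response triple (assuming ``api``
is a configured instance of this API class):
>>> data, status, headers = (
...     api.api_v1_audits_runtime_app_embedded_get_with_http_info(limit=5))
>>> status  # doctest: +SKIP
200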
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedRuntimeAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_app_embedded_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedRuntimeAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/app-embedded', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_container_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_download_get # noqa: E501
DownloadContainerRuntimeAudits downloads the runtime audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_download_get(async_req=True)
>>> result = thread.get()
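Because this endpoint streams a file (the deserialized result is None),
a hedged sketch is to keep the raw response and write it out yourself
(assuming ``api`` is a configured instance of this API class; the file
name is illustrative):
>>> resp = api.api_v1_audits_runtime_container_download_get(
...     _preload_content=False)
>>> open('container_audits.csv', 'wb').write(resp.data)  # doctest: +SKIP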
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_container_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_container_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_download_get # noqa: E501
DownloadContainerRuntimeAudits downloads the runtime audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_container_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/container/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_container_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_get # noqa: E501
ContainerRuntimeAudits returns container runtime audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_get(async_req=True)
>>> result = thread.get()
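A minimal sketch that pulls the last day of blocked container audits
(assuming ``api`` is a configured instance of this API class; the
filter values are illustrative):
>>> from datetime import datetime, timedelta
>>> audits = api.api_v1_audits_runtime_container_get(
...     _from=datetime.utcnow() - timedelta(days=1),
...     effect=['block'], limit=50)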
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedRuntimeAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_container_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_container_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_get # noqa: E501
ContainerRuntimeAudits returns container runtime audits # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_get_with_http_info(async_req=True)
>>> result = thread.get()
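A hedged sketch that also inspects the response headers (assuming
``api`` is a configured instance of this API class):
>>> audits, status, headers = (
...     api.api_v1_audits_runtime_container_get_with_http_info(limit=1))
>>> headers.get('Content-Type')  # doctest: +SKIP
'application/json'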
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespace filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is treated as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedRuntimeAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_container_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedRuntimeAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/container', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_container_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_timeslice_get # noqa: E501
ContainerRuntimeAuditsTimeslice returns container runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_timeslice_get(async_req=True)
>>> result = thread.get()
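A hypothetical synchronous call (assumes `api` is a configured instance of
this class; the filter values below are illustrative placeholders):
>>> timeslices = api.api_v1_audits_runtime_container_timeslice_get(
...     cluster=['demo-cluster'], buckets=24)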
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_container_timeslice_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_container_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_container_timeslice_get # noqa: E501
ContainerRuntimeAuditsTimeslice returns container runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_container_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
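A hypothetical call returning the deserialized data together with the HTTP
status code and headers (placeholder filter values, configured `api` assumed):
>>> data, status, headers = api.api_v1_audits_runtime_container_timeslice_get_with_http_info(
...     cluster=['demo-cluster'], buckets=24)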
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_container_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/container/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_file_integrity_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_file_integrity_download_get # noqa: E501
DownloadFileIntegrityEvents downloads the file integrity events according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_file_integrity_download_get(async_req=True)
>>> result = thread.get()
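A hypothetical download call; passing _preload_content=False is one way to
keep the raw urllib3 response for streaming (placeholder filter values):
>>> raw = api.api_v1_audits_runtime_file_integrity_download_get(
...     hostname=['demo-host'], _preload_content=False)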
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param path: Paths is the list of paths to use for filtering.
:type path: list[str]
:param event_type: EventTypes is the list of file integrity events to use for filtering.
:type event_type: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_file_integrity_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_file_integrity_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_file_integrity_download_get # noqa: E501
DownloadFileIntegrityEvents downloads the file integrity events according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_file_integrity_download_get_with_http_info(async_req=True)
>>> result = thread.get()
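A hypothetical call returning (data, status, headers); values are placeholders:
>>> _, status, headers = api.api_v1_audits_runtime_file_integrity_download_get_with_http_info(
...     cluster=['demo-cluster'])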
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param path: Paths is the list of paths to use for filtering.
:type path: list[str]
:param event_type: EventTypes is the list of file integrity events to use for filtering.
:type event_type: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'path',
'event_type',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_file_integrity_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'path' in local_var_params and local_var_params['path'] is not None: # noqa: E501
query_params.append(('path', local_var_params['path'])) # noqa: E501
collection_formats['path'] = 'multi' # noqa: E501
if 'event_type' in local_var_params and local_var_params['event_type'] is not None: # noqa: E501
query_params.append(('eventType', local_var_params['event_type'])) # noqa: E501
collection_formats['eventType'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/file-integrity/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_file_integrity_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_file_integrity_get # noqa: E501
FileIntegrityEvents returns the file integrity events # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_file_integrity_get(async_req=True)
>>> result = thread.get()
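A hypothetical synchronous call filtering by path and minimum event time
(configured `api` instance assumed; values are placeholders):
>>> import datetime
>>> events = api.api_v1_audits_runtime_file_integrity_get(
...     path=['/etc/passwd'], _from=datetime.datetime(2021, 1, 1))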
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param path: Paths is the list of paths to use for filtering.
:type path: list[str]
:param event_type: EventTypes is the list of file integrity events to use for filtering.
:type event_type: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedFileIntegrityEvent]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_file_integrity_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_file_integrity_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_file_integrity_get # noqa: E501
FileIntegrityEvents returns the file integrity events # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_file_integrity_get_with_http_info(async_req=True)
>>> result = thread.get()
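A hypothetical call returning the event list with status code and headers:
>>> events, status, headers = api.api_v1_audits_runtime_file_integrity_get_with_http_info(
...     path=['/etc/passwd'])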
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param path: Paths is the list of paths to use for filtering.
:type path: list[str]
:param event_type: EventTypes is the list of file integrity events to use for filtering.
:type event_type: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedFileIntegrityEvent], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'path',
'event_type',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_file_integrity_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'path' in local_var_params and local_var_params['path'] is not None: # noqa: E501
query_params.append(('path', local_var_params['path'])) # noqa: E501
collection_formats['path'] = 'multi' # noqa: E501
if 'event_type' in local_var_params and local_var_params['event_type'] is not None: # noqa: E501
query_params.append(('eventType', local_var_params['event_type'])) # noqa: E501
collection_formats['eventType'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedFileIntegrityEvent]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/file-integrity', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_host_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_download_get # noqa: E501
DownloadHostRuntimeAudits downloads the host audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_download_get(async_req=True)
>>> result = thread.get()
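A hypothetical download call; _preload_content=False keeps the raw urllib3
response rather than attempting to deserialize it (placeholder values):
>>> raw = api.api_v1_audits_runtime_host_download_get(
...     hostname=['demo-host'], _preload_content=False)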
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_host_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_host_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_download_get # noqa: E501
DownloadHostRuntimeAudits downloads the host audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_download_get_with_http_info(async_req=True)
>>> result = thread.get()
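A hypothetical call returning (data, status, headers); placeholder values:
>>> _, status, _ = api.api_v1_audits_runtime_host_download_get_with_http_info(
...     cluster=['demo-cluster'])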
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_host_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/host/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_host_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_get # noqa: E501
HostRuntimeAudits returns all host audits according to the matched profile and query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_get(async_req=True)
>>> result = thread.get()
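A hypothetical synchronous call (configured `api` instance assumed; the
filter values are illustrative placeholders):
>>> audits = api.api_v1_audits_runtime_host_get(
...     hostname=['demo-host'], limit=50)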
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedRuntimeAudit]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_host_get_with_http_info(**kwargs) # noqa: E501
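    # An illustrative call of the method above; `api` is assumed to be a
    # configured instance of this class, and the filter values (and the model
    # attribute names in the loop) are hypothetical:
    #
    #   audits = api.api_v1_audits_runtime_host_get(
    #       limit=50,
    #       hostname=['prod-node-1'],
    #       effect=['block'])
    #   for audit in audits:  # list[SharedRuntimeAudit]
    #       print(audit.hostname, audit.rule_name, audit.msg)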
def api_v1_audits_runtime_host_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_get # noqa: E501
HostRuntimeAudits returns all host audits according to the matched profile and query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
                               the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
                         be returned without reading/decoding response
                         data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
                      request; this effectively ignores the
                      authentication in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedRuntimeAudit], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_host_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedRuntimeAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/host', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
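    # When the HTTP status code or headers are needed, the *_with_http_info
    # variant above returns a (data, status_code, headers) tuple, per its
    # :rtype:. A minimal sketch, assuming `api` is a configured instance:
    #
    #   data, status, headers = api.api_v1_audits_runtime_host_get_with_http_info(limit=10)
    #   if status == 200:
    #       print('fetched %d host runtime audits' % len(data))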
def api_v1_audits_runtime_host_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_timeslice_get # noqa: E501
HostRuntimeAuditsTimeslice returns host runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_host_timeslice_get_with_http_info(**kwargs) # noqa: E501
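    # A sketch of bucketing the last 24 hours of host audits with the
    # timeslice endpoint above; `api` is assumed to be a configured instance,
    # and the bucket count is arbitrary:
    #
    #   from datetime import datetime, timedelta
    #
    #   now = datetime.utcnow()
    #   slices = api.api_v1_audits_runtime_host_timeslice_get(
    #       _from=now - timedelta(hours=24),
    #       to=now,
    #       buckets=24)  # one bucket per hour -> list[TypesAuditTimeslice]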
def api_v1_audits_runtime_host_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_host_timeslice_get # noqa: E501
HostRuntimeAuditsTimeslice returns host runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_host_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
                               the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
                         be returned without reading/decoding response
                         data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
                      request; this effectively ignores the
                      authentication in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_host_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/host/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
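    # The sort/reverse parameters documented above order the result set. A
    # hedged sketch: the sort key 'time' is an assumption based on the
    # filterable fields, and `api` is a configured instance:
    #
    #   slices = api.api_v1_audits_runtime_host_timeslice_get_with_http_info(
    #       sort='time', reverse=True)[0]  # take data from (data, status, headers)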
def api_v1_audits_runtime_log_inspection_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_log_inspection_download_get # noqa: E501
DownloadLogInspectionEvents downloads the log inspection events according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_log_inspection_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param logfile: Logfiles is the list of log files to use for filtering.
:type logfile: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_log_inspection_download_get_with_http_info(**kwargs) # noqa: E501
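    # The _request_timeout parameter documented above accepts a single number
    # (total timeout) or a (connection, read) tuple. A sketch, assuming `api`
    # is a configured instance and the cluster name is hypothetical:
    #
    #   api.api_v1_audits_runtime_log_inspection_download_get(
    #       cluster=['cluster-a'],
    #       _request_timeout=(3.05, 30))  # 3.05s to connect, 30s to read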
def api_v1_audits_runtime_log_inspection_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_log_inspection_download_get # noqa: E501
DownloadLogInspectionEvents downloads the log inspection events according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_log_inspection_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param logfile: Logfiles is the list of log files to use for filtering.
:type logfile: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
                               the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
                         be returned without reading/decoding response
                         data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
                      request; this effectively ignores the
                      authentication in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'logfile',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_log_inspection_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'logfile' in local_var_params and local_var_params['logfile'] is not None: # noqa: E501
query_params.append(('logfile', local_var_params['logfile'])) # noqa: E501
collection_formats['logfile'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/log-inspection/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_log_inspection_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_log_inspection_get # noqa: E501
LogInspectionEvents returns the log inspection events # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_log_inspection_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param logfile: Logfiles is the list of log files to use for filtering.
:type logfile: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedLogInspectionEvent]
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_log_inspection_get_with_http_info(**kwargs) # noqa: E501
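    # An illustrative log-inspection query for the method above; `api` is
    # assumed to be a configured instance and the host/logfile values are
    # hypothetical:
    #
    #   events = api.api_v1_audits_runtime_log_inspection_get(
    #       hostname=['web-01'],
    #       logfile=['/var/log/auth.log'])
    #   # events is a list[SharedLogInspectionEvent] per the :rtype: above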
def api_v1_audits_runtime_log_inspection_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_log_inspection_get # noqa: E501
LogInspectionEvents returns the log inspection events # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_log_inspection_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the event.
:type _from: datetime
:param to: To is an optional maximum time constraint for the event.
:type to: datetime
:param hostname: Hosts is the list of hosts to use for filtering.
:type hostname: list[str]
:param logfile: Logfiles is the list of log files to use for filtering.
:type logfile: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
                               the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
                         be returned without reading/decoding response
                         data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
                      request; this effectively ignores the
                      authentication in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedLogInspectionEvent], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'hostname',
'logfile',
'cluster'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_log_inspection_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'logfile' in local_var_params and local_var_params['logfile'] is not None: # noqa: E501
query_params.append(('logfile', local_var_params['logfile'])) # noqa: E501
collection_formats['logfile'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedLogInspectionEvent]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/log-inspection', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
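    # The _request_auth override documented above replaces the spec's auth
    # settings for one call. The dict shape below mirrors the generated
    # ApiClient's auth_settings entries and is an assumption, as are the
    # credential values:
    #
    #   events, status, headers = api.api_v1_audits_runtime_log_inspection_get_with_http_info(
    #       _request_auth={'in': 'header',
    #                      'key': 'Authorization',
    #                      'value': 'Basic <base64-credentials>'})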
def api_v1_audits_runtime_serverless_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_download_get # noqa: E501
DownloadServerlessRuntimeAudits downloads the serverless audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
                         number is provided, it is used as the total
                         request timeout. It can also be a pair (tuple)
                         of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_serverless_download_get_with_http_info(**kwargs) # noqa: E501
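    # A hypothetical serverless-audit export using the method above, combining
    # the function/region/runtime filters; `api` is assumed to be a configured
    # instance and the values are illustrative (see the host download sketch
    # earlier for handling the raw response):
    #
    #   resp = api.api_v1_audits_runtime_serverless_download_get(
    #       function=['billing-handler'],
    #       region=['us-east-1'],
    #       runtime=['python3.9'],
    #       _preload_content=False)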
def api_v1_audits_runtime_serverless_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_download_get # noqa: E501
DownloadServerlessRuntimeAudits downloads the serverless audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the HTTP status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
settings in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
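Example -- hypothetical; this variant is usually reached via the wrapper
above, but can be called directly when the raw response or headers are
needed (parameter values are illustrative):
>>> resp = api.api_v1_audits_runtime_serverless_download_get_with_http_info(
...     cluster=['prod-cluster'],  # hypothetical cluster name
...     _preload_content=False)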
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_serverless_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/runtime/serverless/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_serverless_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_get # noqa: E501
ServerlessRuntimeAudits returns all serverless audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is an optional exact time constraint for the audit.
:type time: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedRuntimeAudit]
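Example -- a minimal sketch with hypothetical filter values (none of
these values refer to real resources):
>>> audits = api.api_v1_audits_runtime_serverless_get(
...     limit=50,
...     effect=['block'],
...     attack_techniques=['T1059'])  # illustrative MITRE technique ID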
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_serverless_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_serverless_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_get # noqa: E501
ServerlessRuntimeAudits returns all serverless audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is an optional exact time constraint for the audit.
:type time: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the HTTP status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
settings in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedRuntimeAudit], status_code(int), headers(HTTPHeaderDict))
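Example -- hypothetical; unpacking the (data, status_code, headers)
tuple returned by the synchronous call:
>>> audits, status, headers = api.api_v1_audits_runtime_serverless_get_with_http_info(
...     limit=10)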
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'rule_name',
'type',
'effect',
'function',
'region',
'runtime',
'attack_techniques',
'request_id',
'msg',
'attack_type',
'aggregate'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_serverless_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedRuntimeAudit]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/serverless', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_runtime_serverless_timeslice_get(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_timeslice_get # noqa: E501
ServerlessRuntimeAuditTimeslice returns serverless runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_timeslice_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[TypesAuditTimeslice]
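Example -- a hypothetical one-day window split into hourly buckets
(values are illustrative):
>>> from datetime import datetime
>>> slices = api.api_v1_audits_runtime_serverless_timeslice_get(
...     _from=datetime(2021, 1, 1),
...     to=datetime(2021, 1, 2),
...     buckets=24)  # one bucket per hour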
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_runtime_serverless_timeslice_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_runtime_serverless_timeslice_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_runtime_serverless_timeslice_get # noqa: E501
ServerlessRuntimeAuditTimeslice returns serverless runtime audit buckets according to the query timeframe # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_runtime_serverless_timeslice_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param profile_id: ProfileIDs are the profile IDs to filter.
:type profile_id: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param time: Time is used to filter by audit time.
:type time: datetime
:param image_name: ImageNames is the image name filter.
:type image_name: list[str]
:param container: Containers is the container name filter.
:type container: list[str]
:param container_id: ContainerID is used to filter by container ID.
:type container_id: list[str]
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param type: Types is used to filter by runtime audit type.
:type type: list[str]
:param effect: Effect is used to filter by runtime audit effect (e.g., block/alert).
:type effect: list[str]
:param user: Users is used to filter by host users.
:type user: list[str]
:param os: OS is the image OS distro filter.
:type os: list[str]
:param namespace: Namespaces is the namespaces filter.
:type namespace: list[str]
:param cluster: Clusters is the cluster filter.
:type cluster: list[str]
:param attack_type: AttackTypes is used to filter by runtime audit attack type.
:type attack_type: list[str]
:param hostname: Hostname is the hostname filter.
:type hostname: list[str]
:param msg: Message is the audit message text filter.
:type msg: list[str]
:param interactive: Interactive is the audit interactive filter.
:type interactive: list[str]
:param function: Function is used to filter by function name.
:type function: list[str]
:param region: Region is used to filter by region.
:type region: list[str]
:param runtime: Runtime is used to filter by runtime.
:type runtime: list[str]
:param attack_techniques: AttackTechniques are the MITRE attack techniques.
:type attack_techniques: list[str]
:param app: App is the name constraint of the service that triggered the audit.
:type app: list[str]
:param process_path: ProcessPath is the path constraint of the process that triggered the audit.
:type process_path: list[str]
:param request_id: RequestID is used to filter by request ID.
:type request_id: list[str]
:param function_id: FunctionID is used to filter by function ID.
:type function_id: list[str]
:param aggregate: Aggregate indicates whether the result audits should be aggregated according to the Select field.
:type aggregate: bool
:param buckets: Buckets is the number of buckets to return.
:type buckets: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the HTTP status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
settings in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[TypesAuditTimeslice], status_code(int), headers(HTTPHeaderDict))
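Example -- hypothetical; the with_http_info variant also exposes the
status code and headers:
>>> slices, status, headers = api.api_v1_audits_runtime_serverless_timeslice_get_with_http_info(
...     buckets=12)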
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'profile_id',
'_from',
'to',
'time',
'image_name',
'container',
'container_id',
'rule_name',
'type',
'effect',
'user',
'os',
'namespace',
'cluster',
'attack_type',
'hostname',
'msg',
'interactive',
'function',
'region',
'runtime',
'attack_techniques',
'app',
'process_path',
'request_id',
'function_id',
'aggregate',
'buckets'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_runtime_serverless_timeslice_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if 'profile_id' in local_var_params and local_var_params['profile_id'] is not None: # noqa: E501
query_params.append(('profileID', local_var_params['profile_id'])) # noqa: E501
collection_formats['profileID'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'time' in local_var_params and local_var_params['time'] is not None: # noqa: E501
query_params.append(('time', local_var_params['time'])) # noqa: E501
if 'image_name' in local_var_params and local_var_params['image_name'] is not None: # noqa: E501
query_params.append(('imageName', local_var_params['image_name'])) # noqa: E501
collection_formats['imageName'] = 'multi' # noqa: E501
if 'container' in local_var_params and local_var_params['container'] is not None: # noqa: E501
query_params.append(('container', local_var_params['container'])) # noqa: E501
collection_formats['container'] = 'multi' # noqa: E501
if 'container_id' in local_var_params and local_var_params['container_id'] is not None: # noqa: E501
query_params.append(('containerID', local_var_params['container_id'])) # noqa: E501
collection_formats['containerID'] = 'multi' # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'type' in local_var_params and local_var_params['type'] is not None: # noqa: E501
query_params.append(('type', local_var_params['type'])) # noqa: E501
collection_formats['type'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'user' in local_var_params and local_var_params['user'] is not None: # noqa: E501
query_params.append(('user', local_var_params['user'])) # noqa: E501
collection_formats['user'] = 'multi' # noqa: E501
if 'os' in local_var_params and local_var_params['os'] is not None: # noqa: E501
query_params.append(('os', local_var_params['os'])) # noqa: E501
collection_formats['os'] = 'multi' # noqa: E501
if 'namespace' in local_var_params and local_var_params['namespace'] is not None: # noqa: E501
query_params.append(('namespace', local_var_params['namespace'])) # noqa: E501
collection_formats['namespace'] = 'multi' # noqa: E501
if 'cluster' in local_var_params and local_var_params['cluster'] is not None: # noqa: E501
query_params.append(('cluster', local_var_params['cluster'])) # noqa: E501
collection_formats['cluster'] = 'multi' # noqa: E501
if 'attack_type' in local_var_params and local_var_params['attack_type'] is not None: # noqa: E501
query_params.append(('attackType', local_var_params['attack_type'])) # noqa: E501
collection_formats['attackType'] = 'multi' # noqa: E501
if 'hostname' in local_var_params and local_var_params['hostname'] is not None: # noqa: E501
query_params.append(('hostname', local_var_params['hostname'])) # noqa: E501
collection_formats['hostname'] = 'multi' # noqa: E501
if 'msg' in local_var_params and local_var_params['msg'] is not None: # noqa: E501
query_params.append(('msg', local_var_params['msg'])) # noqa: E501
collection_formats['msg'] = 'multi' # noqa: E501
if 'interactive' in local_var_params and local_var_params['interactive'] is not None: # noqa: E501
query_params.append(('interactive', local_var_params['interactive'])) # noqa: E501
collection_formats['interactive'] = 'multi' # noqa: E501
if 'function' in local_var_params and local_var_params['function'] is not None: # noqa: E501
query_params.append(('function', local_var_params['function'])) # noqa: E501
collection_formats['function'] = 'multi' # noqa: E501
if 'region' in local_var_params and local_var_params['region'] is not None: # noqa: E501
query_params.append(('region', local_var_params['region'])) # noqa: E501
collection_formats['region'] = 'multi' # noqa: E501
if 'runtime' in local_var_params and local_var_params['runtime'] is not None: # noqa: E501
query_params.append(('runtime', local_var_params['runtime'])) # noqa: E501
collection_formats['runtime'] = 'multi' # noqa: E501
if 'attack_techniques' in local_var_params and local_var_params['attack_techniques'] is not None: # noqa: E501
query_params.append(('attackTechniques', local_var_params['attack_techniques'])) # noqa: E501
collection_formats['attackTechniques'] = 'multi' # noqa: E501
if 'app' in local_var_params and local_var_params['app'] is not None: # noqa: E501
query_params.append(('app', local_var_params['app'])) # noqa: E501
collection_formats['app'] = 'multi' # noqa: E501
if 'process_path' in local_var_params and local_var_params['process_path'] is not None: # noqa: E501
query_params.append(('processPath', local_var_params['process_path'])) # noqa: E501
collection_formats['processPath'] = 'multi' # noqa: E501
if 'request_id' in local_var_params and local_var_params['request_id'] is not None: # noqa: E501
query_params.append(('requestID', local_var_params['request_id'])) # noqa: E501
collection_formats['requestID'] = 'multi' # noqa: E501
if 'function_id' in local_var_params and local_var_params['function_id'] is not None: # noqa: E501
query_params.append(('functionID', local_var_params['function_id'])) # noqa: E501
collection_formats['functionID'] = 'multi' # noqa: E501
if 'aggregate' in local_var_params and local_var_params['aggregate'] is not None: # noqa: E501
query_params.append(('aggregate', local_var_params['aggregate'])) # noqa: E501
if 'buckets' in local_var_params and local_var_params['buckets'] is not None: # noqa: E501
query_params.append(('buckets', local_var_params['buckets'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[TypesAuditTimeslice]",
}
return self.api_client.call_api(
'/api/v1/audits/runtime/serverless/timeslice', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_trust_download_get(self, **kwargs): # noqa: E501
"""api_v1_audits_trust_download_get # noqa: E501
DownloadTrustAudits downloads the trust audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_trust_download_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param id: IDs is used to filter by registry/repo.
:type id: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
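Example -- a hypothetical sketch (filter values are illustrative):
>>> raw = api.api_v1_audits_trust_download_get(
...     effect=['block'],
...     _preload_content=False)  # raw urllib3 response for the download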
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_trust_download_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_trust_download_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_trust_download_get # noqa: E501
DownloadTrustAudits downloads the trust audits according to the specified query # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_trust_download_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param id: IDs is used to filter by registry/repo.
:type id: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the HTTP status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
settings in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
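Example -- hypothetical direct call when the raw response or headers are
needed:
>>> resp = api.api_v1_audits_trust_download_get_with_http_info(
...     rule_name=['example-trust-rule'],  # hypothetical rule name
...     _preload_content=False)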
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'rule_name',
'effect',
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_trust_download_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'id' in local_var_params and local_var_params['id'] is not None: # noqa: E501
query_params.append(('_id', local_var_params['id'])) # noqa: E501
collection_formats['_id'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/api/v1/audits/trust/download', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def api_v1_audits_trust_get(self, **kwargs): # noqa: E501
"""api_v1_audits_trust_get # noqa: E501
TrustAudits returns all trust audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_trust_get(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param id: IDs is used to filter by registry/repo.
:type id: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[SharedTrustAudits]
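Example -- hypothetical registry/repo IDs, illustrative only:
>>> audits = api.api_v1_audits_trust_get(
...     effect=['block'],
...     id=['registry.example.com/myrepo'])  # hypothetical registry ID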
"""
kwargs['_return_http_data_only'] = True
return self.api_v1_audits_trust_get_with_http_info(**kwargs) # noqa: E501
def api_v1_audits_trust_get_with_http_info(self, **kwargs): # noqa: E501
"""api_v1_audits_trust_get # noqa: E501
TrustAudits returns all trust audits according to the query specification # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api_v1_audits_trust_get_with_http_info(async_req=True)
>>> result = thread.get()
:param offset: Offset from the start of the list from which to retrieve documents.
:type offset: int
:param limit: Number of documents to return.
:type limit: int
:param search: Search term.
:type search: str
:param sort: Key on which to sort.
:type sort: str
:param reverse: Sort order.
:type reverse: bool
:param collections: Scopes the query by collection.
:type collections: list[str]
:param account_ids: Scopes the query by account ID.
:type account_ids: list[str]
:param fields: List of fields to retrieve.
:type fields: list[str]
:param _from: From is an optional minimum time constraint for the audit.
:type _from: datetime
:param to: To is an optional maximum time constraint for the audit.
:type to: datetime
:param rule_name: RuleNames is used to filter by rule name.
:type rule_name: list[str]
:param effect: Effect is used to filter by runtime audit effect (block/alert).
:type effect: list[str]
:param id: IDs is used to filter by registry/repo.
:type id: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: Return the response data only, without
the HTTP status code and headers.
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: Timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple)
of (connection, read) timeouts.
:param _request_auth: Set to override the auth_settings for a single
request; this effectively ignores the authentication
settings in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[SharedTrustAudits], status_code(int), headers(HTTPHeaderDict))
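Example -- hypothetical; returns the full (data, status_code, headers)
tuple:
>>> audits, status, headers = api.api_v1_audits_trust_get_with_http_info(limit=25)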
"""
local_var_params = locals()
all_params = [
'offset',
'limit',
'search',
'sort',
'reverse',
'collections',
'account_ids',
'fields',
'_from',
'to',
'rule_name',
'effect',
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method api_v1_audits_trust_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in local_var_params and local_var_params['offset'] is not None: # noqa: E501
query_params.append(('offset', local_var_params['offset'])) # noqa: E501
if 'limit' in local_var_params and local_var_params['limit'] is not None: # noqa: E501
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'reverse' in local_var_params and local_var_params['reverse'] is not None: # noqa: E501
query_params.append(('reverse', local_var_params['reverse'])) # noqa: E501
if 'collections' in local_var_params and local_var_params['collections'] is not None: # noqa: E501
query_params.append(('collections', local_var_params['collections'])) # noqa: E501
collection_formats['collections'] = 'multi' # noqa: E501
if 'account_ids' in local_var_params and local_var_params['account_ids'] is not None: # noqa: E501
query_params.append(('accountIDs', local_var_params['account_ids'])) # noqa: E501
collection_formats['accountIDs'] = 'multi' # noqa: E501
if 'fields' in local_var_params and local_var_params['fields'] is not None: # noqa: E501
query_params.append(('fields', local_var_params['fields'])) # noqa: E501
collection_formats['fields'] = 'multi' # noqa: E501
if '_from' in local_var_params and local_var_params['_from'] is not None: # noqa: E501
query_params.append(('from', local_var_params['_from'])) # noqa: E501
if 'to' in local_var_params and local_var_params['to'] is not None: # noqa: E501
query_params.append(('to', local_var_params['to'])) # noqa: E501
if 'rule_name' in local_var_params and local_var_params['rule_name'] is not None: # noqa: E501
query_params.append(('ruleName', local_var_params['rule_name'])) # noqa: E501
collection_formats['ruleName'] = 'multi' # noqa: E501
if 'effect' in local_var_params and local_var_params['effect'] is not None: # noqa: E501
query_params.append(('effect', local_var_params['effect'])) # noqa: E501
collection_formats['effect'] = 'multi' # noqa: E501
if 'id' in local_var_params and local_var_params['id'] is not None: # noqa: E501
query_params.append(('_id', local_var_params['id'])) # noqa: E501
collection_formats['_id'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[SharedTrustAudits]",
}
return self.api_client.call_api(
'/api/v1/audits/trust', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
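# Minimal usage sketch (illustrative only -- the ApiClient/Configuration
# wiring and the AuditsApi class name are assumptions about the generated
# package, and the host and filter values are made up):
#
#     with ApiClient(Configuration(host="https://console.example.com")) as c:
#         api = AuditsApi(c)
#         audits, status, headers = api.api_v1_audits_trust_get_with_http_info(
#             limit=50, effect=["block"])
#         assert status == 200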
| 53.148551 | 144 | 0.622702 | 92,648 | 757,420 | 4.875054 | 0.004814 | 0.068688 | 0.117353 | 0.039853 | 0.996227 | 0.996004 | 0.995931 | 0.995745 | 0.995601 | 0.995275 | 0 | 0.020344 | 0.290468 | 757,420 | 14,250 | 145 | 53.152281 | 0.820096 | 0.457143 | 0 | 0.920012 | 1 | 0 | 0.191227 | 0.020318 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014305 | false | 0 | 0.000769 | 0 | 0.02938 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
73993936160f9c4f93d338ecb645190abe5e6122 | 32,537 | py | Python | tests/test_python.py | sumanthvrao/pygments | abbf9119b28bda4f93b6104927caa234b0b2a155 | [
"BSD-2-Clause"
] | 1 | 2021-05-13T21:13:19.000Z | 2021-05-13T21:13:19.000Z | tests/test_python.py | sumanthvrao/pygments | abbf9119b28bda4f93b6104927caa234b0b2a155 | [
"BSD-2-Clause"
] | 1 | 2021-07-17T22:46:36.000Z | 2021-07-17T22:46:36.000Z | tests/test_python.py | sumanthvrao/pygments | abbf9119b28bda4f93b6104927caa234b0b2a155 | [
"BSD-2-Clause"
] | 1 | 2021-07-15T20:08:44.000Z | 2021-07-15T20:08:44.000Z | # -*- coding: utf-8 -*-
"""
Python Tests
~~~~~~~~~~~~
:copyright: Copyright 2006-2020 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
import pytest
from pygments.lexers import PythonLexer, Python3Lexer
from pygments.token import Token
import re
@pytest.fixture(scope='module')
def lexer2():
yield PythonLexer()
@pytest.fixture(scope='module')
def lexer3():
yield Python3Lexer()
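# For orientation: get_tokens() yields (token_type, value) pairs and always
# ends the stream with a trailing newline token, so (roughly):
#     list(PythonLexer().get_tokens('x')) == [(Token.Name, 'x'), (Token.Text, '\n')]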
def test_cls_builtin(lexer2):
"""
Tests that a cls token gets interpreted as a Token.Name.Builtin.Pseudo
"""
fragment = 'class TestClass():\n @classmethod\n def hello(cls):\n pass\n'
tokens = [
(Token.Keyword, 'class'),
(Token.Text, ' '),
(Token.Name.Class, 'TestClass'),
(Token.Punctuation, '('),
(Token.Punctuation, ')'),
(Token.Punctuation, ':'),
(Token.Text, '\n'),
(Token.Text, ' '),
(Token.Name.Decorator, '@classmethod'),
(Token.Text, '\n'),
(Token.Text, ' '),
(Token.Keyword, 'def'),
(Token.Text, ' '),
(Token.Name.Function, 'hello'),
(Token.Punctuation, '('),
(Token.Name.Builtin.Pseudo, 'cls'),
(Token.Punctuation, ')'),
(Token.Punctuation, ':'),
(Token.Text, '\n'),
(Token.Text, ' '),
(Token.Keyword, 'pass'),
(Token.Text, '\n'),
]
assert list(lexer2.get_tokens(fragment)) == tokens
def test_needs_name(lexer3):
"""
Tests that '@' is recognized as an Operator
"""
fragment = 'S = (H @ beta - r).T @ inv(H @ V @ H.T) @ (H @ beta - r)\n'
tokens = [
(Token.Name, 'S'),
(Token.Text, ' '),
(Token.Operator, '='),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Name, 'H'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Name, 'beta'),
(Token.Text, ' '),
(Token.Operator, '-'),
(Token.Text, ' '),
(Token.Name, 'r'),
(Token.Punctuation, ')'),
(Token.Operator, '.'),
(Token.Name, 'T'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Name, 'inv'),
(Token.Punctuation, '('),
(Token.Name, 'H'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Name, 'V'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Name, 'H'),
(Token.Operator, '.'),
(Token.Name, 'T'),
(Token.Punctuation, ')'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Name, 'H'),
(Token.Text, ' '),
(Token.Operator, '@'),
(Token.Text, ' '),
(Token.Name, 'beta'),
(Token.Text, ' '),
(Token.Operator, '-'),
(Token.Text, ' '),
(Token.Name, 'r'),
(Token.Punctuation, ')'),
(Token.Text, '\n'),
]
assert list(lexer3.get_tokens(fragment)) == tokens
def test_pep_515(lexer3):
"""
Tests that the lexer can parse numeric literals with underscores
"""
fragments = (
(Token.Literal.Number.Integer, '1_000_000'),
(Token.Literal.Number.Float, '1_000.000_001'),
(Token.Literal.Number.Float, '1_000e1_000j'),
(Token.Literal.Number.Hex, '0xCAFE_F00D'),
(Token.Literal.Number.Bin, '0b_0011_1111_0100_1110'),
(Token.Literal.Number.Oct, '0o_777_123'),
)
for token, fragment in fragments:
tokens = [
(token, fragment),
(Token.Text, '\n'),
]
assert list(lexer3.get_tokens(fragment)) == tokens
def test_walrus_operator(lexer3):
"""
Tests that ':=' is recognized as an Operator
"""
fragment = 'if (a := 2) > 4:'
tokens = [
(Token.Keyword, 'if'),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Name, 'a'),
(Token.Text, ' '),
(Token.Operator, ':='),
(Token.Text, ' '),
(Token.Literal.Number.Integer, '2'),
(Token.Punctuation, ')'),
(Token.Text, ' '),
(Token.Operator, '>'),
(Token.Text, ' '),
(Token.Literal.Number.Integer, '4'),
(Token.Punctuation, ':'),
(Token.Text, '\n'),
]
assert list(lexer3.get_tokens(fragment)) == tokens
def test_fstring(lexer3):
"""
Tests that the lexer can parse f-strings
"""
fragments_and_tokens = (
# examples from PEP-0498
(
"f'My name is {name}, my age next year is {age+1}, my anniversary is {anniversary:%A, %B %d, %Y}.'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'My name is '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'name'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ', my age next year is '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'age'),
(Token.Operator, '+'),
(Token.Literal.Number.Integer, '1'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ', my anniversary is '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'anniversary'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Single, '%A, %B %d, %Y'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '.'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'He said his name is {name!r}.'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'He said his name is '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'name'),
(Token.Literal.String.Interpol, '!r}'),
(Token.Literal.String.Single, '.'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'input={value:#06x}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'input='),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'value'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Single, '#06x'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""f'{"quoted string"}'\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Double, 'quoted string'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""f'{f"{inner}"}'\n""", # not in the PEP
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'inner'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
# SyntaxError: f-string expression part cannot include a backslash
"f'{\\'quoted string\\'}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Error, '\\'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'quoted string'),
(Token.Literal.String.Escape, "\\'"),
(Token.Literal.String.Single, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{{ {4*10} }}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Escape, '{{'),
(Token.Literal.String.Single, ' '),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.Number.Integer, '4'),
(Token.Operator, '*'),
(Token.Literal.Number.Integer, '10'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ' '),
(Token.Literal.String.Escape, '}}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{{{4*10}}}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Escape, '{{'),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.Number.Integer, '4'),
(Token.Operator, '*'),
(Token.Literal.Number.Integer, '10'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Escape, '}}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"fr'x={4*10}'\n",
[
(Token.Literal.String.Affix, 'fr'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, "x="),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.Number.Integer, '4'),
(Token.Operator, '*'),
(Token.Literal.Number.Integer, '10'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""f'abc {a["x"]} def'\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'abc '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'a'),
(Token.Punctuation, '['),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Double, 'x'),
(Token.Literal.String.Double, '"'),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ' def'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'''abc {a['x']} def'''\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'''"),
(Token.Literal.String.Single, 'abc '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'a'),
(Token.Punctuation, '['),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'x'),
(Token.Literal.String.Single, "'"),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ' def'),
(Token.Literal.String.Single, "'''"),
(Token.Text, '\n')
]
), (
"""f'''{x
+1}'''\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'''"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'x'),
(Token.Text, '\n'),
(Token.Operator, '+'),
(Token.Literal.Number.Integer, '1'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'''"),
(Token.Text, '\n')
]
), (
"""f'''{d[0
]}'''\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'''"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'd'),
(Token.Punctuation, '['),
(Token.Literal.Number.Integer, '0'),
(Token.Text, '\n'),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'''"),
(Token.Text, '\n')
]
), (
"f'result: {value:{width}.{precision}}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'result: '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'value'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'width'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '.'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'precision'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"'a' 'b' f'{x}' '{c}' f'str<{y:^4}>' 'd' 'e'\n",
[
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'a'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'b'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'x'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{c}'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'str<'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'y'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Single, '^4'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '>'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'd'),
(Token.Literal.String.Single, "'"),
(Token.Text, ' '),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'e'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{i}:{d[i]}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'i'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ':'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'd'),
(Token.Punctuation, '['),
(Token.Name, 'i'),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'x = {x:+3}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, "x = "),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'x'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Single, '+3'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{fn(lst,2)} {fn(lst,3)}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'fn'),
(Token.Punctuation, '('),
(Token.Name, 'lst'),
(Token.Punctuation, ','),
(Token.Literal.Number.Integer, '2'),
(Token.Punctuation, ')'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ' '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'fn'),
(Token.Punctuation, '('),
(Token.Name, 'lst'),
(Token.Punctuation, ','),
(Token.Literal.Number.Integer, '3'),
(Token.Punctuation, ')'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'mapping is { {a:b for (a, b) in ((1, 2), (3, 4))} }'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'mapping is '),
(Token.Literal.String.Interpol, '{'),
(Token.Text, ' '),
(Token.Punctuation, '{'),
(Token.Name, 'a'),
(Token.Punctuation, ':'),
(Token.Name, 'b'),
(Token.Text, ' '),
(Token.Keyword, 'for'),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Name, 'a'),
(Token.Punctuation, ','),
(Token.Text, ' '),
(Token.Name, 'b'),
(Token.Punctuation, ')'),
(Token.Text, ' '),
(Token.Operator.Word, 'in'),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Punctuation, '('),
(Token.Literal.Number.Integer, '1'),
(Token.Punctuation, ','),
(Token.Text, ' '),
(Token.Literal.Number.Integer, '2'),
(Token.Punctuation, ')'),
(Token.Punctuation, ','),
(Token.Text, ' '),
(Token.Punctuation, '('),
(Token.Literal.Number.Integer, '3'),
(Token.Punctuation, ','),
(Token.Text, ' '),
(Token.Literal.Number.Integer, '4'),
(Token.Punctuation, ')'),
(Token.Punctuation, ')'),
(Token.Punctuation, '}'),
(Token.Text, ' '),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""f'a={d["a"]}'\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'a='),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'd'),
(Token.Punctuation, '['),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Double, 'a'),
(Token.Literal.String.Double, '"'),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'a={d[a]}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, 'a='),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'd'),
(Token.Punctuation, '['),
(Token.Name, 'a'),
(Token.Punctuation, ']'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"fr'{header}:\\s+'\n",
[
(Token.Literal.String.Affix, 'fr'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'header'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ':'),
(Token.Literal.String.Single, '\\'),
(Token.Literal.String.Single, 's+'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{a!r}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'a'),
(Token.Literal.String.Interpol, '!r}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'{(lambda x: x*2)(3)}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Punctuation, '('),
(Token.Keyword, 'lambda'),
(Token.Text, ' '),
(Token.Name, 'x'),
(Token.Punctuation, ':'),
(Token.Text, ' '),
(Token.Name, 'x'),
(Token.Operator, '*'),
(Token.Literal.Number.Integer, '2'),
(Token.Punctuation, ')'),
(Token.Punctuation, '('),
(Token.Literal.Number.Integer, '3'),
(Token.Punctuation, ')'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"extra = f'{extra},waiters:{len(self._waiters)}'\n",
[
(Token.Name, 'extra'),
(Token.Text, ' '),
(Token.Operator, '='),
(Token.Text, ' '),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'extra'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, ',waiters:'),
(Token.Literal.String.Interpol, '{'),
(Token.Name.Builtin, 'len'),
(Token.Punctuation, '('),
(Token.Name.Builtin.Pseudo, 'self'),
(Token.Operator, '.'),
(Token.Name, '_waiters'),
(Token.Punctuation, ')'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
'message.append(f" [line {lineno:2d}]")\n',
[
(Token.Name, 'message'),
(Token.Operator, '.'),
(Token.Name, 'append'),
(Token.Punctuation, '('),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Double, ' [line '),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'lineno'),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Double, '2d'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Double, ']'),
(Token.Literal.String.Double, '"'),
(Token.Punctuation, ')'),
(Token.Text, '\n')
]
),
# Examples from https://bugs.python.org/issue36817
(
'f"{foo=}"\n',
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'foo'),
(Token.Literal.String.Interpol, '=}'),
(Token.Literal.String.Double, '"'),
(Token.Text, '\n')
]
), (
"f'{foo=!s}'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'foo'),
(Token.Literal.String.Interpol, '=!s}'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
'f"{math.pi=!f:.2f}"\n',
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'math'),
(Token.Operator, '.'),
(Token.Name, 'pi'),
(Token.Literal.String.Interpol, '=!f:'),
(Token.Literal.String.Double, '.2f'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Double, '"'),
(Token.Text, '\n')
]
), (
'f"{ chr(65) =}"\n',
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '{'),
(Token.Text, ' '),
(Token.Name.Builtin, 'chr'),
(Token.Punctuation, '('),
(Token.Literal.Number.Integer, '65'),
(Token.Punctuation, ')'),
(Token.Text, ' '),
(Token.Literal.String.Interpol, '=}'),
(Token.Literal.String.Double, '"'),
(Token.Text, '\n')
]
), (
'f"{chr(65) = }"\n',
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Interpol, '{'),
(Token.Name.Builtin, 'chr'),
(Token.Punctuation, '('),
(Token.Literal.Number.Integer, '65'),
(Token.Punctuation, ')'),
(Token.Text, ' '),
(Token.Literal.String.Interpol, '= }'),
(Token.Literal.String.Double, '"'),
(Token.Text, '\n')
]
), (
"f'*{n=:30}*'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'n'),
(Token.Literal.String.Interpol, '=:'),
(Token.Literal.String.Single, '30'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"f'*{n=!r:30}*'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'n'),
(Token.Literal.String.Interpol, '=!r:'),
(Token.Literal.String.Single, '30'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""f"*{f'{n=}':30}*"\n""",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Double, '"'),
(Token.Literal.String.Double, '*'),
(Token.Literal.String.Interpol, '{'),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'n'),
(Token.Literal.String.Interpol, '=}'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Interpol, ':'),
(Token.Literal.String.Double, '30'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Double, '*'),
(Token.Literal.String.Double, '"'),
(Token.Text, '\n')
]
), (
"f'*{n=:+<30}*'\n",
[
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'"),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'n'),
(Token.Literal.String.Interpol, '=:'),
(Token.Literal.String.Single, '+<30'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, '*'),
(Token.Literal.String.Single, "'"),
(Token.Text, '\n')
]
), (
"""
f'''{foo
= !s:20}'''\n""",
[
(Token.Text, ' '),
(Token.Literal.String.Affix, 'f'),
(Token.Literal.String.Single, "'''"),
(Token.Literal.String.Interpol, '{'),
(Token.Name, 'foo'),
(Token.Text, '\n '),
(Token.Literal.String.Interpol, '= !s:'),
(Token.Literal.String.Single, '20'),
(Token.Literal.String.Interpol, '}'),
(Token.Literal.String.Single, "'''"),
(Token.Text, '\n')
]
)
)
for fragment, tokens in fragments_and_tokens:
assert list(lexer3.get_tokens(fragment)) == tokens
# Now switch between single and double quotes, to cover both cases equally
rep = {"'":'"', '"':"'"}
pattern = re.compile("|".join(rep.keys()))
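# Each fragment is re-lexed with its quote characters swapped; the expected
# values get the same substitution, and the Single/Double string token types
# are exchanged to match, so both quoting styles are exercised equally.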
for fragment, tokens in fragments_and_tokens:
fragment = pattern.sub(lambda m: rep[m.group(0)], fragment)
tokens = list(tokens)
for i, (token, match) in enumerate(tokens):
if token == Token.Literal.String.Single:
token = Token.Literal.String.Double
elif token == Token.Literal.String.Double:
token = Token.Literal.String.Single
match = pattern.sub(lambda m: rep[m.group(0)], match)
tokens[i] = (token, match)
assert list(lexer3.get_tokens(fragment)) == tokens
| 38.596679 | 114 | 0.421766 | 2,662 | 32,537 | 5.141623 | 0.076634 | 0.293709 | 0.405056 | 0.224447 | 0.872434 | 0.842917 | 0.816322 | 0.785563 | 0.744721 | 0.718638 | 0 | 0.008715 | 0.386391 | 32,537 | 842 | 115 | 38.642518 | 0.676834 | 0.019885 | 0 | 0.713561 | 0 | 0.00507 | 0.063969 | 0.004338 | 0 | 0 | 0 | 0 | 0.007605 | 1 | 0.008872 | false | 0.002535 | 0.00507 | 0 | 0.013942 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
73a7db52e64e95a13c8a201d94940a433651a850 | 12,044 | py | Python | tests/test_models.py | soundsensing/openl3 | b3d924e69841ab48ea5bf5995fefaeeebb26d531 | [
"MIT"
] | null | null | null | tests/test_models.py | soundsensing/openl3 | b3d924e69841ab48ea5bf5995fefaeeebb26d531 | [
"MIT"
] | null | null | null | tests/test_models.py | soundsensing/openl3 | b3d924e69841ab48ea5bf5995fefaeeebb26d531 | [
"MIT"
] | null | null | null | import openl3.core
from openl3.models import (
load_audio_embedding_model, get_audio_embedding_model_path,
load_image_embedding_model, get_image_embedding_model_path)
from openl3.openl3_exceptions import OpenL3Error
import pytest
INPUT_REPR_SIZES = {
'linear': (None, 257, 197, 1),
'mel128': (None, 128, 199, 1),
'mel256': (None, 256, 199, 1),
}
CONTENT_TYPES = ['env', 'music']
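# The parametrize decorators below expand over the full grid:
# 3 input representations x 2 content types = 6 cases per test.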
@pytest.mark.parametrize('input_repr', list(INPUT_REPR_SIZES))
@pytest.mark.parametrize('content_type', CONTENT_TYPES)
def test_get_audio_embedding_model_path(input_repr, content_type):
embedding_model_path = get_audio_embedding_model_path(input_repr, content_type)
assert (
'/'.join(embedding_model_path.split('/')[-2:]) ==
'openl3/openl3_audio_{}_{}.h5'.format(input_repr, content_type))
def test_load_audio_embedding_model():
import kapre
m = load_audio_embedding_model('linear', 'music', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Spectrogram)
assert m.layers[1].output_shape == (None, 257, 197, 1)
assert m.output_shape[1] == 6144
first_model = m
m = load_audio_embedding_model('linear', 'music', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Spectrogram)
assert m.layers[1].output_shape == (None, 257, 197, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('linear', 'env', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Spectrogram)
assert m.layers[1].output_shape == (None, 257, 197, 1)
assert m.output_shape[1] == 6144
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('linear', 'env', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Spectrogram)
assert m.layers[1].output_shape == (None, 257, 197, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel128', 'music', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 128, 199, 1)
assert m.output_shape[1] == 6144
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel128', 'music', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 128, 199, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel128', 'env', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 128, 199, 1)
assert m.output_shape[1] == 6144
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel128', 'env', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 128, 199, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel256', 'music', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 256, 199, 1)
assert m.output_shape[1] == 6144
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel256', 'music', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 256, 199, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel256', 'env', 6144)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 256, 199, 1)
assert m.output_shape[1] == 6144
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
m = load_audio_embedding_model('mel256', 'env', 512)
# assert isinstance(m.layers[1], kapre.time_frequency.Melspectrogram)
assert m.layers[1].output_shape == (None, 256, 199, 1)
assert m.output_shape[1] == 512
# Check model consistency
assert isinstance(m.layers[0], type(first_model.layers[0]))
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers[2:], first_model.layers[2:])])
def _compare_layers(layersA, layersB):
assert len(layersA) == len(layersB)
for la, lb in zip(layersA, layersB):
assert type(la) is type(lb)
assert la.input_shape == lb.input_shape
assert la.output_shape == lb.output_shape
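# _compare_layers checks structure only (layer types and shapes, not
# weights). An illustrative failure it guards against (not executed here):
#     _compare_layers(load_audio_embedding_model('mel128', 'env', 512).layers,
#                     load_audio_embedding_model('mel256', 'env', 512).layers)
#     # -> AssertionError, since the mel128/mel256 input shapes differ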
@pytest.mark.parametrize('input_repr', list(INPUT_REPR_SIZES))
def test_frontend(input_repr):
# check spectrogram input size
m = load_audio_embedding_model(input_repr, 'env', 512, frontend='librosa')
assert m.input_shape == INPUT_REPR_SIZES[input_repr]
m2 = load_audio_embedding_model(input_repr, 'env', 512, frontend='kapre')
assert m2.input_shape == (None, 1, openl3.core.TARGET_SR)
# apart from the input/frontend layers, both models must match layer-for-layer
_compare_layers(m.layers[1:], m2.layers[2:])
with pytest.raises(OpenL3Error):
load_audio_embedding_model(input_repr, 'env', 512, frontend='not-a-thing')
def test_validate_audio_frontend():
input_repr = 'mel128'
# test kapre
mk = load_audio_embedding_model(input_repr, 'env', 512, frontend='kapre')
assert len(mk.input_shape) == 3
# assert openl3.models._validate_audio_frontend('infer', input_repr, mk) == ('kapre', input_repr)
assert openl3.models._validate_audio_frontend('kapre', input_repr, mk) == ('kapre', input_repr)
# test librosa frontend validation
ml = load_audio_embedding_model(input_repr, 'env', 512, frontend='librosa')
assert len(ml.input_shape) == 4
# assert openl3.models._validate_audio_frontend('infer', input_repr, ml) == ('librosa', input_repr)
assert openl3.models._validate_audio_frontend('librosa', input_repr, ml) == ('librosa', input_repr)
# test frontend + no input_repr
assert openl3.models._validate_audio_frontend('kapre', None, mk) == ('kapre', 'mel256')
with pytest.raises(OpenL3Error):
openl3.models._validate_audio_frontend('librosa', None, ml)
# test mismatched frontend/model
with pytest.raises(OpenL3Error):
openl3.models._validate_audio_frontend('librosa', None, mk)
with pytest.raises(OpenL3Error):
openl3.models._validate_audio_frontend('kapre', None, ml)
@pytest.mark.parametrize('input_repr', list(INPUT_REPR_SIZES))
@pytest.mark.parametrize('content_type', CONTENT_TYPES)
def test_get_image_embedding_model_path(input_repr, content_type):
embedding_model_path = get_image_embedding_model_path(input_repr, content_type)
assert (
'/'.join(embedding_model_path.split('/')[-2:]) ==
'openl3/openl3_image_{}_{}.h5'.format(input_repr, content_type))
def test_load_image_embedding_model():
m = load_image_embedding_model('linear', 'music', 8192)
assert m.output_shape[1] == 8192
first_model = m
m = load_image_embedding_model('linear', 'music', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('linear', 'env', 8192)
assert m.output_shape[1] == 8192
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('linear', 'env', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel128', 'music', 8192)
assert m.output_shape[1] == 8192
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel128', 'music', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel128', 'env', 8192)
assert m.output_shape[1] == 8192
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel128', 'env', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel256', 'music', 8192)
assert m.output_shape[1] == 8192
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel256', 'music', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel256', 'env', 8192)
assert m.output_shape[1] == 8192
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
m = load_image_embedding_model('mel256', 'env', 512)
assert m.output_shape[1] == 512
assert len(m.layers) == len(first_model.layers)
assert all([isinstance(l1, type(l2))
for (l1, l2) in zip(m.layers, first_model.layers)])
| 43.480144 | 103 | 0.674278 | 1,723 | 12,044 | 4.518862 | 0.053976 | 0.071924 | 0.113023 | 0.055484 | 0.893013 | 0.88338 | 0.873748 | 0.856666 | 0.846391 | 0.786797 | 0 | 0.056877 | 0.179592 | 12,044 | 276 | 104 | 43.637681 | 0.7311 | 0.118399 | 0 | 0.646465 | 0 | 0 | 0.04676 | 0.00529 | 0 | 0 | 0 | 0 | 0.525253 | 1 | 0.035354 | false | 0 | 0.025253 | 0 | 0.060606 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73be31a3b9547699e262e69a719a20926f08c34e | 8,614 | py | Python | tests/integration/test_api.py | neuro-inc/platform-reports | 161c18733370235af0b63a772de49343e956c35c | [
"Apache-2.0"
] | null | null | null | tests/integration/test_api.py | neuro-inc/platform-reports | 161c18733370235af0b63a772de49343e956c35c | [
"Apache-2.0"
] | 9 | 2021-12-23T03:10:40.000Z | 2022-03-31T03:15:52.000Z | tests/integration/test_api.py | neuro-inc/platform-reports | 161c18733370235af0b63a772de49343e956c35c | [
"Apache-2.0"
] | null | null | null | from __future__ import annotations
import re
import time
from collections.abc import Callable
from contextlib import AbstractAsyncContextManager
from dataclasses import replace
import aiohttp
from aiohttp.web import HTTPForbidden, HTTPOk
from yarl import URL
from platform_reports.config import MetricsConfig
from platform_reports.kube_client import Node
class TestMetrics:
async def test_ping(
self, client: aiohttp.ClientSession, metrics_server: URL
) -> None:
async with client.get(metrics_server / "ping") as response:
assert response.status == HTTPOk.status_code
async def test_node_metrics(
self, client: aiohttp.ClientSession, metrics_server: URL, kube_node: Node
) -> None:
async with client.get(metrics_server / "metrics") as response:
text = await response.text()
assert response.status == HTTPOk.status_code, text
assert (
text
== f"""\
# HELP kube_node_price_total The total price of the node.
# TYPE kube_node_price_total counter
kube_node_price_total{{node="{kube_node.metadata.name}",currency="USD"}} 0.00"""
)
async def test_node_and_pod_metrics(
self,
client: aiohttp.ClientSession,
metrics_server_factory: Callable[
[MetricsConfig], AbstractAsyncContextManager[URL]
],
metrics_config: MetricsConfig,
kube_node: Node,
) -> None:
metrics_config = replace(metrics_config, job_label="")
async with metrics_server_factory(metrics_config) as server:
async with client.get(server / "metrics") as response:
text = await response.text()
assert response.status == HTTPOk.status_code, text
assert re.search(
rf"""# HELP kube_node_price_total The total price of the node\.
\# TYPE kube_node_price_total counter
kube_node_price_total{{node="{kube_node.metadata.name}",currency="USD"}} 0\.00
\# HELP kube_pod_credits_total The total credits of the pod\.
\# TYPE kube_pod_credits_total counter
(kube_pod_credits_total{{pod=".+"}} 10\s*)+""",
text,
), text
class TestPrometheusProxy:
async def test_ping(
self, client: aiohttp.ClientSession, prometheus_proxy_server: URL
) -> None:
async with client.get(prometheus_proxy_server / "api/v1/ping") as response:
assert response.status == HTTPOk.status_code
async def test_query(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
(prometheus_proxy_server / "api/v1/query").with_query(
query='node_cpu_seconds_total{job="node-exporter"}'
),
cookies={"dat": cluster_admin_token},
) as response:
assert response.status == HTTPOk.status_code
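# Each *_forbidden variant below replays the same request with a
# non-admin user's token and expects 403 from the proxy's access checks.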
async def test_query_forbidden(
self,
client: aiohttp.ClientSession,
regular_user_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
(prometheus_proxy_server / "api/v1/query").with_query(
query='node_cpu_seconds_total{job="node-exporter"}'
),
cookies={"dat": regular_user_token},
) as response:
assert response.status == HTTPForbidden.status_code
async def test_query_range(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
prometheus_proxy_server: URL,
) -> None:
now = int(time.time())
async with client.get(
(prometheus_proxy_server / "api/v1/query_range").with_query(
query='node_cpu_seconds_total{job="node-exporter"}',
step=5,
start=now - 60,
end=now,
),
cookies={"dat": cluster_admin_token},
) as response:
assert response.status == HTTPOk.status_code
async def test_query_forbidden_range(
self,
client: aiohttp.ClientSession,
regular_user_token: str,
prometheus_proxy_server: URL,
) -> None:
now = int(time.time())
async with client.get(
(prometheus_proxy_server / "api/v1/query_range").with_query(
query='node_cpu_seconds_total{job="node-exporter"}',
step=5,
start=now - 60,
end=now,
),
cookies={"dat": regular_user_token},
) as response:
assert response.status == HTTPForbidden.status_code
async def test_series(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
(prometheus_proxy_server / "api/v1/series").with_query(
[("match[]", 'node_cpu_seconds_total{job="node-exporter"}')]
),
cookies={"dat": cluster_admin_token},
) as response:
assert response.status == HTTPOk.status_code
async def test_series_forbidden(
self,
client: aiohttp.ClientSession,
regular_user_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
(prometheus_proxy_server / "api/v1/series").with_query(
[("match[]", 'node_cpu_seconds_total{job="node-exporter"}')]
),
cookies={"dat": regular_user_token},
) as response:
assert response.status == HTTPForbidden.status_code
async def test_label_values(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
prometheus_proxy_server / "api/v1/label/job/values",
cookies={"dat": cluster_admin_token},
) as response:
assert response.status == HTTPOk.status_code
async def test_label_values_forbidden(
self,
client: aiohttp.ClientSession,
regular_user_token: str,
prometheus_proxy_server: URL,
) -> None:
async with client.get(
prometheus_proxy_server / "api/v1/label/job/values",
cookies={"dat": regular_user_token},
) as response:
assert response.status == HTTPForbidden.status_code
class TestGrafanaProxy:
async def test_ping(
self, client: aiohttp.ClientSession, grafana_proxy_server: URL
) -> None:
async with client.get(grafana_proxy_server / "ping") as response:
assert response.status == HTTPOk.status_code
async def test_ping_includes_version(
self, client: aiohttp.ClientSession, grafana_proxy_server: URL
) -> None:
async with client.get(grafana_proxy_server / "ping") as response:
assert response.status == HTTPOk.status_code
assert "platform-reports" in response.headers["X-Service-Version"]
async def test_main(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
grafana_proxy_server: URL,
) -> None:
async with client.get(
grafana_proxy_server, cookies={"dat": cluster_admin_token}
) as response:
assert response.status == HTTPOk.status_code
async def test_main_forbidden(
self,
client: aiohttp.ClientSession,
other_cluster_user_token: str,
grafana_proxy_server: URL,
) -> None:
async with client.get(
grafana_proxy_server, cookies={"dat": other_cluster_user_token}
) as response:
assert response.status == HTTPForbidden.status_code
async def test_dashboard(
self,
client: aiohttp.ClientSession,
cluster_admin_token: str,
grafana_proxy_server: URL,
) -> None:
async with client.get(
grafana_proxy_server / "api/dashboards/uid/nodes",
cookies={"dat": cluster_admin_token},
) as response:
assert response.status == HTTPOk.status_code
async def test_dashboard_forbidden(
self,
client: aiohttp.ClientSession,
regular_user_token: str,
grafana_proxy_server: URL,
) -> None:
async with client.get(
grafana_proxy_server / "api/dashboards/uid/nodes",
cookies={"dat": regular_user_token},
) as response:
assert response.status == HTTPForbidden.status_code
| 34.874494 | 83 | 0.62085 | 944 | 8,614 | 5.412076 | 0.119703 | 0.064592 | 0.042278 | 0.105696 | 0.833235 | 0.823645 | 0.816794 | 0.777843 | 0.756704 | 0.7477 | 0 | 0.003745 | 0.286975 | 8,614 | 246 | 84 | 35.01626 | 0.828069 | 0 | 0 | 0.714932 | 0 | 0 | 0.122243 | 0.076503 | 0 | 0 | 0 | 0 | 0.095023 | 1 | 0 | false | 0 | 0.049774 | 0 | 0.063348 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73d804165c47256a883c7e7aa9c006f9fb120119 | 68,624 | py | Python | benchmarks/SimResults/combinations_spec_mylocality/oldstuff/cmp_bwavesgccmcfhmmer/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/combinations_spec_mylocality/oldstuff/cmp_bwavesgccmcfhmmer/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/combinations_spec_mylocality/oldstuff/cmp_bwavesgccmcfhmmer/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
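# 'Core' is a list with one entry per simulated core; each entry maps a
# slash-delimited component path (e.g. 'Execution Unit/Complex ALUs/Area')
# to an area or power figure in what appears to be McPAT-style output.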
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0263568,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.223391,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.141179,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.616118,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 1.06689,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.611893,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 2.2949,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.587362,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 6.30225,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0266717,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0223348,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.171423,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.165179,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.198095,
'Execution Unit/Register Files/Runtime Dynamic': 0.187514,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.421449,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 1.3494,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 4.46059,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00151466,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00151466,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00131452,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000506275,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00237281,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00671666,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0146921,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.158791,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 6.43323,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.365066,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.539326,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.96874,
'Instruction Fetch Unit/Runtime Dynamic': 1.08459,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0719882,
'L2/Runtime Dynamic': 0.0295821,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 8.14272,
'Load Store Unit/Data Cache/Runtime Dynamic': 3.35801,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.223412,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.223412,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 9.20201,
'Load Store Unit/Runtime Dynamic': 4.68321,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.550897,
'Load Store Unit/StoreQ/Runtime Dynamic': 1.10179,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.195515,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.19656,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.399995,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0599551,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.896502,
'Memory Management Unit/Runtime Dynamic': 0.256515,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 30.0032,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0930512,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0326246,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.332908,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.458584,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 10.9731,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0208578,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.219071,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.111721,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.227523,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.366986,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.185242,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.77975,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.243091,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.496,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0211066,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00954332,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0768564,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0705787,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0979629,
'Execution Unit/Register Files/Runtime Dynamic': 0.080122,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.16713,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.514509,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.99883,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000681844,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000681844,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00060596,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000241181,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00101387,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00298352,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00610604,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0678491,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.31578,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.155542,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.230446,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.74375,
'Instruction Fetch Unit/Runtime Dynamic': 0.462927,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0370959,
'L2/Runtime Dynamic': 0.0124134,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 4.09971,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.39211,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0926115,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0926115,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 4.53704,
'Load Store Unit/Runtime Dynamic': 1.94145,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.228364,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.456728,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0810471,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.081597,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.26834,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0255203,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.563673,
'Memory Management Unit/Runtime Dynamic': 0.107117,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 19.967,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0555222,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0109409,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.119111,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.185574,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.70831,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.00933259,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.210019,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0499859,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.113471,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.183024,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0923842,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.388879,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.122114,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.15004,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.00944341,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00475946,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0379276,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0351991,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.047371,
'Execution Unit/Register Files/Runtime Dynamic': 0.0399586,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0822365,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.22339,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.26762,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000899747,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000899747,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000805749,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000323989,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000505639,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00311088,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00783817,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0338378,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 2.15238,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0915341,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.114929,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.47535,
'Instruction Fetch Unit/Runtime Dynamic': 0.25125,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0354928,
'L2/Runtime Dynamic': 0.00721276,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.49028,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.611852,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0405426,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0405427,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.68173,
'Load Store Unit/Runtime Dynamic': 0.852337,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0999712,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.199943,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0354801,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0359669,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.133827,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0151431,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.350884,
'Memory Management Unit/Runtime Dynamic': 0.0511099,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 15.283,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0248419,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0054218,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0582551,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0885188,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.51805,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.00146616,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.203841,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.00785323,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.130534,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.210547,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.106277,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.447357,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.148089,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.12646,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.00148364,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00547518,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.040144,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0404923,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0416277,
'Execution Unit/Register Files/Runtime Dynamic': 0.0459675,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0849388,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.233495,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.33604,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00133998,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00133998,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00120103,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000483482,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000581676,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00446266,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0116362,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0389263,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 2.47605,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.11914,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.132211,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.81473,
'Instruction Fetch Unit/Runtime Dynamic': 0.306376,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0326021,
'L2/Runtime Dynamic': 0.00762951,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.58039,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.656816,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0434581,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0434581,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.78561,
'Load Store Unit/Runtime Dynamic': 0.914594,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.10716,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.21432,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0380315,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.038449,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.153952,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0197448,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.375392,
'Memory Management Unit/Runtime Dynamic': 0.0581937,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 15.7243,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.00390292,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00593683,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0672433,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.077083,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.69991,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 6.567093064106936,
'Runtime Dynamic': 6.567093064106936,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.266036,
'Runtime Dynamic': 0.112859,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 81.2435,
'Peak Power': 114.356,
'Runtime Dynamic': 21.0122,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 80.9775,
'Total Cores/Runtime Dynamic': 20.8994,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.266036,
'Total L3s/Runtime Dynamic': 0.112859,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
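The dictionary above is flat: the component hierarchy (core -> unit -> subunit -> metric) is encoded in '/'-separated keys, with per-core entries followed by DRAM, L3, and Processor totals. Below is a minimal sketch, not part of the original dump, of one way to nest such a dictionary back into a component tree. The helper name to_tree and the tiny core_stats excerpt (values copied from the last per-core entry above) are illustrative assumptions only; units are as reported by McPAT-style tools, typically W for power and mm^2 for area.

# Minimal sketch (assumed helper, not from the original data): rebuild the
# component tree from '/'-separated keys of a McPAT-style flat dictionary.

core_stats = {
    'Peak Dynamic': 15.7243,                                   # core total
    'Runtime Dynamic': 2.69991,
    'L2/Runtime Dynamic': 0.00762951,
    'Load Store Unit/Runtime Dynamic': 0.914594,
    'Load Store Unit/Data Cache/Runtime Dynamic': 0.656816,
}

def to_tree(flat):
    """Nest '/'-separated keys into dicts; the final path segment is the metric name."""
    tree = {}
    for key, value in flat.items():
        *path, metric = key.split('/')
        node = tree
        for part in path:
            node = node.setdefault(part, {})
        node[metric] = value
    return tree

tree = to_tree(core_stats)
print(tree['Load Store Unit']['Data Cache']['Runtime Dynamic'])  # -> 0.656816

# Sanity check visible in the 'Processor' block of the dump:
# Total Leakage = Gate Leakage + Subthreshold Leakage (1.53485 + 31.5774 = 33.1122)
# Peak Power    = Peak Dynamic + Total Leakage       (81.2435 + 33.1122 = 114.356)
assert abs((81.2435 + 33.1122) - 114.356) < 1e-3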