**test/dm_test.py** — genouest/biomaj2galaxy @ 8c76f3cc96902d9401a03e7b1a6cd8f4a7ba17bd (MIT), Python, 14,380 bytes, blob 439ee8e4d8aea7439dc055e48dde6a07a20ded16; 1 star (2015-05-11), 5 issues (2019-04-15 to 2020-11-24), 3 forks (2015-06-14 to 2020-10-16)

import logging
import os
import time
import unittest

from biomaj2galaxy.cli import biomaj2galaxy
from click.testing import CliRunner

from . import gi


class DmTest(unittest.TestCase):

    def test_add_dbkey(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys

    def test_add_bowtie2(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar'], catch_exceptions=False)
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        assert [new_dbkey, new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys

    def test_add_genome(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', '-g', '/data/sample.fasta'], catch_exceptions=False)
        fasta = self.gi.tool_data.show_data_table('all_fasta')
        fasta = fasta['fields']
        assert [new_dbkey, new_dbkey, new_dbkey_name, '/galaxy-central/tool-data/test_dbkey/seq/test_dbkey.fa'] in fasta
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, '/galaxy-central/tool-data/test_dbkey/len/test_dbkey.len'] in dbkeys

    def test_add_bowtie2_multiple(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar', 'bowtie2:/some/other_path/foo/bar', 'bowtie2:/some/really/other/path/foo/bar:With a cool name'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] in bowtie2
        for u in uuids:
            assert u.startswith(new_dbkey)

    def test_add_mess(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar', 'bowtie2:/some/other_path/foo/bar', 'bowtie2:/some/really/other/path/foo/bar:With a cool name', 'blastdb:/foo/really/other/path/foo/bar:With a cool name too!', 'blastdb:/foo/really/other/xxxx/bar:Wisuith a cool name too!', 'bwa:/foo/really/foo/bar:With a cool name too bwa!', 'twobit:/foo/really/foo/bar:Wblablatoo!', 'star:/foo/bloup/test/faa/bor:Wixxxtoo!', 'fasta:/fxx/bloup/txx/faa/bor:Wixxx fasta too!'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] in bowtie2
        for u in uuids:
            assert u.startswith(new_dbkey) and u != new_dbkey
        blastdb = self.gi.tool_data.show_data_table('blastdb')
        blastdb = blastdb['fields']
        uuids = [x[0] for x in blastdb]
        blastdb = [x[1:] for x in blastdb]
        assert ['With a cool name too!', '/foo/really/other/path/foo/bar'] in blastdb
        assert ["Wisuith a cool name too!", '/foo/really/other/xxxx/bar'] in blastdb
        for u in uuids:
            assert u.startswith(new_dbkey) and u != new_dbkey
        bwa = self.gi.tool_data.show_data_table('bwa_indexes')
        bwa = bwa['fields']
        assert [new_dbkey, new_dbkey, 'With a cool name too bwa!', '/foo/really/foo/bar'] in bwa
        twobit = self.gi.tool_data.show_data_table('twobit')
        twobit = twobit['fields']
        assert [new_dbkey, '/foo/really/foo/bar'] in twobit
        star = self.gi.tool_data.show_data_table('rnastar_index2x_versioned')
        star = star['fields']
        assert [new_dbkey, new_dbkey, 'Wixxxtoo!', '/foo/bloup/test/faa/bor', '0', '0'] in star
        fasta = self.gi.tool_data.show_data_table('all_fasta')
        fasta = fasta['fields']
        assert [new_dbkey, new_dbkey, 'Wixxx fasta too!', '/fxx/bloup/txx/faa/bor'] in fasta

    def test_add_star_gtf(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'star:/foo/bloup/test/faa/bor:Wixxxtoo!', '--star-with-gtf'], catch_exceptions=False)
        star = self.gi.tool_data.show_data_table('rnastar_index2x_versioned')
        star = star['fields']
        assert [new_dbkey, new_dbkey, 'Wixxxtoo!', '/foo/bloup/test/faa/bor', '1', '0'] in star

    def test_add_star_gtf_version(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'star:/foo/bloup/test/faa/bor:Wixxxtoo!', '--star-with-gtf', '--star-version', '1.7.6'], catch_exceptions=False)
        star = self.gi.tool_data.show_data_table('rnastar_index2x_versioned')
        star = star['fields']
        assert [new_dbkey, new_dbkey, 'Wixxxtoo!', '/foo/bloup/test/faa/bor', '1', '1.7.6'] in star

    def test_add_biomaj_env(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'Cool_db (v3.0)'
        back_env = os.environ
        os.environ['dbname'] = "Cool_db"
        os.environ['remoterelease'] = "v3.0"
        os.environ['data.dir'] = "/db/"
        os.environ['dirversion'] = "some/dir"
        os.environ['localrelease'] = "v3.0beta"
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--no-file-check', 'bowtie2:foo/bar'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        assert [new_dbkey, new_dbkey, new_dbkey_name, '/db/some/dir/v3.0beta/foo/bar'] in bowtie2
        os.environ = back_env

    def test_rm_bowtie2(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        assert [new_dbkey, new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        runner.invoke(biomaj2galaxy, ['rm', new_dbkey], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] not in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        assert [new_dbkey, new_dbkey, new_dbkey_name, '/some/path/foo/bar'] not in bowtie2

    def test_rm_multiple(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar', 'bowtie2:/some/other_path/foo/bar', 'bowtie2:/some/really/other/path/foo/bar:With a cool name'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] in bowtie2
        for u in uuids:
            assert u.startswith(new_dbkey)
        runner.invoke(biomaj2galaxy, ['rm', new_dbkey], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] not in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] not in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] not in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] not in bowtie2
        for u in uuids:
            assert not u.startswith(new_dbkey)

    def test_rm_multiple_exact(self):
        new_dbkey = 'test_dbkey'
        new_dbkey_name = 'My cool dbkey'
        runner = CliRunner()
        runner.invoke(biomaj2galaxy, ['add', '--dbkey', new_dbkey, '--dbkey-display-name', new_dbkey_name, '--no-file-check', 'bowtie2:/some/path/foo/bar', 'bowtie2:/some/other_path/foo/bar', 'bowtie2:/some/really/other/path/foo/bar:With a cool name'], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] in bowtie2
        for u in uuids:
            assert u.startswith(new_dbkey)
        runner.invoke(biomaj2galaxy, ['rm', '--exact', new_dbkey], catch_exceptions=False)
        dbkeys = self.gi.tool_data.show_data_table('__dbkeys__')
        dbkeys = dbkeys['fields']
        assert [new_dbkey, new_dbkey_name, ''] not in dbkeys
        bowtie2 = self.gi.tool_data.show_data_table('bowtie2_indexes')
        bowtie2 = bowtie2['fields']
        uuids = [x[0] for x in bowtie2]
        bowtie2 = [x[1:] for x in bowtie2]
        assert [new_dbkey, new_dbkey_name, '/some/path/foo/bar'] not in bowtie2
        assert [new_dbkey, new_dbkey_name, '/some/other_path/foo/bar'] not in bowtie2
        assert [new_dbkey, "With a cool name", '/some/really/other/path/foo/bar'] not in bowtie2
        for u in uuids:
            assert u.startswith(new_dbkey) and u != new_dbkey

    def setUp(self):
        self.gi = gi
        logging.getLogger("urllib3").setLevel(logging.WARNING)
        logging.getLogger("bioblend").setLevel(logging.WARNING)
        # Empty all tables
        touched_tables = [
            'bowtie2_indexes',
            'blastdb',
            'twobit',
            '__dbkeys__',
            'rnastar_index2x_versioned',
            'bwa_indexes',
            'all_fasta'
        ]
        tables = [x['name'] for x in self.gi.tool_data.get_data_tables()]
        for table in tables:
            if table in touched_tables:  # To speed things up a bit
                fields = self.gi.tool_data.show_data_table(table)['fields']
                for line in fields:
                    self.gi.tool_data.delete_data_table(table, "\t".join(line))
                time.sleep(1)  # Reloading too soon might not work for some strange reason
                self.gi.tool_data.reload_data_table(table)

    def tearDown(self):
        # Empty all tables
        touched_tables = [
            'bowtie2_indexes',
            'blastdb',
            'twobit',
            '__dbkeys__',
            'rnastar_index2x_versioned',
            'bwa_indexes',
            'all_fasta'
        ]
        tables = [x['name'] for x in self.gi.tool_data.get_data_tables()]
        for table in tables:
            if table in touched_tables:  # To speed things up a bit
                fields = self.gi.tool_data.show_data_table(table)['fields']
                for line in fields:
                    self.gi.tool_data.delete_data_table(table, "\t".join(line))
                time.sleep(1)  # Reloading too soon might not work for some strange reason
                self.gi.tool_data.reload_data_table(table)
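In the test file above, `setUp()` and `tearDown()` repeat the same table-clearing loop verbatim. A minimal sketch of how that loop could be factored into one helper — `FakeToolData`/`FakeGi` below are hypothetical in-memory stand-ins for the bioblend client (not the real API), used only so the sketch runs on its own:

```python
import time

# Tables the tests above write to (list copied from the test file).
TOUCHED_TABLES = [
    'bowtie2_indexes', 'blastdb', 'twobit', '__dbkeys__',
    'rnastar_index2x_versioned', 'bwa_indexes', 'all_fasta',
]


def clear_touched_tables(gi, touched=TOUCHED_TABLES, delay=0):
    """Delete every row of each touched table, then reload the table."""
    tables = [x['name'] for x in gi.tool_data.get_data_tables()]
    for table in tables:
        if table in touched:  # skip tables the tests never modify
            fields = gi.tool_data.show_data_table(table)['fields']
            for line in fields:
                gi.tool_data.delete_data_table(table, "\t".join(line))
            time.sleep(delay)  # reloading too soon can silently fail
            gi.tool_data.reload_data_table(table)


class FakeToolData:
    """Hypothetical stand-in for gi.tool_data, for demonstration only."""

    def __init__(self, tables):
        self.tables = tables  # name -> list of rows (lists of strings)

    def get_data_tables(self):
        return [{'name': name} for name in self.tables]

    def show_data_table(self, name):
        return {'fields': [list(row) for row in self.tables[name]]}

    def delete_data_table(self, name, joined):
        self.tables[name] = [r for r in self.tables[name]
                             if "\t".join(r) != joined]

    def reload_data_table(self, name):
        pass  # no-op in the fake


class FakeGi:
    def __init__(self, tables):
        self.tool_data = FakeToolData(tables)


gi = FakeGi({'blastdb': [['k', 'name', '/p']], 'untouched': [['x']]})
clear_touched_tables(gi)
print(gi.tool_data.tables['blastdb'])    # []
print(gi.tool_data.tables['untouched'])  # [['x']]
```

With such a helper, both `setUp()` and `tearDown()` would shrink to a single call, keeping the touched-table list in one place.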
**pages/12a_counting.py** — sebastiandres/streamlit_ml_edu @ ef9fea48654a2db8e722b0850ebfcf1ae6fb6189 (MIT), Python, 2,275 bytes, blob 439fb0987b194d0c558beda2767e3dd3127d435d; 4 stars (2022-01-15 to 2022-03-11), 2 forks (2022-01-14 to 2022-01-15)

import streamlit as st

# st.title("The experiments")

md = """
Let me show you how the previous experiments have worked out:

* Patient 0: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 1: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 2: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#0088ff;">blue pill</span>.
* Patient 3: We predicted <span style="color:#0088ff;">blue pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 4: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 5: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#0088ff;">blue pill</span>.
* Patient 6: We predicted <span style="color:#0088ff;">blue pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 7: We predicted <span style="color:#0088ff;">blue pill</span> and took <span style="color:#ff4400;">red pill</span>.
* Patient 8: We predicted <span style="color:#0088ff;">blue pill</span> and took <span style="color:#0088ff;">blue pill</span>.
* Patient 9: We predicted <span style="color:#ff4400;">red pill</span> and took <span style="color:#ff4400;">red pill</span>.

As you can see, not a stellar record.
Counting by hand, we get the following:

* **True Positive**: For only 1 person we correctly predicted the preference for positive (<span style="color:#0088ff;">blue pill</span>).
* **True Negative**: For 4 persons we correctly predicted the preference for negative (<span style="color:#ff4400;">red pill</span>).
* **False Positive (Type I error)**: For 3 persons we predicted <span style="color:#0088ff;">blue pill</span> (positive) but people preferred the <span style="color:#ff4400;">red pill</span> (negative). This is sometimes called error type I.
* **False Negative (Type II error)**: For 2 persons we predicted <span style="color:#ff4400;">red pill</span> (negative) but people preferred the <span style="color:#0088ff;">blue pill</span> (positive). This is sometimes called error type II.
"""

st.markdown(md, unsafe_allow_html=True)
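The hand count in that page can be checked mechanically. The two lists below transcribe the ten predicted/actual outcomes from the bullet list (with the blue pill as the positive class); they are a sketch added for illustration and are not part of the original file:

```python
# Ten patients, transcribed from the page above.
predicted = ['red', 'red', 'red', 'blue', 'red', 'red', 'blue', 'blue', 'blue', 'red']
actual    = ['red', 'red', 'blue', 'red', 'red', 'blue', 'red', 'red', 'blue', 'red']

# Count each cell of the confusion matrix ('blue' = positive, 'red' = negative).
tp = sum(p == 'blue' and a == 'blue' for p, a in zip(predicted, actual))
tn = sum(p == 'red' and a == 'red' for p, a in zip(predicted, actual))
fp = sum(p == 'blue' and a == 'red' for p, a in zip(predicted, actual))
fn = sum(p == 'red' and a == 'blue' for p, a in zip(predicted, actual))

print(tp, tn, fp, fn)           # 1 4 3 2 — matching the hand count
print((tp + tn) / len(actual))  # 0.5 accuracy, i.e. "not a stellar record"
```

The four counts necessarily sum to the number of patients, which is a quick sanity check on any hand tally.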
**netbox/dcim/migrations/0038_wireless_interfaces.py** — Apache-2.0, Python, 2,686 bytes, blob 78ea103e5e4777351298b091539d3373170a37e4; 6 stars on BrnoPCmaniak/netbox @ 7b517abdb68a6324950dfd0375861163c7bfff00 (2017-12-01 to 2020-01-23), 25 issues (2019-09-17 to 2022-03-11) and 3 forks (2017-11-18 to 2018-05-17) on emersonfelipesp/netbox @ fecca5ad83fb6b48a2f15982dfd3242653f105f9

# -*- coding: utf-8 -*-
# Generated by Django 1.11.1 on 2017-06-16 21:38
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('dcim', '0037_unicode_literals'),
    ]

    operations = [
        migrations.AlterField(
            model_name='interface',
            name='form_factor',
            field=models.PositiveSmallIntegerField(choices=[['Virtual interfaces', [[0, 'Virtual'], [200, 'Link Aggregation Group (LAG)']]], ['Ethernet (fixed)', [[800, '100BASE-TX (10/100ME)'], [1000, '1000BASE-T (1GE)'], [1150, '10GBASE-T (10GE)']]], ['Ethernet (modular)', [[1050, 'GBIC (1GE)'], [1100, 'SFP (1GE)'], [1200, 'SFP+ (10GE)'], [1300, 'XFP (10GE)'], [1310, 'XENPAK (10GE)'], [1320, 'X2 (10GE)'], [1350, 'SFP28 (25GE)'], [1400, 'QSFP+ (40GE)'], [1500, 'CFP (100GE)'], [1600, 'QSFP28 (100GE)']]], ['Wireless', [[2600, 'IEEE 802.11a'], [2610, 'IEEE 802.11b/g'], [2620, 'IEEE 802.11n'], [2630, 'IEEE 802.11ac'], [2640, 'IEEE 802.11ad']]], ['FibreChannel', [[3010, 'SFP (1GFC)'], [3020, 'SFP (2GFC)'], [3040, 'SFP (4GFC)'], [3080, 'SFP+ (8GFC)'], [3160, 'SFP+ (16GFC)']]], ['Serial', [[4000, 'T1 (1.544 Mbps)'], [4010, 'E1 (2.048 Mbps)'], [4040, 'T3 (45 Mbps)'], [4050, 'E3 (34 Mbps)']]], ['Stacking', [[5000, 'Cisco StackWise'], [5050, 'Cisco StackWise Plus'], [5100, 'Cisco FlexStack'], [5150, 'Cisco FlexStack Plus'], [5200, 'Juniper VCP']]], ['Other', [[32767, 'Other']]]], default=1200),
        ),
        migrations.AlterField(
            model_name='interfacetemplate',
            name='form_factor',
            field=models.PositiveSmallIntegerField(choices=[['Virtual interfaces', [[0, 'Virtual'], [200, 'Link Aggregation Group (LAG)']]], ['Ethernet (fixed)', [[800, '100BASE-TX (10/100ME)'], [1000, '1000BASE-T (1GE)'], [1150, '10GBASE-T (10GE)']]], ['Ethernet (modular)', [[1050, 'GBIC (1GE)'], [1100, 'SFP (1GE)'], [1200, 'SFP+ (10GE)'], [1300, 'XFP (10GE)'], [1310, 'XENPAK (10GE)'], [1320, 'X2 (10GE)'], [1350, 'SFP28 (25GE)'], [1400, 'QSFP+ (40GE)'], [1500, 'CFP (100GE)'], [1600, 'QSFP28 (100GE)']]], ['Wireless', [[2600, 'IEEE 802.11a'], [2610, 'IEEE 802.11b/g'], [2620, 'IEEE 802.11n'], [2630, 'IEEE 802.11ac'], [2640, 'IEEE 802.11ad']]], ['FibreChannel', [[3010, 'SFP (1GFC)'], [3020, 'SFP (2GFC)'], [3040, 'SFP (4GFC)'], [3080, 'SFP+ (8GFC)'], [3160, 'SFP+ (16GFC)']]], ['Serial', [[4000, 'T1 (1.544 Mbps)'], [4010, 'E1 (2.048 Mbps)'], [4040, 'T3 (45 Mbps)'], [4050, 'E3 (34 Mbps)']]], ['Stacking', [[5000, 'Cisco StackWise'], [5050, 'Cisco StackWise Plus'], [5100, 'Cisco FlexStack'], [5150, 'Cisco FlexStack Plus'], [5200, 'Juniper VCP']]], ['Other', [[32767, 'Other']]]], default=1200),
        ),
    ]
60171baf501652b5572afe424a5c5ca27e227e94 | 8,113 | py | Python | gapid_tests/resource_creation_tests/CreateDestroyCommandPool_test/CreateDestroyCommandPool_test.py | AWoloszyn/vulkan_test_applications | 5e9f86cdbd4e2344f41db9e0a578fe9fba41106f | [
"Apache-2.0"
] | null | null | null | gapid_tests/resource_creation_tests/CreateDestroyCommandPool_test/CreateDestroyCommandPool_test.py | AWoloszyn/vulkan_test_applications | 5e9f86cdbd4e2344f41db9e0a578fe9fba41106f | [
"Apache-2.0"
] | null | null | null | gapid_tests/resource_creation_tests/CreateDestroyCommandPool_test/CreateDestroyCommandPool_test.py | AWoloszyn/vulkan_test_applications | 5e9f86cdbd4e2344f41db9e0a578fe9fba41106f | [
"Apache-2.0"
] | null | null | null | # Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from gapit_test_framework import gapit_test, require, require_equal
from gapit_test_framework import require_not_equal, GapitTest
from gapit_test_framework import get_read_offset_function, get_write_offset_function
from vulkan_constants import *
from struct_offsets import VulkanStruct, UINT32_T, POINTER, HANDLE, DEVICE_SIZE
from struct_offsets import ARRAY, CHAR
COMMAND_POOL_CREATE_INFO = [
("sType", UINT32_T),
("pNext", POINTER),
("flags", UINT32_T),
("queueFamilyIndex", UINT32_T),
]
COMMAND_POOL = [("handle", HANDLE)]
@gapit_test("CreateDestroyCommandPool_test")
class TransientBitCommandPool(GapitTest):
def expect(self):
"""1. Expects a command pool created and destroyed with
VK_COMMAND_POOL_CREATE_TRANSIENT_BIT ."""
architecture = self.architecture
create_command_pool = require(self.nth_call_of("vkCreateCommandPool",
1))
device = create_command_pool.int_device
require_not_equal(0, device)
require_not_equal(0, create_command_pool.hex_pCreateInfo)
require_equal(0, create_command_pool.hex_pAllocator)
require_not_equal(0, create_command_pool.hex_pCommandPool)
require_equal(VK_SUCCESS, int(create_command_pool.return_val))
create_info = VulkanStruct(
architecture, COMMAND_POOL_CREATE_INFO, get_read_offset_function(
create_command_pool, create_command_pool.hex_pCreateInfo))
require_equal(VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
create_info.sType)
require_equal(0, create_info.pNext)
require_equal(VK_COMMAND_POOL_CREATE_TRANSIENT_BIT, create_info.flags)
require_equal(0, create_info.queueFamilyIndex)
command_pool = VulkanStruct(
architecture, COMMAND_POOL, get_write_offset_function(
create_command_pool, create_command_pool.hex_pCommandPool))
require_not_equal(0, command_pool.handle)
destroy_command_pool = require(self.next_call_of(
"vkDestroyCommandPool"))
require_equal(device, destroy_command_pool.int_device)
require_equal(
command_pool.handle, destroy_command_pool.int_commandPool)
@gapit_test("CreateDestroyCommandPool_test")
class ResetCommandBufferBitCommandPool(GapitTest):
def expect(self):
"""2. Expects a command pool created and destroyed with
VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT"""
architecture = self.architecture
create_command_pool = require(self.nth_call_of("vkCreateCommandPool",
2))
device = create_command_pool.int_device
require_not_equal(0, device)
require_not_equal(0, create_command_pool.hex_pCreateInfo)
require_equal(0, create_command_pool.hex_pAllocator)
require_not_equal(0, create_command_pool.hex_pCommandPool)
require_equal(VK_SUCCESS, int(create_command_pool.return_val))
create_info = VulkanStruct(
architecture, COMMAND_POOL_CREATE_INFO, get_read_offset_function(
create_command_pool, create_command_pool.hex_pCreateInfo))
require_equal(VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
create_info.sType)
require_equal(0, create_info.pNext)
require_equal(VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT,
create_info.flags)
require_equal(0, create_info.queueFamilyIndex)
command_pool = VulkanStruct(
architecture, COMMAND_POOL, get_write_offset_function(
create_command_pool, create_command_pool.hex_pCommandPool))
require_not_equal(0, command_pool.handle)
destroy_command_pool = require(self.next_call_of(
"vkDestroyCommandPool"))
require_equal(device, destroy_command_pool.int_device)
require_equal(
command_pool.handle, destroy_command_pool.int_commandPool)
@gapit_test("CreateDestroyCommandPool_test")
class ResetCommandBufferBitTransientBitCommandPool(GapitTest):
def expect(self):
"""3. Expects a command pool created and destroyed with
VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT |
VK_COMMAND_POOL_CREATE_TRANSIENT_BIT"""
architecture = self.architecture
create_command_pool = require(self.nth_call_of("vkCreateCommandPool",
3))
device = create_command_pool.int_device
require_not_equal(0, device)
require_not_equal(0, create_command_pool.hex_pCreateInfo)
require_equal(0, create_command_pool.hex_pAllocator)
require_not_equal(0, create_command_pool.hex_pCommandPool)
require_equal(VK_SUCCESS, int(create_command_pool.return_val))
create_info = VulkanStruct(
architecture, COMMAND_POOL_CREATE_INFO, get_read_offset_function(
create_command_pool, create_command_pool.hex_pCreateInfo))
require_equal(VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
create_info.sType)
require_equal(0, create_info.pNext)
require_equal(VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT
| VK_COMMAND_POOL_CREATE_TRANSIENT_BIT, create_info.flags)
require_equal(0, create_info.queueFamilyIndex)
command_pool = VulkanStruct(
architecture, COMMAND_POOL, get_write_offset_function(
create_command_pool, create_command_pool.hex_pCommandPool))
require_not_equal(0, command_pool.handle)
destroy_command_pool = require(self.next_call_of(
"vkDestroyCommandPool"))
require_equal(device, destroy_command_pool.int_device)
require_equal(
command_pool.handle, destroy_command_pool.int_commandPool)


@gapit_test("CreateDestroyCommandPool_test")
class EmptyBitCommandPool(GapitTest):

    def expect(self):
        """4. Expects a command pool created and destroyed with an empty flag
        bit"""
        architecture = self.architecture
        create_command_pool = require(
            self.nth_call_of("vkCreateCommandPool", 4))
        device = create_command_pool.int_device
        require_not_equal(0, device)
        require_not_equal(0, create_command_pool.hex_pCreateInfo)
        require_equal(0, create_command_pool.hex_pAllocator)
        require_not_equal(0, create_command_pool.hex_pCommandPool)
        require_equal(VK_SUCCESS, int(create_command_pool.return_val))

        create_info = VulkanStruct(
            architecture, COMMAND_POOL_CREATE_INFO, get_read_offset_function(
                create_command_pool, create_command_pool.hex_pCreateInfo))
        require_equal(VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
                      create_info.sType)
        require_equal(0, create_info.pNext)
        require_equal(0, create_info.flags)
        require_equal(0, create_info.queueFamilyIndex)

        command_pool = VulkanStruct(
            architecture, COMMAND_POOL, get_write_offset_function(
                create_command_pool, create_command_pool.hex_pCommandPool))
        require_not_equal(0, command_pool.handle)

        destroy_command_pool = require(self.next_call_of(
            "vkDestroyCommandPool"))
        require_equal(device, destroy_command_pool.int_device)
        require_equal(
            command_pool.handle, destroy_command_pool.int_commandPool)
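
The tests above compare `create_info.flags` against a bitwise OR of Vulkan flag constants. As a standalone illustration (the numeric bit values are restated from `vulkan_core.h`, not taken from this test harness), the combined flags field decomposes back into its individual bits like this:

```python
# Bit values as defined in vulkan_core.h for VkCommandPoolCreateFlagBits.
VK_COMMAND_POOL_CREATE_TRANSIENT_BIT = 0x00000001
VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT = 0x00000002

# The third test expects the OR of both bits in VkCommandPoolCreateInfo.flags.
flags = (VK_COMMAND_POOL_CREATE_RESET_COMMAND_BUFFER_BIT
         | VK_COMMAND_POOL_CREATE_TRANSIENT_BIT)

print(hex(flags))                                          # 0x3
print(bool(flags & VK_COMMAND_POOL_CREATE_TRANSIENT_BIT))  # True
```

Masking with `&` is how a trace checker can confirm that an individual bit is set without caring about the others.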
| 43.854054 | 84 | 0.71256 | 942 | 8,113 | 5.708068 | 0.145435 | 0.184118 | 0.126465 | 0.074391 | 0.826111 | 0.802492 | 0.802492 | 0.802492 | 0.802492 | 0.802492 | 0 | 0.008419 | 0.224085 | 8,113 | 184 | 85 | 44.092391 | 0.845751 | 0.117096 | 0 | 0.793893 | 0 | 0 | 0.043583 | 0.016361 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030534 | false | 0 | 0.045802 | 0 | 0.10687 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6018f785f892dd5251a92ae0875aea4aacbc19a7 | 17,475 | py | Python | migrations/versions/6696bcd413f1_.py | Darvel888/rest_api | 00605b3df6f32386168fc207f7acd25e8542c478 | [
"MIT"
] | null | null | null | migrations/versions/6696bcd413f1_.py | Darvel888/rest_api | 00605b3df6f32386168fc207f7acd25e8542c478 | [
"MIT"
] | null | null | null | migrations/versions/6696bcd413f1_.py | Darvel888/rest_api | 00605b3df6f32386168fc207f7acd25e8542c478 | [
"MIT"
] | null | null | null | """empty message
Revision ID: 6696bcd413f1
Revises:
Create Date: 2019-12-14 20:47:34.468354
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
from app.point import Point
# revision identifiers, used by Alembic.
revision = '6696bcd413f1'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('aircrafts_data', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Aircraft code, IATA')
    op.alter_column('aircrafts_data', 'model',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment=None,
                    existing_comment='Aircraft model',
                    existing_nullable=False)
    op.alter_column('aircrafts_data', 'rrange',
                    existing_type=sa.INTEGER(),
                    comment=None,
                    existing_comment='Maximal flying distance, km',
                    existing_nullable=False)
    op.drop_table_comment(
        'aircrafts_data',
        existing_comment='Aircrafts (internal data)',
        schema=None
    )
    op.alter_column('airports_data', 'airport_code',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Airport code')
    op.alter_column('airports_data', 'airport_name',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment=None,
                    existing_comment='Airport name',
                    existing_nullable=False)
    op.alter_column('airports_data', 'city',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment=None,
                    existing_comment='City',
                    existing_nullable=False)
    op.alter_column('airports_data', 'coordinates',
                    existing_type=Point(),
                    comment=None,
                    existing_comment='Airport coordinates (longitude and latitude)',
                    existing_nullable=False)
    op.alter_column('airports_data', 'timezone',
                    existing_type=sa.TEXT(),
                    comment=None,
                    existing_comment='Airport time zone',
                    existing_nullable=False)
    op.drop_table_comment(
        'airports_data',
        existing_comment='Airports (internal data)',
        schema=None
    )
    op.alter_column('boarding_passes', 'boarding_no',
                    existing_type=sa.INTEGER(),
                    comment=None,
                    existing_comment='Boarding pass number',
                    existing_nullable=False)
    op.alter_column('boarding_passes', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment=None,
                    existing_comment='Flight ID')
    op.alter_column('boarding_passes', 'seat_no',
                    existing_type=sa.VARCHAR(length=4),
                    comment=None,
                    existing_comment='Seat number',
                    existing_nullable=False)
    op.alter_column('boarding_passes', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment=None,
                    existing_comment='Ticket number')
    op.drop_table_comment(
        'boarding_passes',
        existing_comment='Boarding passes',
        schema=None
    )
    op.alter_column('bookings', 'book_date',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment=None,
                    existing_comment='Booking date',
                    existing_nullable=False)
    op.alter_column('bookings', 'book_ref',
                    existing_type=sa.CHAR(length=6),
                    comment=None,
                    existing_comment='Booking number')
    op.alter_column('bookings', 'total_amount',
                    existing_type=sa.NUMERIC(precision=10, scale=2),
                    comment=None,
                    existing_comment='Total booking cost',
                    existing_nullable=False)
    op.drop_table_comment(
        'bookings',
        existing_comment='Bookings',
        schema=None
    )
    op.alter_column('flights', 'actual_arrival',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment=None,
                    existing_comment='Actual arrival time',
                    existing_nullable=True)
    op.alter_column('flights', 'actual_departure',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment=None,
                    existing_comment='Actual departure time',
                    existing_nullable=True)
    op.alter_column('flights', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Aircraft code, IATA',
                    existing_nullable=False)
    op.alter_column('flights', 'arrival_airport',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Airport of arrival',
                    existing_nullable=False)
    op.alter_column('flights', 'departure_airport',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Airport of departure',
                    existing_nullable=False)
    op.alter_column('flights', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment=None,
                    existing_comment='Flight ID',
                    autoincrement=True,
                    existing_server_default=sa.text("nextval('flights_flight_id_seq'::regclass)"))
    op.alter_column('flights', 'flight_no',
                    existing_type=sa.CHAR(length=6),
                    comment=None,
                    existing_comment='Flight number',
                    existing_nullable=False)
    op.alter_column('flights', 'scheduled_arrival',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment=None,
                    existing_comment='Scheduled arrival time',
                    existing_nullable=False)
    op.alter_column('flights', 'scheduled_departure',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment=None,
                    existing_comment='Scheduled departure time',
                    existing_nullable=False)
    op.alter_column('flights', 'status',
                    existing_type=sa.VARCHAR(length=20),
                    comment=None,
                    existing_comment='Flight status',
                    existing_nullable=False)
    op.drop_table_comment(
        'flights',
        existing_comment='Flights',
        schema=None
    )
    op.alter_column('seats', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment=None,
                    existing_comment='Aircraft code, IATA')
    op.alter_column('seats', 'fare_conditions',
                    existing_type=sa.VARCHAR(length=10),
                    comment=None,
                    existing_comment='Travel class',
                    existing_nullable=False)
    op.alter_column('seats', 'seat_no',
                    existing_type=sa.VARCHAR(length=4),
                    comment=None,
                    existing_comment='Seat number')
    op.drop_constraint('seats_aircraft_code_fkey', 'seats', type_='foreignkey')
    op.create_foreign_key('boarding_passes_ticket_no_fkey', 'seats', 'aircrafts_data', ['aircraft_code'], ['aircraft_code'])
    op.drop_table_comment(
        'seats',
        existing_comment='Seats',
        schema=None
    )
    op.alter_column('ticket_flights', 'amount',
                    existing_type=sa.NUMERIC(precision=10, scale=2),
                    comment=None,
                    existing_comment='Travel cost',
                    existing_nullable=False)
    op.alter_column('ticket_flights', 'fare_conditions',
                    existing_type=sa.VARCHAR(length=10),
                    comment=None,
                    existing_comment='Travel class',
                    existing_nullable=False)
    op.alter_column('ticket_flights', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment=None,
                    existing_comment='Flight ID')
    op.alter_column('ticket_flights', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment=None,
                    existing_comment='Ticket number')
    op.drop_table_comment(
        'ticket_flights',
        existing_comment='Flight segment',
        schema=None
    )
    op.alter_column('tickets', 'book_ref',
                    existing_type=sa.CHAR(length=6),
                    comment=None,
                    existing_comment='Booking number',
                    existing_nullable=False)
    op.alter_column('tickets', 'contact_data',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment=None,
                    existing_comment='Passenger contact information',
                    existing_nullable=True)
    op.alter_column('tickets', 'passenger_id',
                    existing_type=sa.VARCHAR(length=20),
                    comment=None,
                    existing_comment='Passenger ID',
                    existing_nullable=False)
    op.alter_column('tickets', 'passenger_name',
                    existing_type=sa.TEXT(),
                    comment=None,
                    existing_comment='Passenger name',
                    existing_nullable=False)
    op.alter_column('tickets', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment=None,
                    existing_comment='Ticket number')
    op.drop_table_comment(
        'tickets',
        existing_comment='Tickets',
        schema=None
    )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table_comment(
        'tickets',
        'Tickets',
        existing_comment=None,
        schema=None
    )
    op.alter_column('tickets', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment='Ticket number')
    op.alter_column('tickets', 'passenger_name',
                    existing_type=sa.TEXT(),
                    comment='Passenger name',
                    existing_nullable=False)
    op.alter_column('tickets', 'passenger_id',
                    existing_type=sa.VARCHAR(length=20),
                    comment='Passenger ID',
                    existing_nullable=False)
    op.alter_column('tickets', 'contact_data',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment='Passenger contact information',
                    existing_nullable=True)
    op.alter_column('tickets', 'book_ref',
                    existing_type=sa.CHAR(length=6),
                    comment='Booking number',
                    existing_nullable=False)
    op.create_table_comment(
        'ticket_flights',
        'Flight segment',
        existing_comment=None,
        schema=None
    )
    op.alter_column('ticket_flights', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment='Ticket number')
    op.alter_column('ticket_flights', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment='Flight ID')
    op.alter_column('ticket_flights', 'fare_conditions',
                    existing_type=sa.VARCHAR(length=10),
                    comment='Travel class',
                    existing_nullable=False)
    op.alter_column('ticket_flights', 'amount',
                    existing_type=sa.NUMERIC(precision=10, scale=2),
                    comment='Travel cost',
                    existing_nullable=False)
    op.create_table_comment(
        'seats',
        'Seats',
        existing_comment=None,
        schema=None
    )
    op.drop_constraint('boarding_passes_ticket_no_fkey', 'seats', type_='foreignkey')
    op.create_foreign_key('seats_aircraft_code_fkey', 'seats', 'aircrafts_data', ['aircraft_code'], ['aircraft_code'], ondelete='CASCADE')
    op.alter_column('seats', 'seat_no',
                    existing_type=sa.VARCHAR(length=4),
                    comment='Seat number')
    op.alter_column('seats', 'fare_conditions',
                    existing_type=sa.VARCHAR(length=10),
                    comment='Travel class',
                    existing_nullable=False)
    op.alter_column('seats', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment='Aircraft code, IATA')
    op.create_table_comment(
        'flights',
        'Flights',
        existing_comment=None,
        schema=None
    )
    op.alter_column('flights', 'status',
                    existing_type=sa.VARCHAR(length=20),
                    comment='Flight status',
                    existing_nullable=False)
    op.alter_column('flights', 'scheduled_departure',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment='Scheduled departure time',
                    existing_nullable=False)
    op.alter_column('flights', 'scheduled_arrival',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment='Scheduled arrival time',
                    existing_nullable=False)
    op.alter_column('flights', 'flight_no',
                    existing_type=sa.CHAR(length=6),
                    comment='Flight number',
                    existing_nullable=False)
    op.alter_column('flights', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment='Flight ID',
                    autoincrement=True,
                    existing_server_default=sa.text("nextval('flights_flight_id_seq'::regclass)"))
    op.alter_column('flights', 'departure_airport',
                    existing_type=sa.CHAR(length=3),
                    comment='Airport of departure',
                    existing_nullable=False)
    op.alter_column('flights', 'arrival_airport',
                    existing_type=sa.CHAR(length=3),
                    comment='Airport of arrival',
                    existing_nullable=False)
    op.alter_column('flights', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment='Aircraft code, IATA',
                    existing_nullable=False)
    op.alter_column('flights', 'actual_departure',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment='Actual departure time',
                    existing_nullable=True)
    op.alter_column('flights', 'actual_arrival',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment='Actual arrival time',
                    existing_nullable=True)
    op.create_table_comment(
        'bookings',
        'Bookings',
        existing_comment=None,
        schema=None
    )
    op.alter_column('bookings', 'total_amount',
                    existing_type=sa.NUMERIC(precision=10, scale=2),
                    comment='Total booking cost',
                    existing_nullable=False)
    op.alter_column('bookings', 'book_ref',
                    existing_type=sa.CHAR(length=6),
                    comment='Booking number')
    op.alter_column('bookings', 'book_date',
                    existing_type=postgresql.TIMESTAMP(timezone=True),
                    comment='Booking date',
                    existing_nullable=False)
    op.create_table_comment(
        'boarding_passes',
        'Boarding passes',
        existing_comment=None,
        schema=None
    )
    op.alter_column('boarding_passes', 'ticket_no',
                    existing_type=sa.CHAR(length=13),
                    comment='Ticket number')
    op.alter_column('boarding_passes', 'seat_no',
                    existing_type=sa.VARCHAR(length=4),
                    comment='Seat number',
                    existing_nullable=False)
    op.alter_column('boarding_passes', 'flight_id',
                    existing_type=sa.INTEGER(),
                    comment='Flight ID')
    op.alter_column('boarding_passes', 'boarding_no',
                    existing_type=sa.INTEGER(),
                    comment='Boarding pass number',
                    existing_nullable=False)
    op.create_table_comment(
        'airports_data',
        'Airports (internal data)',
        existing_comment=None,
        schema=None
    )
    op.alter_column('airports_data', 'timezone',
                    existing_type=sa.TEXT(),
                    comment='Airport time zone',
                    existing_nullable=False)
    op.alter_column('airports_data', 'coordinates',
                    existing_type=sa.NullType(),
                    comment='Airport coordinates (longitude and latitude)',
                    existing_nullable=False)
    op.alter_column('airports_data', 'city',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment='City',
                    existing_nullable=False)
    op.alter_column('airports_data', 'airport_name',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment='Airport name',
                    existing_nullable=False)
    op.alter_column('airports_data', 'airport_code',
                    existing_type=sa.CHAR(length=3),
                    comment='Airport code')
    op.create_table_comment(
        'aircrafts_data',
        'Aircrafts (internal data)',
        existing_comment=None,
        schema=None
    )
    op.alter_column('aircrafts_data', 'rrange',
                    existing_type=sa.INTEGER(),
                    comment='Maximal flying distance, km',
                    existing_nullable=False)
    op.alter_column('aircrafts_data', 'model',
                    existing_type=postgresql.JSONB(astext_type=sa.Text()),
                    comment='Aircraft model',
                    existing_nullable=False)
    op.alter_column('aircrafts_data', 'aircraft_code',
                    existing_type=sa.CHAR(length=3),
                    comment='Aircraft code, IATA')
    # ### end Alembic commands ###
| 40.172414 | 138 | 0.592561 | 1,761 | 17,475 | 5.633731 | 0.080068 | 0.052212 | 0.096966 | 0.106642 | 0.907671 | 0.901119 | 0.890535 | 0.861405 | 0.784699 | 0.772503 | 0 | 0.008002 | 0.29917 | 17,475 | 434 | 139 | 40.264977 | 0.802074 | 0.016195 | 0 | 0.7506 | 0 | 0 | 0.199767 | 0.011192 | 0 | 0 | 0 | 0 | 0 | 1 | 0.004796 | false | 0.06235 | 0.009592 | 0 | 0.014388 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
608721065d3c3dd6ddd05c47c485a201289b8ce2 | 164 | py | Python | cupy/random/permutations.py | umitanuki/chainer | 225c56b233e684ff4855451d2af4c2fb66915f21 | [
"MIT"
] | null | null | null | cupy/random/permutations.py | umitanuki/chainer | 225c56b233e684ff4855451d2af4c2fb66915f21 | [
"MIT"
] | null | null | null | cupy/random/permutations.py | umitanuki/chainer | 225c56b233e684ff4855451d2af4c2fb66915f21 | [
"MIT"
] | 1 | 2018-11-18T00:36:51.000Z | 2018-11-18T00:36:51.000Z | def shuffle(x):
    # TODO(beam2d): Implement it
    raise NotImplementedError


def permutation(x):
    # TODO(beam2d): Implement it
    raise NotImplementedError
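
Both stubs are still TODOs at this revision. Purely as an illustration of the contract they are meant to fulfil (not cupy's eventual GPU implementation), here is a host-side Fisher-Yates sketch, with a `seed` parameter added for reproducibility that the real API would not take:

```python
import random


def shuffle(x, seed=0):
    """Shuffle the sequence ``x`` in place (Fisher-Yates)."""
    rng = random.Random(seed)
    for i in range(len(x) - 1, 0, -1):
        j = rng.randint(0, i)  # randint's upper bound is inclusive
        x[i], x[j] = x[j], x[i]


def permutation(x, seed=0):
    """Return a shuffled copy, leaving ``x`` untouched."""
    out = list(x)
    shuffle(out, seed)
    return out


a = list(range(10))
p = permutation(a)  # a itself is unchanged
```

The distinction mirrors `numpy.random`: `shuffle` mutates its argument, `permutation` returns a new array.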
| 18.222222 | 32 | 0.707317 | 18 | 164 | 6.444444 | 0.555556 | 0.086207 | 0.189655 | 0.344828 | 0.793103 | 0.793103 | 0.793103 | 0 | 0 | 0 | 0 | 0.015385 | 0.207317 | 164 | 8 | 33 | 20.5 | 0.876923 | 0.323171 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
71771179b4e61ac1389442a180f7082f0111261a | 2,263 | py | Python | api/src/strategy/trade_condition.py | VincentKobz/crypto-trade-tools | 9c994985fa9cad7610923834d6513be1e7ad6095 | [
"MIT"
] | 1 | 2022-02-12T18:25:13.000Z | 2022-02-12T18:25:13.000Z | api/src/strategy/trade_condition.py | VincentKobz/crypto-trade-tools | 9c994985fa9cad7610923834d6513be1e7ad6095 | [
"MIT"
] | null | null | null | api/src/strategy/trade_condition.py | VincentKobz/crypto-trade-tools | 9c994985fa9cad7610923834d6513be1e7ad6095 | [
"MIT"
] | null | null | null | def buy_condition_trix(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['TRIX_HISTO'] > 0 and row['STOCH_RSI'] < stoch_top:
        return True
    else:
        return False


def sell_condition_trix(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['TRIX_HISTO'] < 0 and row['STOCH_RSI'] > stoch_bottom:
        return True
    else:
        return False


def buy_condition_ema(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['EMA1p'] > row['EMA2p']:
        return True
    else:
        return False


def sell_condition_ema(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['EMA2p'] > row['EMA1p']:
        return True
    else:
        return False


def buy_condition_true_strategy(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['EMA28'] > row['EMA48'] and row['STOCH_RSI'] < stoch_top:
        return True
    else:
        return False


def sell_condition_true_strategy(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['EMA28'] < row['EMA48'] and row['STOCH_RSI'] > stoch_bottom:
        return True
    else:
        return False


def buy_condition_aligator(row, stoch_top, stoch_bottom, stoch_over_sold):
    if (row['EMA1'] > row['EMA2'] and row['EMA2'] > row['EMA3']
            and row['EMA3'] > row['EMA4'] and row['EMA4'] > row['EMA5']
            and row['EMA5'] > row['EMA6'] and row['STOCH_RSI'] < stoch_bottom):
        return True
    else:
        return False


def sell_condition_aligator(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['EMA6'] > row['EMA1'] and row['STOCH_RSI'] > stoch_bottom:
        return True
    else:
        return False


def buy_condition_big_will(row, stoch_top, stoch_bottom, stoch_over_sold):
    if (
        row['AO'] >= 0
        and row['WillR'] < -85
    ):
        return True
    else:
        return False


def sell_condition_big_will(row, stoch_top, stoch_bottom, stoch_over_sold):
    if (
        (row['AO'] < 0
         and row['STOCH_RSI'] > stoch_over_sold)
        or row['WillR'] > -10
    ):
        return True
    else:
        return False


def buy_condition_macd(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['MACD'] > row['MACD_SIGNAL']:
        return True
    else:
        return False


def sell_condition_macd(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['MACD'] < row['MACD_SIGNAL']:
        return True
    else:
return False | 29.012821 | 187 | 0.678303 | 339 | 2,263 | 4.247788 | 0.126844 | 0.105556 | 0.117361 | 0.158333 | 0.896528 | 0.892361 | 0.883333 | 0.883333 | 0.781944 | 0.75 | 0 | 0.01765 | 0.198851 | 2,263 | 78 | 188 | 29.012821 | 0.776613 | 0 | 0 | 0.597015 | 0 | 0 | 0.094965 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.179104 | false | 0 | 0 | 0 | 0.537313 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
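
Each predicate in the file above receives one candle row plus the strategy thresholds and answers whether to enter or exit. A self-contained example of how such a predicate is evaluated (the TRIX pair is restated so the snippet runs on its own, and the row values and thresholds are invented for illustration):

```python
def buy_condition_trix(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['TRIX_HISTO'] > 0 and row['STOCH_RSI'] < stoch_top:
        return True
    else:
        return False


def sell_condition_trix(row, stoch_top, stoch_bottom, stoch_over_sold):
    if row['TRIX_HISTO'] < 0 and row['STOCH_RSI'] > stoch_bottom:
        return True
    else:
        return False


# Invented indicator values for a single candle.
row = {'TRIX_HISTO': 0.4, 'STOCH_RSI': 0.55}

buy = buy_condition_trix(row, stoch_top=0.8, stoch_bottom=0.2, stoch_over_sold=0.3)
sell = sell_condition_trix(row, stoch_top=0.8, stoch_bottom=0.2, stoch_over_sold=0.3)
print(buy, sell)  # True False
```

In a backtest loop the same call would typically be made once per `DataFrame` row, e.g. via `df.iterrows()`.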
7192d31092efeb77ed43e8f5852d9b5b4732da91 | 110 | py | Python | nipype/interfaces/fsl/tests/test_XFibres.py | Conxz/nipype | 1281723ae56eacd103597ff4081a205583706e62 | [
"Apache-2.0"
] | null | null | null | nipype/interfaces/fsl/tests/test_XFibres.py | Conxz/nipype | 1281723ae56eacd103597ff4081a205583706e62 | [
"Apache-2.0"
] | null | null | null | nipype/interfaces/fsl/tests/test_XFibres.py | Conxz/nipype | 1281723ae56eacd103597ff4081a205583706e62 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from nipype.testing import assert_equal
from nipype.interfaces.fsl.dti import XFibres
| 27.5 | 45 | 0.763636 | 16 | 110 | 5.1875 | 0.8125 | 0.240964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010309 | 0.118182 | 110 | 3 | 46 | 36.666667 | 0.845361 | 0.190909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
719ca2c02e19244fc60ad5b172e7d4779bad2afd | 502 | py | Python | fast_drf/signals.py | iashraful/fast-drf | df6d80b32a281f9921e0b30027d4e132ace3bdc2 | [
"MIT"
] | 15 | 2020-07-02T01:13:05.000Z | 2022-03-02T15:41:24.000Z | fast_drf/signals.py | iashraful/fast-drf | df6d80b32a281f9921e0b30027d4e132ace3bdc2 | [
"MIT"
] | 2 | 2020-07-16T04:10:11.000Z | 2020-08-12T13:32:47.000Z | fast_drf/signals.py | iashraful/fast-drf | df6d80b32a281f9921e0b30027d4e132ace3bdc2 | [
"MIT"
] | null | null | null | from django import dispatch
before_post_api = dispatch.Signal(providing_args=["requested_data"])
after_post_api = dispatch.Signal(providing_args=["instance", "requested_data"])
before_put_api = dispatch.Signal(providing_args=["instance", "requested_data"])
after_put_api = dispatch.Signal(providing_args=["instance", "requested_data"])
before_patch_api = dispatch.Signal(providing_args=["instance", "requested_data"])
after_patch_api = dispatch.Signal(providing_args=["instance", "requested_data"])
| 45.636364 | 81 | 0.802789 | 63 | 502 | 6.015873 | 0.253968 | 0.174142 | 0.269129 | 0.41161 | 0.873351 | 0.873351 | 0.773087 | 0.773087 | 0.773087 | 0 | 0 | 0 | 0.059761 | 502 | 10 | 82 | 50.2 | 0.802966 | 0 | 0 | 0 | 0 | 0 | 0.247012 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
71b8fa7e5112591736347c95bc3a33e424a50f74 | 6,591 | py | Python | test/test_markov.py | alebymars/markov-bot | 574eed594a5a8baf5f7ebee98fbab7913d86ca33 | [
"MIT"
] | 30 | 2018-03-10T15:43:53.000Z | 2022-03-07T21:03:00.000Z | test/test_markov.py | Riccorl/markov-bot | 2072af89074e36ca65b84e26ec44b6acf558e512 | [
"MIT"
] | 258 | 2018-09-30T18:29:50.000Z | 2022-03-28T17:31:33.000Z | test/test_markov.py | Riccorl/markov-bot | 2072af89074e36ca65b84e26ec44b6acf558e512 | [
"MIT"
] | 27 | 2018-03-10T13:24:24.000Z | 2020-11-03T10:47:49.000Z | from markov import markov
from unittest import mock


@mock.patch('markov.markov.generate_sentence')
@mock.patch('markov.speech.update_model')
@mock.patch('markov.markov.bot')
def test_handle_message(
        mock_bot, mock_update_model, mock_generate_sentence, message
):
    mock_get_me = mock.Mock()
    mock_get_me.return_value.username = 'markov_bot'
    mock_bot.get_me = mock_get_me

    markov.handle_message(message)

    assert mock_update_model.called
    assert not mock_generate_sentence.called


@mock.patch('markov.markov.generate_sentence')
@mock.patch('markov.speech.update_model')
@mock.patch('markov.markov.bot')
def test_handle_message_with_mention(
        mock_bot, mock_update_model, mock_generate_sentence, message
):
    message.text = 'hello, @markov_bot!'
    mock_get_me = mock.Mock()
    mock_get_me.return_value.username = 'markov_bot'
    mock_bot.get_me = mock_get_me

    markov.handle_message(message)

    assert mock_update_model.called
    assert mock_generate_sentence.called


@mock.patch('markov.markov.generate_sentence')
@mock.patch('markov.speech.update_model')
@mock.patch('markov.markov.bot')
def test_handle_message_raising_exception(
        mock_bot, mock_update_model, mock_generate_sentence, message
):
    mock_update_model.side_effect = ValueError('invalid param')
    message.text = ''

    markov.handle_message(message)

    assert not mock_bot.get_me.called
    assert not mock_generate_sentence.called


@mock.patch('markov.speech.new_message')
@mock.patch('markov.markov.bot')
def test_generate_sentence(mock_bot, mock_new_message, message):
    markov.generate_sentence(message)

    assert mock_new_message.called
    assert not mock_bot.reply_to.called
    assert mock_bot.send_message.called


@mock.patch('markov.speech.new_message')
@mock.patch('markov.markov.bot')
def test_generate_sentence_for_reply(mock_bot, mock_new_message, message):
    markov.generate_sentence(message, reply=True)

    assert mock_new_message.called
    assert mock_bot.reply_to.called
    assert not mock_bot.send_message.called


@mock.patch('markov.speech.delete_model')
@mock.patch('markov.markov.bot')
def test_remove_messages_no_permission(mock_bot, mock_delete_model, message):
    mock_bot.get_chat_administrators.return_value = []

    markov.remove_messages(message)

    assert mock_bot.reply_to.called_once_with(message, 'u r not an admin 🤔')
    assert not mock_delete_model.delete.called


@mock.patch('markov.markov.telebot.types.ReplyKeyboardMarkup')
@mock.patch('markov.speech.delete_model')
@mock.patch('markov.markov.bot')
def test_remove_messages_ask_confirm(
        mock_bot, mock_delete_model, mock_markup, message
):
    message.text = '/remove'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.remove_messages(message)

    assert mock_markup.add.called_once_with('yes', 'no')
    assert mock_bot.reply_to.called_once_with(
        message, 'are you sure?', reply_markup=mock_markup)
    assert mock_bot.register_next_step_handler.called
    assert not mock_delete_model.called


@mock.patch('markov.markov.telebot.types.ReplyKeyboardRemove')
@mock.patch('markov.speech.delete_model')
@mock.patch('markov.markov.bot')
def test_remove_messages_confirm(
        mock_bot, mock_delete_model, mock_markup, message
):
    message.text = 'yes'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.remove_messages(message)

    assert mock_delete_model.called
    assert mock_bot.reply_to.called_once_with(
        message, 'okay', reply_markup=mock_markup)


@mock.patch('markov.markov.telebot.types.ReplyKeyboardRemove')
@mock.patch('markov.speech.delete_model')
@mock.patch('markov.markov.bot')
def test_remove_messages_cancel(
        mock_bot, mock_delete_model, mock_markup, message
):
    message.text = 'no'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.remove_messages(message)

    assert mock_bot.reply_to.called_once_with(
        message, 'okay', reply_markup=mock_markup)
    assert not mock_delete_model.called


@mock.patch('markov.speech.flush')
@mock.patch('markov.markov.bot')
def test_flush_cache_no_permission(mock_bot, mock_flush, message):
    mock_bot.get_chat_administrators.return_value = []

    markov.flush_cache(message)

    assert mock_bot.reply_to.called_once_with(message, 'u r not an admin 🤔')
    assert not mock_flush.called


@mock.patch('markov.markov.telebot.types.ReplyKeyboardMarkup')
@mock.patch('markov.speech.flush')
@mock.patch('markov.markov.bot')
def test_flush_cache_ask_confirm(
        mock_bot, mock_flush, mock_markup, message
):
    message.text = '/flush'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.flush_cache(message)

    assert mock_markup.add.called_once_with('yes', 'no')
    assert mock_bot.reply_to.called_once_with(
        message, 'are you sure?', reply_markup=mock_markup)
    assert mock_bot.register_next_step_handler.called
    assert not mock_flush.called


@mock.patch('markov.markov.telebot.types.ReplyKeyboardRemove')
@mock.patch('markov.speech.flush')
@mock.patch('markov.markov.bot')
def test_flush_cache_confirm(mock_bot, mock_flush, mock_markup, message):
    message.text = 'yes'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.flush_cache(message)

    assert mock_flush.called
    assert mock_bot.reply_to.called_once_with(
        message, 'okay', reply_markup=mock_markup)


@mock.patch('markov.markov.telebot.types.ReplyKeyboardRemove')
@mock.patch('markov.speech.flush')
@mock.patch('markov.markov.bot')
def test_flush_cache_cancel(mock_bot, mock_flush, mock_markup, message):
    message.text = 'no'
    mock_bot.get_chat_administrators.return_value = [message]

    markov.flush_cache(message)

    assert mock_bot.reply_to.called_once_with(
        message, 'okay', reply_markup=mock_markup)
    assert not mock_flush.called


@mock.patch('markov.markov.bot')
def test_get_repo_version(mock_bot, message):
    markov.get_repo_version(message)

    assert mock_bot.reply_to.called


@mock.patch('markov.markov.bot')
@mock.patch('markov.markov.settings')
def test_notify_admin(mock_settings, mock_bot):
    mock_settings.ADMIN_CHAT_ID = 'ae'
    message = 'ae'

    markov.notify_admin(message)

    assert mock_bot.send_message.called_once_with(
        mock_settings.ADMIN_CHAT_ID,
        message
    )


@mock.patch('markov.markov.bot')
def test_help(mock_bot, message):
    markov.help(message)

    assert mock_bot.reply_to.called


@mock.patch('markov.markov.bot')
def test_start(mock_bot, message):
    markov.start(message)

    assert mock_bot.reply_to.called
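
A side note on the mocking idiom these tests rely on: stacked `@mock.patch` decorators hand mocks to the test function bottom-up, and `assert m.called_once_with(...)` is always truthy (accessing an undefined attribute on a `Mock` just creates a child mock), so the self-checking form is `m.assert_called_once_with(...)`. A minimal standalone illustration, patching stdlib `json.dumps` instead of the bot's real collaborators:

```python
import json
from unittest import mock


def render(payload):
    # Looks up json.dumps through the module at call time,
    # so an active patch on 'json.dumps' is visible here.
    return json.dumps(payload)


@mock.patch('json.dumps')  # bottom-most patch becomes the first argument
def test_render(mock_dumps):
    mock_dumps.return_value = '{}'
    assert render({'a': 1}) == '{}'
    mock_dumps.assert_called_once_with({'a': 1})  # raises on a mismatch


test_render()  # the patch is active only for the decorated call
```

After `test_render` returns, the patch is unwound and `render` uses the real `json.dumps` again.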
| 33.456853 | 77 | 0.770141 | 934 | 6,591 | 5.118844 | 0.087794 | 0.06735 | 0.125497 | 0.118594 | 0.906505 | 0.873457 | 0.849613 | 0.836018 | 0.824514 | 0.770341 | 0 | 0 | 0.120771 | 6,591 | 196 | 78 | 33.627551 | 0.824676 | 0 | 0 | 0.7125 | 1 | 0 | 0.176149 | 0.095433 | 0 | 0 | 0 | 0 | 0.225 | 1 | 0.10625 | false | 0 | 0.0125 | 0 | 0.11875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
71c6d0e4ed1d719f7644f1db333fb2c55f81043b | 175 | py | Python | about/views.py | benjaminwsebastian/gidek | e24fa3b24913cb0e21512161bb31221f4422798b | [
"MIT"
] | null | null | null | about/views.py | benjaminwsebastian/gidek | e24fa3b24913cb0e21512161bb31221f4422798b | [
"MIT"
] | 5 | 2021-03-30T14:03:51.000Z | 2021-09-22T19:29:55.000Z | about/views.py | benjaminwsebastian/gidek | e24fa3b24913cb0e21512161bb31221f4422798b | [
"MIT"
] | null | null | null | from django.shortcuts import render


def team(request):
    return render(request, 'about/our_team.html')


def plan(request):
return render(request, 'about/our_plan.html') | 25 | 49 | 0.748571 | 25 | 175 | 5.16 | 0.52 | 0.20155 | 0.294574 | 0.403101 | 0.527132 | 0.527132 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131429 | 175 | 7 | 50 | 25 | 0.848684 | 0 | 0 | 0 | 0 | 0 | 0.215909 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
e0a340d81466a1f5124c5e86b1a92205c2beb1ec | 49 | py | Python | sevenproject/blueprints/help/__init__.py | xDiaym/7project | 27912f5a8626dffa38135c418a8ef19d0b6a7af3 | [
"MIT"
] | null | null | null | sevenproject/blueprints/help/__init__.py | xDiaym/7project | 27912f5a8626dffa38135c418a8ef19d0b6a7af3 | [
"MIT"
] | null | null | null | sevenproject/blueprints/help/__init__.py | xDiaym/7project | 27912f5a8626dffa38135c418a8ef19d0b6a7af3 | [
"MIT"
] | 1 | 2021-12-03T00:25:48.000Z | 2021-12-03T00:25:48.000Z | from sevenproject.blueprints.help.help import bp

# src/trusted/gio/gio_wrapped_desc_tests.gyp (MicrohexHQ/nacl_contracts, BSD-3-Clause)
# -*- python -*-
# Copyright 2010 The Native Client Authors. All rights reserved. Use
# of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
{
  'variables': {
    'conditions': [
      ['OS=="win"', {
        'msvs_cygwin_shell': 0,
      }],
    ],
  },
  'includes': [
    '../../../build/common.gypi',
  ],
  'target_defaults': {
    'variables': {
      'target_base': 'none',
    },
    'target_conditions': [
      ['OS=="mac"', {
        'xcode_settings': {
          'GCC_ENABLE_CPP_RTTI': 'YES',  # override -fno-rtti
        },
      }],
    ],
    'cflags_cc!': ['-fno-rtti'],
  },
  'targets': [
    {
      'target_name': 'gio_shm_test',
      'type': 'executable',
      'dependencies': [
        '<(DEPTH)/native_client/src/shared/gio/gio.gyp:gio',
        '<(DEPTH)/native_client/src/shared/imc/imc.gyp:imc_c',
        '<(DEPTH)/native_client/src/shared/platform/platform.gyp:platform',
        '<(DEPTH)/native_client/src/shared/srpc/srpc.gyp:nonnacl_srpc',
        '<(DEPTH)/native_client/src/trusted/desc/desc.gyp:nrd_xfer',
        '<(DEPTH)/native_client/src/trusted/gio/gio_wrapped_desc.gyp:gio_wrapped_desc',
        '<(DEPTH)/native_client/src/trusted/nonnacl_util/nonnacl_util.gyp:nonnacl_util',
      ],
      'sources': [
        'gio_shm_test.c',
      ],
    },
    {
      'target_name': 'gio_shm_unbounded_test',
      'type': 'executable',
      'dependencies': [
        '<(DEPTH)/native_client/src/shared/gio/gio.gyp:gio',
        '<(DEPTH)/native_client/src/shared/imc/imc.gyp:imc_c',
        '<(DEPTH)/native_client/src/shared/platform/platform.gyp:platform',
        '<(DEPTH)/native_client/src/shared/srpc/srpc.gyp:nonnacl_srpc',
        '<(DEPTH)/native_client/src/trusted/desc/desc.gyp:nrd_xfer',
        '<(DEPTH)/native_client/src/trusted/gio/gio_wrapped_desc.gyp:gio_wrapped_desc',
        '<(DEPTH)/native_client/src/trusted/nonnacl_util/nonnacl_util.gyp:nonnacl_util',
      ],
      'sources': [
        'gio_shm_unbounded_test.c',
      ],
    },
    {
      'target_name': 'gio_nacl_desc_test',
      'type': 'executable',
      'dependencies': [
        '<(DEPTH)/native_client/src/shared/gio/gio.gyp:gio',
        '<(DEPTH)/native_client/src/shared/imc/imc.gyp:imc_c',
        '<(DEPTH)/native_client/src/shared/platform/platform.gyp:platform',
        '<(DEPTH)/native_client/src/shared/srpc/srpc.gyp:nonnacl_srpc',
        '<(DEPTH)/native_client/src/trusted/desc/desc.gyp:nrd_xfer',
        '<(DEPTH)/native_client/src/trusted/gio/gio_wrapped_desc.gyp:gio_wrapped_desc',
        '<(DEPTH)/native_client/src/trusted/nonnacl_util/nonnacl_util.gyp:nonnacl_util',
      ],
      'sources': [
        'gio_nacl_desc_test.c',
      ],
    },
  ],
}

# {{cookiecutter.first_app}}/managers.py (nyimbi/ultimate, BSD-3-Clause)
# coding: utf-8
from django.db.models import Manager, Q
from django.db.models.query import QuerySet

# tests/st/ops/gpu/test_layer_norm_grad_grad_op.py (mindspore, Apache-2.0)
# Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
import numpy as np
import pytest
import mindspore.context as context
import mindspore.nn as nn
from mindspore import Tensor
from mindspore.ops.operations import _grad_ops as G
context.set_context(mode=context.GRAPH_MODE, device_target="GPU")
np.random.seed(0)

class LayerNormGradGradNet(nn.Cell):
    def __init__(self, begin_norm_axis, begin_params_axis):
        super(LayerNormGradGradNet, self).__init__()
        self.norm = G.LayerNormGradGrad(begin_norm_axis, begin_params_axis)

    def construct(self, x, dy, var, mean, gamma, grad_dx, grad_dg, grad_db):
        return self.norm(x, dy, var, mean, gamma, grad_dx, grad_dg, grad_db)

def LayerNormGradReference(x, dy, gamma, epsilon, begin_norm_axis, begin_params_axis):
    begin_norm_axis = begin_norm_axis if begin_norm_axis >= 0 else begin_norm_axis + len(x.shape)
    begin_params_axis = begin_params_axis if begin_params_axis >= 0 else begin_params_axis + len(x.shape)
    norm_axis = [i for i in range(begin_norm_axis, len(x.shape))]
    param_axis = [i for i in range(0, begin_params_axis)]
    num = 1
    for i in range(begin_norm_axis, len(x.shape)):
        num *= x.shape[i]

    mean = np.mean(x, axis=tuple(norm_axis), keepdims=True)
    var = np.var(x, axis=tuple(norm_axis), keepdims=True)
    gamma = gamma.reshape((*((1,) * begin_params_axis), *x.shape[begin_params_axis:]))

    dg = np.sum(dy * np.power(var + epsilon, -0.5) * (x - mean), axis=tuple(param_axis), keepdims=True)
    db = np.sum(dy, axis=tuple(param_axis), keepdims=True)

    sum1 = np.sum((-0.5) * dy * gamma * (x - mean) * np.power(var + epsilon, -1.5), axis=tuple(norm_axis),
                  keepdims=True)
    sum2 = np.sum(dy * gamma, axis=tuple(norm_axis), keepdims=True)
    sum3 = np.sum(-2.0 * (x - mean), axis=tuple(norm_axis), keepdims=True)

    dx1 = dy * gamma * np.power(var + epsilon, -0.5)
    dx2 = sum1 * 2.0 / num * (x - mean)
    dx3 = ((-1.0) * np.power(var + epsilon, -0.5) * sum2 + (1.0 / num) * sum1 * sum3) * (1.0 / num)
    dx = dx1 + dx2 + dx3

    return dx, dg, db, mean, var

def LayerNormGradGradReference(x, dy, gamma, epsilon, grad_dx_np, grad_dg_np, grad_db_np, begin_norm_axis,
                               begin_params_axis):
    begin_norm_axis = begin_norm_axis if begin_norm_axis >= 0 else begin_norm_axis + len(x.shape)
    begin_params_axis = begin_params_axis if begin_params_axis >= 0 else begin_params_axis + len(x.shape)
    norm_axis = tuple([i for i in range(begin_norm_axis, len(x.shape))])
    param_axis = [i for i in range(0, begin_params_axis)]
    num = 1
    for i in range(begin_norm_axis, len(x.shape)):
        num *= x.shape[i]

    mean = np.mean(x, tuple(norm_axis), keepdims=True)
    var = np.mean(np.power((x - mean), 2), tuple(norm_axis), keepdims=True)
    inv_std = np.power(var + epsilon, -0.5)
    x_hat = (x - mean) * inv_std

    sum1 = np.mean(-inv_std * grad_dx_np, tuple(norm_axis), keepdims=True)
    sum2 = np.mean(-x_hat * inv_std * grad_dx_np, tuple(norm_axis), keepdims=True)
    sum3 = np.mean(dy * gamma, tuple(norm_axis), keepdims=True)
    sum4 = np.mean(dy * gamma * x_hat, tuple(norm_axis), keepdims=True)

    part_sum1 = dy * gamma - sum3 - x_hat * sum4
    part_sum2 = dy * gamma * sum2 - sum4 * grad_dx_np * inv_std + dy * grad_dg_np
    part1 = np.mean(grad_dx_np * part_sum1, tuple(norm_axis), keepdims=True)
    part2 = np.mean((x - mean) * part_sum2, tuple(norm_axis), keepdims=True)
    part3 = inv_std * part_sum2
    sum5 = np.mean(part1, tuple(norm_axis), keepdims=True)
    sum6 = np.mean(part2, tuple(norm_axis), keepdims=True)
    sum7 = np.mean(-part3, tuple(norm_axis), keepdims=True)
    part4 = -(x - mean) * np.power(var + epsilon, -1.5) * (sum5 + sum6)
    d_x = part3 + part4 + sum7

    part5 = gamma * grad_dx_np * inv_std
    part6 = gamma * sum1
    part7 = gamma * x_hat * sum2
    part8 = x_hat * grad_dg_np
    d_dy = part5 + part6 + part7 + part8 + grad_db_np

    part9 = np.sum(dy * x_hat * sum2, tuple(param_axis), keepdims=True)
    part10 = np.sum(dy * sum1, tuple(param_axis), keepdims=True)
    part11 = np.sum(dy * grad_dx_np * inv_std, tuple(param_axis), keepdims=True)
    d_gamma = part9 + part10 + part11

    return d_x, d_dy, d_gamma
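The NumPy reference functions in this file encode the layer-norm backward algebra in closed form. A minimal, self-contained sanity check of that algebra is sketched below; it is illustrative only, re-deriving the `dx` term of `LayerNormGradReference` specialised to normalisation over the last axis, with hypothetical helper names (`layer_norm`, `analytic_dx`), and comparing the analytic gradient against central finite differences of the scalar loss `L = sum(dy * layer_norm(x))`:

```python
import numpy as np


def layer_norm(x, gamma, eps):
    # forward pass, normalising over the last axis (population variance, ddof=0)
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps)


def analytic_dx(x, dy, gamma, eps):
    # same algebra as the dx term of LayerNormGradReference, last axis only
    n = x.shape[-1]
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    sum1 = np.sum(-0.5 * dy * gamma * (x - mean) * np.power(var + eps, -1.5), axis=-1, keepdims=True)
    sum2 = np.sum(dy * gamma, axis=-1, keepdims=True)
    sum3 = np.sum(-2.0 * (x - mean), axis=-1, keepdims=True)
    dx1 = dy * gamma * np.power(var + eps, -0.5)
    dx2 = sum1 * 2.0 / n * (x - mean)
    dx3 = (-np.power(var + eps, -0.5) * sum2 + sum1 * sum3 / n) / n
    return dx1 + dx2 + dx3


rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))
dy = rng.standard_normal((2, 4))
gamma = rng.standard_normal(4)
eps = 1e-12

# central finite differences of L = sum(dy * layer_norm(x)) w.r.t. each x element
num_dx = np.zeros_like(x)
h = 1e-6
for idx in np.ndindex(*x.shape):
    xp = x.copy()
    xp[idx] += h
    xm = x.copy()
    xm[idx] -= h
    num_dx[idx] = (np.sum(dy * layer_norm(xp, gamma, eps)) -
                   np.sum(dy * layer_norm(xm, gamma, eps))) / (2 * h)

assert np.allclose(analytic_dx(x, dy, gamma, eps), num_dx, rtol=1e-4, atol=1e-6)
```

A check of this shape runs in milliseconds on a tiny input, which makes it useful for catching sign or scaling mistakes in the closed-form gradient before comparing against the device kernel on large tensors.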

def check_layernormgradgrad(shape, begin_norm_axis, begin_params_axis, epsilon, dtype, rtol, atol):
    """Run LayerNormGradGrad on device for one configuration and compare against the NumPy reference."""
    x_np = np.random.randn(*shape).astype(np.float32)
    dy_np = np.random.randn(*shape).astype(np.float32)
    gamma_np = np.random.randn(*x_np.shape[begin_params_axis:]).astype(np.float32)
    grad_dx_np = np.random.randn(*x_np.shape).astype(np.float32)
    grad_dg_np = np.random.randn(*x_np.shape[begin_params_axis:]).astype(np.float32)
    grad_db_np = np.random.randn(*x_np.shape[begin_params_axis:]).astype(np.float32)

    _, _, _, mean_np, var_np = LayerNormGradReference(x_np, dy_np, gamma_np, epsilon, begin_norm_axis,
                                                      begin_params_axis)
    d_x_np, d_dy_np, d_gamma_np = LayerNormGradGradReference(x_np, dy_np, gamma_np, epsilon, grad_dx_np, grad_dg_np,
                                                             grad_db_np, begin_norm_axis, begin_params_axis)

    dy_ms = Tensor(dy_np.astype(dtype))
    x_ms = Tensor(x_np.astype(dtype))
    var_ms = Tensor(var_np.astype(dtype))
    mean_ms = Tensor(mean_np.astype(dtype))
    gamma_ms = Tensor(gamma_np.astype(dtype))
    grad_dx_ms = Tensor(grad_dx_np.astype(dtype))
    grad_dg_ms = Tensor(grad_dg_np.astype(dtype))
    grad_db_ms = Tensor(grad_db_np.astype(dtype))

    net = LayerNormGradGradNet(begin_norm_axis, begin_params_axis)
    d_x_ms, d_dy_ms, d_gamma_ms = net(x_ms, dy_ms, var_ms, mean_ms, gamma_ms, grad_dx_ms, grad_dg_ms, grad_db_ms)

    assert np.allclose(d_x_ms.asnumpy(), d_x_np, rtol=rtol, atol=atol)
    assert np.allclose(d_dy_ms.asnumpy(), d_dy_np, rtol=rtol, atol=atol)
    assert np.allclose(d_gamma_ms.asnumpy(), d_gamma_np, rtol=rtol, atol=atol)


@pytest.mark.level0
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad0():
    check_layernormgradgrad((4096, 3072), 1, 1, 1e-12, np.float32, 3e-3, 3e-3)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad1():
    check_layernormgradgrad((640, 768), 1, 1, 1e-12, np.float32, 3e-3, 3e-3)


@pytest.mark.level0
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad2():
    check_layernormgradgrad((32, 128, 768), -1, -1, 1e-12, np.float32, 3e-3, 3e-3)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad3():
    check_layernormgradgrad((32, 64), -1, -1, 1e-12, np.float32, 3e-3, 3e-3)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad4():
    check_layernormgradgrad((32, 64), -1, -1, 1e-12, np.float32, 3e-3, 3e-3)


@pytest.mark.level0
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad5():
    check_layernormgradgrad((128, 2, 16, 32), 2, 1, 1e-12, np.float32, 3e-3, 3e-3)


def test_layernormgradgrad6():
    check_layernormgradgrad((4096, 3072), 1, 1, 1e-7, np.float16, 5e-3, 5e-1)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad7():
    check_layernormgradgrad((640, 768), 1, 1, 1e-7, np.float16, 5e-3, 5e-1)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad8():
    check_layernormgradgrad((32, 128, 768), -1, -1, 1e-7, np.float16, 5e-3, 5e-1)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad9():
    check_layernormgradgrad((32, 64), -1, -1, 1e-7, np.float16, 5e-3, 5e-1)


@pytest.mark.level1
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad10():
    check_layernormgradgrad((32, 64), -1, -1, 1e-7, np.float16, 5e-3, 5e-1)


@pytest.mark.level0
@pytest.mark.platform_x86_gpu_training
@pytest.mark.env_onecard
def test_layernormgradgrad11():
    check_layernormgradgrad((128, 2, 16, 32), 2, 1, 1e-7, np.float16, 5e-3, 5e-1)

# nif_tools/__init__.py (luftsport/nif-tools, MIT)
from nif_tools.mi import *
from nif_tools.ka import *
from nif_tools.sa import *
from nif_tools.passbuy import *

# cellrank/tl/_mixins/__init__.py (WeilerP/cellrank, BSD-3-Clause)
from cellrank.tl._mixins._io import IOMixin
from cellrank.tl._mixins._kernel import KernelMixin
from cellrank.tl._mixins._anndata import AnnDataMixin

# qf_lib_tests/integration_tests/data_providers/bloomberg_beap_hapi/test_bloomberg_beap_hapi.py (webclinic017/qf-lib, Apache-2.0)
import unittest
from pandas import to_datetime
from qf_lib.common.enums.expiration_date_field import ExpirationDateField
from qf_lib.common.enums.price_field import PriceField
from qf_lib.common.tickers.tickers import BloombergTicker
from qf_lib.common.utils.dateutils.string_to_date import str_to_date
from qf_lib.common.utils.dateutils.timer import SettableTimer
from qf_lib.containers.dataframe.prices_dataframe import PricesDataFrame
from qf_lib.containers.dataframe.qf_dataframe import QFDataFrame
from qf_lib.containers.futures.future_tickers.bloomberg_future_ticker import BloombergFutureTicker
from qf_lib.containers.qf_data_array import QFDataArray
from qf_lib.containers.series.prices_series import PricesSeries
from qf_lib.containers.series.qf_series import QFSeries
from qf_lib.data_providers.bloomberg.exceptions import BloombergError
from qf_lib.data_providers.bloomberg_beap_hapi.bloomberg_beap_hapi_data_provider import BloombergBeapHapiDataProvider
from qf_lib_tests.unit_tests.config.test_settings import get_test_settings
from strategies.cta_strategy.cta_tickers_universe import DAYS_BEFORE_EXP_DATE
from datetime import datetime
class TestBloombergBeapHapi(unittest.TestCase):
START_DATE = str_to_date('2021-01-01')
END_DATE = str_to_date('2021-07-01')
SINGLE_FIELD = 'PX_LAST'
MANY_FIELDS = ['PX_LAST', 'PX_OPEN', 'PX_HIGH']
EXPIRATION_DATES = ExpirationDateField.all_dates()
INVALID_INDEX = BloombergTicker('RTYADSM1 Index')
INVALID_INDEXES = [BloombergTicker('ASDVCXZASD Index'), BloombergTicker('ASDBVCX ComSADFdty')]
INVALID_FUTURE_INDEX = BloombergFutureTicker('E-mini Russell 2000 Inndex Futures', 'RTASY{} Index', 1, DAYS_BEFORE_EXP_DATE, 50, "HMUZ")
INVALID_FUTURE_INDEXES = [BloombergFutureTicker('E-mini Russella 2000 Index Futures', 'RTASY{} Index', 1, DAYS_BEFORE_EXP_DATE, 50, "HMUZ"),
BloombergFutureTicker("Cotton", "CT{} Comddty", 1, 3)]
SINGLE_INDEX = BloombergTicker('RTYM1 Index')
MANY_INDEXES = [BloombergTicker('RTYM1 Index'), BloombergTicker('CTA Comdty')]
SINGLE_FUTURE_INDEX = BloombergFutureTicker('E-mini Russell 2000 Index Futures', 'RTY{} Index', 1, DAYS_BEFORE_EXP_DATE, 50, "HMUZ")
MANY_FUTURE_INDEXES = [BloombergFutureTicker('E-mini Russell 2000 Index Futures', 'RTY{} Index', 1, DAYS_BEFORE_EXP_DATE, 50, "HMUZ"),
BloombergFutureTicker("Cotton", "CT{} Comdty", 1, 3)]
SINGLE_PRICE_FIELD = PriceField.Close
MANY_PRICE_FIELDS = [PriceField.Close, PriceField.Open, PriceField.High]
def setUp(self):
settings = get_test_settings()
        # If there are no credentials or no access to HAPI, the integration
        # tests are skipped.
        try:
            self.bbg_beap_hapi = BloombergBeapHapiDataProvider(settings)
        except Exception as e:
            self.skipTest(e)
# =========================== Tests ==========================================================
def test_get_history_single_ticker_single_field(self):
data = self.bbg_beap_hapi.get_history(self.SINGLE_INDEX, self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(type(data), QFSeries)
self.assertEqual(self.SINGLE_INDEX.as_string(), data.name)
self.assertTrue(data.index.min() >= self.START_DATE and data.index.max() <= self.END_DATE)
def test_get_history_single_ticker_many_fields(self):
data = self.bbg_beap_hapi.get_history(self.SINGLE_INDEX, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(type(data), QFDataFrame)
self.assertEqual(len(self.MANY_FIELDS), data.shape[1])
self.assertTrue(data.index.min() >= self.START_DATE and data.index.max() <= self.END_DATE)
def test_get_history_many_tickers_many_fields(self):
data = self.bbg_beap_hapi.get_history(self.MANY_INDEXES, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(type(data), QFDataArray)
self.assertEqual(len(self.MANY_INDEXES), len(data.tickers))
self.assertEqual(len(self.MANY_FIELDS), len(data.fields))
self.assertTrue(to_datetime(data.dates.values.min()) >= self.START_DATE and to_datetime(data.dates.values.max()) <= self.END_DATE)
def test_get_futures_chain_tickers_single_ticker_expiration_dates(self):
data = self.bbg_beap_hapi.get_futures_chain_tickers(self.SINGLE_FUTURE_INDEX, self.EXPIRATION_DATES)
        self.assertEqual(type(data), dict)
        self.assertEqual(len(data), 1)
        self.assertEqual(type(data[list(data.keys())[0]]), QFDataFrame)
self.assertTrue(list(data.keys()) == [self.SINGLE_FUTURE_INDEX])
def test_get_futures_chain_tickers_many_tickers_expiration_dates(self):
data = self.bbg_beap_hapi.get_futures_chain_tickers(self.MANY_FUTURE_INDEXES, self.EXPIRATION_DATES)
self.assertEqual(type(data), dict)
self.assertEqual(len(data), 2)
        self.assertEqual(type(data[list(data.keys())[0]]), QFDataFrame)
self.assertTrue(list(data.keys()) == self.MANY_FUTURE_INDEXES)
self.assertCountEqual(data[list(data.keys())[0]].columns.values, self.EXPIRATION_DATES)
def test_get_history_invalid_ticker_single_field(self):
data = self.bbg_beap_hapi.get_history(self.INVALID_INDEX, self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(QFSeries, type(data))
self.assertTrue(data.empty)
self.assertEqual(self.INVALID_INDEX.as_string(), data.name)
def test_get_history_invalid_ticker_many_fields(self):
data = self.bbg_beap_hapi.get_history(self.INVALID_INDEX, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataFrame, type(data))
self.assertTrue(data.empty)
self.assertCountEqual(self.MANY_FIELDS, data.columns.values)
def test_get_history_invalid_many_tickers_single_field(self):
data = self.bbg_beap_hapi.get_history(self.INVALID_INDEXES, self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataFrame, type(data))
self.assertTrue(data.empty)
self.assertCountEqual(self.INVALID_INDEXES, data.columns.values)
def test_get_history_invalid_many_tickers_many_fields(self):
data = self.bbg_beap_hapi.get_history(self.INVALID_INDEXES, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataArray, type(data))
self.assertCountEqual(self.MANY_FIELDS, data.fields.values)
self.assertCountEqual(self.INVALID_INDEXES, data.tickers.values)
def test_get_history_invalid_and_correct_ticker_single_field(self):
data = self.bbg_beap_hapi.get_history([self.SINGLE_INDEX] + [self.INVALID_INDEX], self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataFrame, type(data))
self.assertFalse(data.empty)
self.assertCountEqual([self.SINGLE_INDEX] + [self.INVALID_INDEX], data.columns.values)
self.assertTrue(data.index.min() >= self.START_DATE and data.index.max() <= self.END_DATE)
self.assertTrue(data[self.INVALID_INDEX].isnull().all())
def test_get_history_invalid_and_correct_ticker_many_fields(self):
data = self.bbg_beap_hapi.get_history([self.SINGLE_INDEX] + [self.INVALID_INDEX], self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataArray, type(data))
self.assertCountEqual(self.MANY_FIELDS, data.fields.values)
self.assertCountEqual([self.SINGLE_INDEX] + [self.INVALID_INDEX], data.tickers.values)
self.assertTrue(to_datetime(data.dates.values.min()) >= self.START_DATE and to_datetime(data.dates.values.max()) <= self.END_DATE)
def test_get_history_invalid_and_correct_tickers_single_fields(self):
data = self.bbg_beap_hapi.get_history(self.MANY_INDEXES + self.INVALID_INDEXES, self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataFrame, type(data))
self.assertFalse(data.empty)
self.assertCountEqual(self.MANY_INDEXES + self.INVALID_INDEXES, data.columns.values)
self.assertTrue(data.index.min() >= self.START_DATE and data.index.max() <= self.END_DATE)
def test_get_history_invalid_and_correct_tickers_many_fields(self):
data = self.bbg_beap_hapi.get_history(self.MANY_INDEXES + self.INVALID_INDEXES, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataArray, type(data))
self.assertCountEqual(self.MANY_INDEXES + self.INVALID_INDEXES, data.tickers.values)
self.assertCountEqual(self.MANY_FIELDS, data.fields.values)
self.assertTrue(to_datetime(data.dates.values.min()) >= self.START_DATE and to_datetime(data.dates.values.max()) <= self.END_DATE)
def test_get_futures_chain_tickers_invalid_ticker(self):
self.assertRaises(BloombergError, self.bbg_beap_hapi.get_futures_chain_tickers, self.INVALID_FUTURE_INDEX, self.EXPIRATION_DATES)
def test_get_futures_chain_tickers_invalid_tickers(self):
self.assertRaises(BloombergError, self.bbg_beap_hapi.get_futures_chain_tickers, self.INVALID_FUTURE_INDEXES, self.EXPIRATION_DATES)
def test_get_futures_chain_tickers_correct_and_invalid_ticker(self):
data = self.bbg_beap_hapi.get_futures_chain_tickers([self.SINGLE_FUTURE_INDEX] + [self.INVALID_FUTURE_INDEX], self.EXPIRATION_DATES)
self.assertEqual(dict, type(data))
self.assertEqual(len(data.keys()), 2)
self.assertEqual(type(data[list(data.keys())[0]]), QFDataFrame)
self.assertCountEqual(data[list(data.keys())[0]].columns.values, self.EXPIRATION_DATES)
def test_get_futures_chain_tickers_correct_and_invalid_tickers(self):
data = self.bbg_beap_hapi.get_futures_chain_tickers(self.MANY_FUTURE_INDEXES + self.INVALID_FUTURE_INDEXES, self.EXPIRATION_DATES)
self.assertEqual(dict, type(data))
self.assertEqual(len(data.keys()), len(self.MANY_FUTURE_INDEXES + self.INVALID_FUTURE_INDEXES))
self.assertEqual(type(data[list(data.keys())[0]]), QFDataFrame)
self.assertCountEqual(data[list(data.keys())[0]].columns.values, self.EXPIRATION_DATES)
def test_get_price_single_ticker_single_price_field(self):
data = self.bbg_beap_hapi.get_price(self.SINGLE_INDEX, self.SINGLE_PRICE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(PricesSeries, type(data))
self.assertEqual(self.SINGLE_INDEX.as_string(), data.name)
self.assertTrue(to_datetime(data.index.values.min()) >= self.START_DATE and to_datetime(data.index.values.max()) <= self.END_DATE)
def test_get_price_single_ticker_many_price_fields(self):
data = self.bbg_beap_hapi.get_price(self.SINGLE_INDEX, self.MANY_PRICE_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(PricesDataFrame, type(data))
self.assertCountEqual(self.MANY_PRICE_FIELDS, data.columns.values)
self.assertTrue(to_datetime(data.index.values.min()) >= self.START_DATE and to_datetime(data.index.values.max()) <= self.END_DATE)
def test_get_price_invalid_and_correct_ticker_single_fields(self):
data = self.bbg_beap_hapi.get_price([self.SINGLE_INDEX]+[self.INVALID_INDEX], self.SINGLE_PRICE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(PricesDataFrame, type(data))
self.assertCountEqual([self.SINGLE_INDEX]+[self.INVALID_INDEX], data.columns.values)
self.assertTrue(to_datetime(data.index.values.min()) >= self.START_DATE and to_datetime(data.index.values.max()) <= self.END_DATE)
def test_get_price_invalid_and_correct_tickers_many_fields(self):
data = self.bbg_beap_hapi.get_price(self.MANY_INDEXES+self.INVALID_INDEXES, self.MANY_PRICE_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataArray, type(data))
self.assertCountEqual(self.MANY_INDEXES + self.INVALID_INDEXES, data.tickers.values)
self.assertCountEqual(self.MANY_PRICE_FIELDS, data.fields.values)
self.assertTrue(to_datetime(data.dates.values.min()) >= self.START_DATE and to_datetime(data.dates.values.max()) <= self.END_DATE)
def test_get_history_correct_single_future_ticker_single_field(self):
self.SINGLE_FUTURE_INDEX.initialize_data_provider(SettableTimer(datetime.now()), self.bbg_beap_hapi)
data = self.bbg_beap_hapi.get_history(self.SINGLE_FUTURE_INDEX, self.SINGLE_FIELD, self.START_DATE, self.END_DATE)
self.assertEqual(QFSeries, type(data))
self.assertTrue(data.index.min() >= self.START_DATE and data.index.max() <= self.END_DATE)
def test_get_history_correct_single_future_ticker_many_tickers_many_fields(self):
self.SINGLE_FUTURE_INDEX.initialize_data_provider(SettableTimer(datetime.now()), self.bbg_beap_hapi)
data = self.bbg_beap_hapi.get_history([self.SINGLE_FUTURE_INDEX] + self.MANY_INDEXES, self.MANY_FIELDS, self.START_DATE, self.END_DATE)
self.assertEqual(QFDataArray, type(data))
self.assertCountEqual(self.MANY_FIELDS, data.fields.values)
self.assertTrue(to_datetime(data.dates.values.min()) >= self.START_DATE and to_datetime(data.dates.values.max()) <= self.END_DATE)
if __name__ == '__main__':
unittest.main()
| 58.101322 | 144 | 0.753431 | 1,785 | 13,189 | 5.256583 | 0.083473 | 0.032399 | 0.041565 | 0.041565 | 0.809443 | 0.775019 | 0.7342 | 0.716935 | 0.705318 | 0.685602 | 0 | 0.005423 | 0.133141 | 13,189 | 226 | 145 | 58.358407 | 0.815272 | 0.012814 | 0 | 0.313253 | 0 | 0 | 0.028348 | 0 | 0 | 0 | 0 | 0 | 0.457831 | 1 | 0.144578 | false | 0 | 0.108434 | 0 | 0.349398 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1c44ce797a5c460258f52bac245631d4c3a93ac6 | 258 | py | Python | tensorflow_tts/configs/__init__.py | ashishpatel26/TensorflowTTS | bd29c3eefa51041b76fd355d94025b4c13084296 | [
"Apache-2.0"
] | 2 | 2020-06-01T07:39:25.000Z | 2021-11-08T09:31:33.000Z | tensorflow_tts/configs/__init__.py | ashishpatel26/TensorflowTTS | bd29c3eefa51041b76fd355d94025b4c13084296 | [
"Apache-2.0"
] | null | null | null | tensorflow_tts/configs/__init__.py | ashishpatel26/TensorflowTTS | bd29c3eefa51041b76fd355d94025b4c13084296 | [
"Apache-2.0"
] | 1 | 2020-10-05T06:06:20.000Z | 2020-10-05T06:06:20.000Z | from tensorflow_tts.configs.fastspeech import FastSpeechConfig
from tensorflow_tts.configs.melgan import MelGANGeneratorConfig
from tensorflow_tts.configs.melgan import MelGANDiscriminatorConfig
from tensorflow_tts.configs.tacotron2 import Tacotron2Config
| 36.857143 | 67 | 0.899225 | 28 | 258 | 8.142857 | 0.428571 | 0.245614 | 0.298246 | 0.421053 | 0.315789 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0.069767 | 258 | 6 | 68 | 43 | 0.941667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
1c5e76ae74c1d06b076a427c06774efe224fb843 | 19,247 | py | Python | eisy/data/simulation/circuits.py | dacb/eisy | bb948590b03124ddb90d112c07ecf60908735daf | [
"MIT"
] | null | null | null | eisy/data/simulation/circuits.py | dacb/eisy | bb948590b03124ddb90d112c07ecf60908735daf | [
"MIT"
] | null | null | null | eisy/data/simulation/circuits.py | dacb/eisy | bb948590b03124ddb90d112c07ecf60908735daf | [
"MIT"
] | null | null | null | import numpy as np
def freq_gen(high_freq, low_freq, decades=7):
'''
    Function that generates the log-spaced frequency range used to
    investigate the impedance response of an electrical circuit.
Parameters
----------
high_freq : single value (int or float)
initial frequency value (high frequency domain) [Hz]
    low_freq : single value (int or float)
        final frequency value (low frequency domain) [Hz]
decades : integer
number of frequency decades to be used as range. Default value
is set to be 7 [-]
Returns
----------
[0] = frequency range [Hz]
[1] = Angular frequency range [1/s]
'''
f_decades = np.log10(high_freq) - np.log10(low_freq)
    # np.logspace expects an integer number of samples
    f_range = np.logspace(np.log10(high_freq), np.log10(low_freq),
                          int(np.around(decades * f_decades)), endpoint=True)
w_range = 2 * np.pi * f_range
return f_range, w_range
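As a quick sanity check on `freq_gen`, the sketch below (a standalone restatement for illustration, not part of the module) generates a 7-decade sweep from 100 kHz down to 10 mHz and confirms both endpoints are included:

```python
import numpy as np

# Standalone restatement of freq_gen; np.logspace requires an integer
# number of samples, hence the int(...) cast.
def freq_gen(high_freq, low_freq, decades=7):
    f_decades = np.log10(high_freq) - np.log10(low_freq)
    f_range = np.logspace(np.log10(high_freq), np.log10(low_freq),
                          int(np.around(decades * f_decades)), endpoint=True)
    return f_range, 2 * np.pi * f_range

f, w = freq_gen(1e5, 1e-2)   # 100 kHz down to 10 mHz spans 7 decades
print(len(f))                # 49 log-spaced points (7 per decade)
print(f[0], f[-1])           # endpoints 1e5 and 1e-2 are both included
```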
def cir_RC_parallel(angular_freq, resistance='none', capacitance='none',
peak_frequency='none'):
'''
Function that simulates the impedance response of a resistor and a
capacitor in a parallel configuration.
String representation for this circuit: -(RC)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
resistance : single value (int or float)
Solution resistance [ohm]
capacitance : single value (int or float)
Electrode capacitance [F]
peak_frequency : single value (int or float)
Peak frequency of RC circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [ohm]
'''
circuit = '-(RC)-'
if resistance == 'none':
resistance = (1/(capacitance*(2*np.pi*peak_frequency)))
elif capacitance == 'none':
capacitance = (1/(resistance*(2*np.pi*peak_frequency)))
# compute the impedance response as a complex array
Z_complex = (resistance/(1+resistance*capacitance*(angular_freq*1j)))
return Z_complex
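For a parallel RC element the Nyquist plot is a semicircle: Z approaches R at low frequency, and at the characteristic angular frequency w = 1/(RC) both Re(Z) and -Im(Z) equal R/2. A small standalone check (component values are illustrative):

```python
import numpy as np

def cir_RC_parallel(angular_freq, resistance, capacitance):
    # Z = R / (1 + j*w*R*C), the same expression used by the simulator
    return resistance / (1 + resistance * capacitance * (angular_freq * 1j))

R, C = 100.0, 1e-6               # 100 ohm in parallel with 1 uF
w_peak = 1.0 / (R * C)           # characteristic angular frequency, 1e4 rad/s
Z = cir_RC_parallel(np.array([1e-2, w_peak]), R, C)
print(Z[0].real)                 # ~100: low-frequency limit is R
print(Z[1].real, -Z[1].imag)     # 50.0 50.0: apex of the semicircle
```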
def cir_RC_series(angular_freq, resistance='none', capacitance='none',
peak_frequency='none'):
'''
Function that simulates the impedance response of a resistor and a
capacitor in a series configuration.
This circuit configuration is used to simulate the response of an ideally
polarizable electrode, also known as a blocking electrode.
String representation for this circuit: -R-C-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
resistance : single value (int or float)
Solution resistance [ohm]
capacitance : single value (int or float)
Capacitance of an electrode surface [F]
peak_frequency : single value (int or float)
Peak frequency of RC circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [ohm]
'''
circuit = '-R-C-'
    if all(value == 'none' for value in
           (resistance, capacitance, peak_frequency)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in (resistance, capacitance)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
elif resistance == 'none':
resistance = (1/(capacitance*(2*np.pi*peak_frequency)))
elif capacitance == 'none':
capacitance = (1/(resistance*(2*np.pi*peak_frequency)))
# compute the impedance response as a complex array
Z_complex = resistance + 1/(capacitance*(angular_freq*1j))
return Z_complex
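A note on the `'none'` sentinel guards in these simulators: a comparison of the form `(resistance, capacitance) == 'none'` matches a tuple against a string and is therefore always False, so such a guard can never fire; each element has to be checked individually. A two-line illustration:

```python
# A tuple never compares equal to a string, so a guard written as
# (a, b) == 'none' silently passes; check each element instead.
values = ('none', 'none')
print(values == 'none')                    # False, regardless of contents
print(all(v == 'none' for v in values))    # True: the intended check
```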
def cir_RQ_parallel(angular_freq, resistance='none',
constant_phase_element='none', alpha='none',
peak_frequency='none'):
'''
Function that simulates the impedance response of a resistor and a
constant phase element in a parallel configuration.
String representation for this circuit: -(RQ)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
resistance : single value (int or float)
Solution resistance [Ohm]
constant_phase_element : single value (int or float)
Constant phase angle [s^(alpha-1)/ohm]
alpha : single value -float
Exponent of the constant phase element. Should be a value between
0 and 1 [-]
peak_frequency : single value (int or float)
Peak frequency of RC circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [Ohm]
'''
circuit = '-(RQ)-'
    if all(value == 'none' for value in
           (resistance, constant_phase_element, alpha, peak_frequency)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in
             (resistance, constant_phase_element, alpha)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
elif resistance == 'none':
resistance = (1/(constant_phase_element*(2*np.pi*peak_frequency
) ** alpha))
elif constant_phase_element == 'none':
constant_phase_element = (1/(resistance*(2*np.pi*peak_frequency
) ** alpha))
elif alpha == 'none':
alpha = np.log(constant_phase_element *
resistance)/np.log(1/(2*np.pi * peak_frequency))
Z_complex = (resistance/(1+resistance*constant_phase_element*(
angular_freq*1j)**alpha))
return Z_complex
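A constant phase element with alpha = 1 degenerates to an ideal capacitor, so `cir_RQ_parallel` should then reproduce the plain -(RC)- response. A standalone consistency sketch (values are illustrative):

```python
import numpy as np

def cir_RQ_parallel(angular_freq, resistance, constant_phase_element, alpha):
    # Z = R / (1 + R*Q*(j*w)^alpha), as in the simulator above
    return resistance / (1 + resistance * constant_phase_element
                         * (angular_freq * 1j) ** alpha)

w = np.logspace(-2, 5, 40)
R, Q = 100.0, 1e-6
z_cpe = cir_RQ_parallel(w, R, Q, alpha=1.0)  # CPE with alpha = 1
z_rc = R / (1 + R * Q * (w * 1j))            # ideal -(RC)- for comparison
print(np.allclose(z_cpe, z_rc))              # True
```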
def cir_RQ_series(angular_freq, resistance='none',
constant_phase_element='none', alpha='none',
peak_frequency='none'):
'''
Function that simulates the impedance response of a resistor and a
constant phase element in a series configuration.
This circuit configuration is used to simulate the response of a
blocking electrode with distribution of reactivity.
String representation for this circuit: -R-Q-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
resistance : single value (int or float)
Solution resistance [ohm]
constant_phase_element : single value (int or float)
Constant phase angle [s^(alpha-1)/ohm]
alpha : single value -float
Exponent of the constant phase element. Should be a value between
0 and 1 [-]
peak_frequency : single value (int or float)
Peak frequency of RC circuit [Hz]
Returns
---------
Z_complex : array-like
        impedance response of the circuit under investigation [Ohm]
'''
circuit = '-R-Q-'
    if all(value == 'none' for value in
           (resistance, constant_phase_element, alpha, peak_frequency)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in
             (resistance, constant_phase_element, alpha)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
elif resistance == 'none':
resistance = (1/(constant_phase_element*(2*np.pi*peak_frequency) **
alpha))
elif constant_phase_element == 'none':
constant_phase_element = (1/(resistance*(2*np.pi*peak_frequency) **
alpha))
elif alpha == 'none':
alpha = np.log(constant_phase_element *
resistance)/np.log(1/(2*np.pi * peak_frequency))
# compute the impedance response as a complex array
Z_complex = resistance + 1/(constant_phase_element*(
angular_freq*1j)**alpha)
return Z_complex
def cir_RsRC(angular_freq, solution_resistance,
parallel_resistance='none', capacitance='none',
peak_frequency='none'):
    '''
Function that simulates the impedance response of a solution resistor in
series with a resistor in parallel with a capacitor.
This circuit configuration is used to simulate the response of an ideally
polarizable electrode, also known as a blocking electrode.
String representation for this circuit: -Rs-(RC)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
solution_resistance : single value (int or float)
Solution resistance [ohm]
parallel_resistance : single value (int or float)
resistance of the element in parallel with
the capacitor [ohm]
capacitance : single value (int or float)
Capacitance of an electrode surface [F]
peak_frequency : single value (int or float)
Peak frequency of the parallel RC circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [Ohm]
'''
circuit = '-Rs-(RC)-'
# compute the impedance response as a complex array
    if all(value == 'none' for value in
           (parallel_resistance, capacitance, peak_frequency)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in (parallel_resistance, capacitance)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
elif parallel_resistance == 'none':
parallel_resistance = (1/(capacitance*(2*np.pi*peak_frequency)))
elif capacitance == 'none':
capacitance = (1/(parallel_resistance*(2*np.pi*peak_frequency)))
Z_parallel = (parallel_resistance/(1 + parallel_resistance *
capacitance * (angular_freq*1j)))
Z_complex = solution_resistance + Z_parallel
return Z_complex
def cir_RsRQRQ(angular_freq, solution_resistance='none',
parallel_resistance_1='none', constant_phase_element_1='none',
alpha_1='none', parallel_resistance_2='none',
constant_phase_element_2='none', alpha_2='none',
peak_frequency_1='none', peak_frequency_2='none'):
'''
Function that simulates the impedance response of a solution resistor in
series with two sets of a resistor in parallel with a constant phase
elements.
String representation for this circuit: -Rs-(RQ)-(RQ)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
solution_resistance : single value (int or float)
Solution resistance [ohm]
parallel_resistance_1 : single value (int or float)
first combination of resistor in parallel with
constant phase element [ohm]
constant_phase_element_1 : single value (int or float)
                                First constant phase angle [s^(alpha-1)/ohm]
alpha_1 : single value -float
Exponent of the first constant phase element.
Should be a value between 0 and 1 [-]
parallel_resistance_2 : single value (int or float)
Second combination of resistor in parallel with
constant phase element [ohm]
constant_phase_element_2 : single value (int or float)
                                Second constant phase angle [s^(alpha-1)/ohm]
alpha_2 : single value -float
Exponent of the second constant phase element.
Should be a value between 0 and 1 [-]
peak_frequency_1 : single value (int or float)
Peak frequency of the first parallel RQ circuit [Hz]
peak_frequency_2 : single value (int or float)
Peak frequency of the second parallel RQ circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [Ohm]
'''
circuit = '-Rs-(RQ)-(RQ)-'
    if all(value == 'none' for value in
           (parallel_resistance_1, constant_phase_element_1, peak_frequency_1,
            parallel_resistance_2, constant_phase_element_2,
            peak_frequency_2)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in
             (parallel_resistance_1, constant_phase_element_1,
              parallel_resistance_2, constant_phase_element_2)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
if parallel_resistance_1 == 'none':
parallel_resistance_1 = (1/(constant_phase_element_1 *
(2*np.pi*peak_frequency_1)**alpha_1))
elif constant_phase_element_1 == 'none':
constant_phase_element_1 = (1/(parallel_resistance_1 *
(2*np.pi*peak_frequency_1)**alpha_1))
if parallel_resistance_2 == 'none':
parallel_resistance_2 = (1/(constant_phase_element_2 *
(2*np.pi*peak_frequency_2)**alpha_2))
elif constant_phase_element_2 == 'none':
constant_phase_element_2 = (1/(parallel_resistance_2 *
(2*np.pi*peak_frequency_2)**alpha_2))
Z_parallel_1 = (parallel_resistance_1 /
(1+parallel_resistance_1*constant_phase_element_1
* (angular_freq*1j)**alpha_1))
Z_parallel_2 = (parallel_resistance_2 /
(1+parallel_resistance_2*constant_phase_element_2
* (angular_freq*1j)**alpha_2))
Z_complex = solution_resistance + Z_parallel_1 + Z_parallel_2
return Z_complex
def cir_RsRCRC(angular_freq, solution_resistance,
parallel_resistance_1='none', capacitance_1='none',
parallel_resistance_2='none', capacitance_2='none',
peak_frequency_1='none', peak_frequency_2='none'):
'''
Function that simulates the impedance response of a solution resistor in
series with two sets of a resistor in parallel with a capacitor.
String representation for this circuit: -Rs-(RC)-(RC)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
solution_resistance : single value (int or float)
Solution resistance [ohm]
parallel_resistance_1 : single value (int or float)
first combination of resistor in parallel with
capacitor [ohm]
capacitance_1 : single value (int or float)
        Capacitance of an electrode surface which is part of the
first combination of RC in parallel [F]
parallel_resistance_2 : single value (int or float)
second combination of resistor in parallel with
capacitor [ohm]
capacitance_2 : single value (int or float)
        Capacitance of an electrode surface which is part of the
second combination of RC in parallel [F]
peak_frequency_1 : single value (int or float)
Peak frequency of the first parallel RC circuit [Hz]
peak_frequency_2 : single value (int or float)
Peak frequency of the second parallel RC circuit [Hz]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [Ohm]
'''
circuit = '-Rs-(RC)-(RC)-'
    if all(value == 'none' for value in
           (parallel_resistance_1, capacitance_1, peak_frequency_1,
            parallel_resistance_2, capacitance_2, peak_frequency_2)):
        raise AssertionError('No circuit element value was provided. Cannot '
                             'compute the impedance response')
    elif all(value == 'none' for value in
             (parallel_resistance_1, capacitance_1,
              parallel_resistance_2, capacitance_2)):
        raise AssertionError('Not enough circuit element values were '
                             'provided. Cannot compute the impedance response')
if parallel_resistance_1 == 'none':
parallel_resistance_1 = (1/(capacitance_1*(2*np.pi *
peak_frequency_1)))
elif capacitance_1 == 'none':
capacitance_1 = (1/(parallel_resistance_1*(2*np.pi *
peak_frequency_1)))
if parallel_resistance_2 == 'none':
parallel_resistance_2 = (1/(capacitance_2*(2*np.pi *
peak_frequency_2)))
elif capacitance_2 == 'none':
capacitance_2 = (1/(parallel_resistance_2*(2*np.pi *
peak_frequency_2)))
Z_parallel_1 = (parallel_resistance_1/(1 + parallel_resistance_1 *
capacitance_1*(angular_freq*1j)))
Z_parallel_2 = (parallel_resistance_2/(1 + parallel_resistance_2 *
capacitance_2*(angular_freq*1j)))
Z_complex = solution_resistance + Z_parallel_1 + Z_parallel_2
return Z_complex
def cir_Randles_simplified(angular_freq, solution_resistance,
parallel_resistance, alpha='none', sigma='none',
Q='none', fs='none'):
'''
Return the impedance of a Randles circuit with a simplified Warburg element
    This form of the Randles circuit is only meant to simulate
    semi-infinite linear diffusion
String representation for this circuit: -Rs-(Q-(RW)-)-
Parameters
----------
angular_freq : array-like
Angular frequency [1/s]
solution_resistance : single value (int or float)
Solution resistance [ohm]
parallel_resistance : single value (int or float)
resistance of the element in parallel with
the capacitor [ohm]
capacitance : single value (int or float)
Capacitance of an electrode surface [F]
[[Need to add new parameters!!!!]]
Returns
---------
Z_complex : array-like
impedance response of the circuit under investigation [Ohm]
'''
circuit = '-Rs-(Q-(RW)-)-'
if parallel_resistance == 'none':
parallel_resistance = (1/(Q*(2*np.pi*fs)**alpha))
elif sigma == 'none':
sigma = (1/(parallel_resistance*(2*np.pi*fs)**alpha))
elif alpha == 'none':
alpha = np.log(Q*parallel_resistance)/np.log(1/(2*np.pi*fs))
Z_Q = 1/(Q*(angular_freq*1j)**alpha)
Z_R = parallel_resistance
Z_w = sigma*(angular_freq**(-0.5))-1j*sigma*(angular_freq**(-0.5))
return solution_resistance + 1/(1/Z_Q + 1/(Z_R+Z_w))
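The Warburg term `Z_w` used above has equal real and imaginary magnitudes at every frequency, which is what produces the characteristic 45-degree diffusion tail on a Nyquist plot. A quick standalone check (the value of sigma is illustrative):

```python
import numpy as np

sigma = 5.0                                        # illustrative Warburg coefficient
w = np.logspace(-2, 2, 25)
Z_w = sigma * w ** -0.5 - 1j * sigma * w ** -0.5   # semi-infinite Warburg element
print(np.allclose(Z_w.real, -Z_w.imag))            # True: |Re| == |Im| everywhere
print(np.degrees(np.angle(Z_w[0])))                # ~ -45: constant phase angle
```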
| 44.043478 | 79 | 0.608718 | 2,206 | 19,247 | 5.139166 | 0.070716 | 0.063068 | 0.074094 | 0.049396 | 0.890712 | 0.862838 | 0.801535 | 0.765194 | 0.741819 | 0.707595 | 0 | 0.015999 | 0.305035 | 19,247 | 436 | 80 | 44.144495 | 0.831564 | 0.449577 | 0 | 0.468208 | 1 | 0 | 0.036819 | 0 | 0 | 0 | 0 | 0 | 0.069364 | 1 | 0.052023 | false | 0 | 0.00578 | 0 | 0.109827 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
98c5b5964a7eee2e00be140e93234a2c95ca95e6 | 8,876 | py | Python | tests/auth_tests/test_migrations.py | beshrkayali/django | 84633905273fc916e3d17883810d9969c03f73c2 | [
"PSF-2.0",
"BSD-3-Clause"
] | 7 | 2020-01-13T18:26:41.000Z | 2021-04-20T04:22:26.000Z | tests/auth_tests/test_migrations.py | beshrkayali/django | 84633905273fc916e3d17883810d9969c03f73c2 | [
"PSF-2.0",
"BSD-3-Clause"
] | 10 | 2016-05-19T21:54:42.000Z | 2019-08-09T15:59:50.000Z | tests/auth_tests/test_migrations.py | beshrkayali/django | 84633905273fc916e3d17883810d9969c03f73c2 | [
"PSF-2.0",
"BSD-3-Clause"
] | 11 | 2019-09-14T20:57:30.000Z | 2022-01-19T17:59:26.000Z | from importlib import import_module
from django.apps import apps
from django.contrib.auth.models import Permission, User
from django.contrib.contenttypes.models import ContentType
from django.test import TestCase
from django.test.utils import captured_stdout
from .models import Proxy, UserProxy
update_proxy_permissions = import_module('django.contrib.auth.migrations.0011_update_proxy_permissions')
class ProxyModelWithDifferentAppLabelTests(TestCase):
available_apps = [
'auth_tests',
'django.contrib.auth',
'django.contrib.contenttypes',
]
def setUp(self):
"""
Create proxy permissions with content_type to the concrete model
rather than the proxy model (as they were before Django 2.2 and
migration 11).
"""
Permission.objects.all().delete()
self.concrete_content_type = ContentType.objects.get_for_model(UserProxy)
self.default_permission = Permission.objects.create(
content_type=self.concrete_content_type,
codename='add_userproxy',
name='Can add userproxy',
)
self.custom_permission = Permission.objects.create(
content_type=self.concrete_content_type,
codename='use_different_app_label',
name='May use a different app label',
)
def test_proxy_model_permissions_contenttype(self):
proxy_model_content_type = ContentType.objects.get_for_model(UserProxy, for_concrete_model=False)
self.assertEqual(self.default_permission.content_type, self.concrete_content_type)
self.assertEqual(self.custom_permission.content_type, self.concrete_content_type)
update_proxy_permissions.update_proxy_model_permissions(apps, None)
self.default_permission.refresh_from_db()
self.assertEqual(self.default_permission.content_type, proxy_model_content_type)
self.custom_permission.refresh_from_db()
self.assertEqual(self.custom_permission.content_type, proxy_model_content_type)
def test_user_has_now_proxy_model_permissions(self):
user = User.objects.create()
user.user_permissions.add(self.default_permission)
user.user_permissions.add(self.custom_permission)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth.' + permission.codename))
self.assertFalse(user.has_perm('auth_tests.' + permission.codename))
update_proxy_permissions.update_proxy_model_permissions(apps, None)
# Reload user to purge the _perm_cache.
user = User._default_manager.get(pk=user.pk)
for permission in [self.default_permission, self.custom_permission]:
self.assertFalse(user.has_perm('auth.' + permission.codename))
self.assertTrue(user.has_perm('auth_tests.' + permission.codename))
def test_migrate_backwards(self):
update_proxy_permissions.update_proxy_model_permissions(apps, None)
update_proxy_permissions.revert_proxy_model_permissions(apps, None)
self.default_permission.refresh_from_db()
self.assertEqual(self.default_permission.content_type, self.concrete_content_type)
self.custom_permission.refresh_from_db()
self.assertEqual(self.custom_permission.content_type, self.concrete_content_type)
def test_user_keeps_same_permissions_after_migrating_backward(self):
user = User.objects.create()
user.user_permissions.add(self.default_permission)
user.user_permissions.add(self.custom_permission)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth.' + permission.codename))
self.assertFalse(user.has_perm('auth_tests.' + permission.codename))
update_proxy_permissions.update_proxy_model_permissions(apps, None)
update_proxy_permissions.revert_proxy_model_permissions(apps, None)
# Reload user to purge the _perm_cache.
user = User._default_manager.get(pk=user.pk)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth.' + permission.codename))
self.assertFalse(user.has_perm('auth_tests.' + permission.codename))
class ProxyModelWithSameAppLabelTests(TestCase):
available_apps = [
'auth_tests',
'django.contrib.auth',
'django.contrib.contenttypes',
]
def setUp(self):
"""
Create proxy permissions with content_type to the concrete model
rather than the proxy model (as they were before Django 2.2 and
migration 11).
"""
Permission.objects.all().delete()
self.concrete_content_type = ContentType.objects.get_for_model(Proxy)
self.default_permission = Permission.objects.create(
content_type=self.concrete_content_type,
codename='add_proxy',
name='Can add proxy',
)
self.custom_permission = Permission.objects.create(
content_type=self.concrete_content_type,
codename='display_proxys',
name='May display proxys information',
)
def test_proxy_model_permissions_contenttype(self):
proxy_model_content_type = ContentType.objects.get_for_model(Proxy, for_concrete_model=False)
self.assertEqual(self.default_permission.content_type, self.concrete_content_type)
self.assertEqual(self.custom_permission.content_type, self.concrete_content_type)
update_proxy_permissions.update_proxy_model_permissions(apps, None)
self.default_permission.refresh_from_db()
self.custom_permission.refresh_from_db()
self.assertEqual(self.default_permission.content_type, proxy_model_content_type)
self.assertEqual(self.custom_permission.content_type, proxy_model_content_type)
def test_user_still_has_proxy_model_permissions(self):
user = User.objects.create()
user.user_permissions.add(self.default_permission)
user.user_permissions.add(self.custom_permission)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth_tests.' + permission.codename))
update_proxy_permissions.update_proxy_model_permissions(apps, None)
# Reload user to purge the _perm_cache.
user = User._default_manager.get(pk=user.pk)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth_tests.' + permission.codename))
def test_migrate_backwards(self):
update_proxy_permissions.update_proxy_model_permissions(apps, None)
update_proxy_permissions.revert_proxy_model_permissions(apps, None)
self.default_permission.refresh_from_db()
self.assertEqual(self.default_permission.content_type, self.concrete_content_type)
self.custom_permission.refresh_from_db()
self.assertEqual(self.custom_permission.content_type, self.concrete_content_type)
def test_user_keeps_same_permissions_after_migrating_backward(self):
user = User.objects.create()
user.user_permissions.add(self.default_permission)
user.user_permissions.add(self.custom_permission)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth_tests.' + permission.codename))
update_proxy_permissions.update_proxy_model_permissions(apps, None)
update_proxy_permissions.revert_proxy_model_permissions(apps, None)
# Reload user to purge the _perm_cache.
user = User._default_manager.get(pk=user.pk)
for permission in [self.default_permission, self.custom_permission]:
self.assertTrue(user.has_perm('auth_tests.' + permission.codename))
def test_migrate_with_existing_target_permission(self):
"""
Permissions may already exist:
- Old workaround was to manually create permissions for proxy models.
- Model may have been concrete and then converted to proxy.
Output a reminder to audit relevant permissions.
"""
proxy_model_content_type = ContentType.objects.get_for_model(Proxy, for_concrete_model=False)
Permission.objects.create(
content_type=proxy_model_content_type,
codename='add_proxy',
name='Can add proxy',
)
Permission.objects.create(
content_type=proxy_model_content_type,
codename='display_proxys',
name='May display proxys information',
)
with captured_stdout() as stdout:
update_proxy_permissions.update_proxy_model_permissions(apps, None)
self.assertIn('A problem arose migrating proxy model permissions', stdout.getvalue())
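The behaviour these tests exercise — repointing permission rows from the concrete model's content type to the proxy model's — can be sketched outside the ORM. The dicts and the function below are hypothetical stand-ins, not Django's actual migration code:

```python
def update_proxy_model_permissions(permissions, concrete_ct, proxy_ct):
    """Repoint permissions from the concrete content type to the proxy one."""
    for perm in permissions:
        if perm["content_type"] == concrete_ct:
            perm["content_type"] = proxy_ct
    return permissions

perms = [{"codename": "add_proxy", "content_type": "concrete"}]
update_proxy_model_permissions(perms, "concrete", "proxy")
# each matching row now carries the proxy content type, which is what
# test_proxy_model_permissions_contenttype asserts after the migration
```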
| 49.311111 | 105 | 0.725665 | 1,044 | 8,876 | 5.855364 | 0.110153 | 0.077376 | 0.082447 | 0.052675 | 0.869295 | 0.869295 | 0.867496 | 0.862097 | 0.858335 | 0.850319 | 0 | 0.001676 | 0.193218 | 8,876 | 179 | 106 | 49.586592 | 0.851976 | 0.073457 | 0 | 0.776978 | 0 | 0 | 0.067078 | 0.016924 | 0 | 0 | 0 | 0 | 0.179856 | 1 | 0.079137 | false | 0 | 0.057554 | 0 | 0.165468 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
98cac2e55103be35b74d19bc85f7d593a3874545 | 155 | py | Python | tests/test_gryffin.py | Atinary-technologies/gryffin | 9770ffa049076fc0b82619c6f0d3fc32437aaea4 | [
"Apache-2.0"
] | 1 | 2021-05-11T21:37:05.000Z | 2021-05-11T21:37:05.000Z | tests/test_gryffin.py | Atinary-technologies/gryffin | 9770ffa049076fc0b82619c6f0d3fc32437aaea4 | [
"Apache-2.0"
] | 1 | 2022-03-10T23:16:30.000Z | 2022-03-14T17:29:15.000Z | tests/test_gryffin.py | Atinary-technologies/gryffin | 9770ffa049076fc0b82619c6f0d3fc32437aaea4 | [
"Apache-2.0"
] | 1 | 2022-03-10T21:43:03.000Z | 2022-03-10T21:43:03.000Z | #!/usr/bin/env python
def test_import():
from gryffin import Gryffin
#==============================================================================
| 22.142857 | 79 | 0.341935 | 11 | 155 | 4.727273 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 155 | 6 | 80 | 25.833333 | 0.371429 | 0.632258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c7062a70c1732a4282ff4bc46ee34da21b80689b | 27,114 | py | Python | classification/PACS/resnet_SNR.py | microsoft/SNR | f3d51b5e3525fe5e1ea364fafdf0e4cc60b1362b | [
"MIT"
] | 44 | 2021-01-12T03:05:12.000Z | 2022-03-28T09:05:47.000Z | classification/PACS/resnet_SNR.py | microsoft/SNR | f3d51b5e3525fe5e1ea364fafdf0e4cc60b1362b | [
"MIT"
] | 7 | 2021-03-16T07:13:16.000Z | 2022-02-02T14:52:55.000Z | classification/PACS/resnet_SNR.py | microsoft/SNR | f3d51b5e3525fe5e1ea364fafdf0e4cc60b1362b | [
"MIT"
] | 8 | 2021-01-15T15:09:25.000Z | 2021-12-21T13:34:51.000Z | # Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
import torch.nn as nn
import torch.utils.model_zoo as model_zoo
import torch.nn.functional as F
import torch
__all__ = ['ResNet_SNR', 'resnet18_snr', 'resnet18_snr_nas', 'resnet18_snr_causality', 'resnet18_in']
model_urls = {
'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',
}
def conv3x3(in_planes, out_planes, stride=1):
"""3x3 convolution with padding"""
return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
padding=1, bias=False)
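As a sanity check on the `padding=1` choice in `conv3x3`, the standard convolution output-size formula shows that a 3×3 kernel with padding 1 preserves spatial size at stride 1 (a plain-Python sketch, independent of PyTorch):

```python
def conv_out_size(n, kernel=3, stride=1, padding=1):
    """Output spatial size of a convolution over an n-by-n input."""
    return (n + 2 * padding - kernel) // stride + 1

assert conv_out_size(56) == 56            # stride 1: size preserved
assert conv_out_size(56, stride=2) == 28  # stride 2: size halved
```

The same formula explains the stem convolutions below: a 7×7 kernel with stride 2 and padding 3 maps a 224-pixel side to 112.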
class ChannelGate_sub(nn.Module):
"""A mini-network that generates channel-wise gates conditioned on input tensor."""
def __init__(self, in_channels, num_gates=None, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False):
super(ChannelGate_sub, self).__init__()
if num_gates is None:
num_gates = in_channels
self.return_gates = return_gates
self.global_avgpool = nn.AdaptiveAvgPool2d(1)
self.fc1 = nn.Conv2d(in_channels, in_channels//reduction, kernel_size=1, bias=True, padding=0)
self.norm1 = None
if layer_norm:
self.norm1 = nn.LayerNorm((in_channels//reduction, 1, 1))
self.relu = nn.ReLU(inplace=True)
self.fc2 = nn.Conv2d(in_channels//reduction, num_gates, kernel_size=1, bias=True, padding=0)
if gate_activation == 'sigmoid':
self.gate_activation = nn.Sigmoid()
elif gate_activation == 'relu':
self.gate_activation = nn.ReLU(inplace=True)
elif gate_activation == 'linear':
self.gate_activation = None
else:
raise RuntimeError("Unknown gate activation: {}".format(gate_activation))
def forward(self, x):
input = x
x = self.global_avgpool(x)
x = self.fc1(x)
if self.norm1 is not None:
x = self.norm1(x)
x = self.relu(x)
x = self.fc2(x)
if self.gate_activation is not None:
x = self.gate_activation(x)
if self.return_gates:
return x
return input * x, input * (1 - x), x
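Numerically, the gate's final line splits the input into two complementary parts that always sum back to the input, since `x*g + x*(1-g) = x`. A minimal per-channel sketch of that split (plain Python with a hypothetical `gate_split` helper; the real module learns its gate logits):

```python
import math

def gate_split(channel_vals, logits):
    """Split each channel value into gated and complementary residual parts."""
    gates = [1.0 / (1.0 + math.exp(-w)) for w in logits]  # sigmoid activation
    useful = [v * g for v, g in zip(channel_vals, gates)]
    useless = [v * (1.0 - g) for v, g in zip(channel_vals, gates)]
    return useful, useless, gates

useful, useless, gates = gate_split([2.0, -1.0], [0.0, 10.0])
# a zero logit gives a 0.5 gate, so the first channel splits into equal halves
```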
class BasicBlock(nn.Module):
expansion = 1
def __init__(self, inplanes, planes, stride=1, downsample=None):
super(BasicBlock, self).__init__()
self.conv1 = conv3x3(inplanes, planes, stride)
self.bn1 = nn.BatchNorm2d(planes)
self.relu = nn.ReLU(inplace=True)
self.conv2 = conv3x3(planes, planes)
self.bn2 = nn.BatchNorm2d(planes)
self.downsample = downsample
self.stride = stride
def forward(self, x):
residual = x
out = self.conv1(x)
out = self.bn1(out)
out = self.relu(out)
out = self.conv2(out)
out = self.bn2(out)
if self.downsample is not None:
residual = self.downsample(x)
out += residual
out = self.relu(out)
return out
class Bottleneck(nn.Module):
expansion = 4
def __init__(self, inplanes, planes, stride=1, downsample=None):
super(Bottleneck, self).__init__()
self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
self.bn1 = nn.BatchNorm2d(planes)
self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
padding=1, bias=False)
self.bn2 = nn.BatchNorm2d(planes)
self.conv3 = nn.Conv2d(planes, planes * self.expansion, kernel_size=1, bias=False)
self.bn3 = nn.BatchNorm2d(planes * self.expansion)
self.relu = nn.ReLU(inplace=True)
self.downsample = downsample
self.stride = stride
def forward(self, x):
residual = x
out = self.conv1(x)
out = self.bn1(out)
out = self.relu(out)
out = self.conv2(out)
out = self.bn2(out)
out = self.relu(out)
out = self.conv3(out)
out = self.bn3(out)
if self.downsample is not None:
residual = self.downsample(x)
out += residual
out = self.relu(out)
return out
class UpBlock(nn.Module):
def __init__(self, inplanes, planes, upsample=False):
super(UpBlock, self).__init__()
self.conv = nn.Conv2d(inplanes, planes, 1, 1)
self.bn = nn.BatchNorm2d(planes)
self.relu = nn.ReLU(inplace=True)
self.will_ups = upsample
def forward(self, x):
if self.will_ups:
x = nn.functional.interpolate(x, scale_factor=2, mode="bilinear", align_corners=True)
x = self.conv(x)
x = self.bn(x)
x = self.relu(x)
return x
class Conv1x1nonLinear(nn.Module):
"""1x1 convolution + bn (w/o non-linearity)."""
def __init__(self, in_channels, out_channels, stride=1):
super(Conv1x1nonLinear, self).__init__()
self.conv = nn.Conv2d(in_channels, out_channels, 1, stride=stride, padding=0, bias=False)
self.bn = nn.BatchNorm2d(out_channels)
self.relu = nn.ReLU(inplace=True)
def forward(self, x):
x = self.conv(x)
x = self.relu(self.bn(x))
return x
class ResNet_SNR(nn.Module):
def __init__(self, block, layers, num_classes=1000):
self.inplanes = 64
super(ResNet_SNR, self).__init__()
self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.relu = nn.ReLU(inplace=True)
self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
self.layer1 = self._make_layer(block, 64, layers[0])
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
self.avgpool = nn.AvgPool2d(7, stride=1)
self.fc = nn.Linear(512 * block.expansion, num_classes)
# IN bridge:
self.IN1 = nn.InstanceNorm2d(64, affine=True)
self.IN2 = nn.InstanceNorm2d(128, affine=True)
self.IN3 = nn.InstanceNorm2d(256, affine=True)
self.IN4 = nn.InstanceNorm2d(512, affine=True)
# SE for selection:
self.style_reid_laye1 = ChannelGate_sub(64, num_gates=64, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye2 = ChannelGate_sub(128, num_gates=128, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye3 = ChannelGate_sub(256, num_gates=256, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye4 = ChannelGate_sub(512, num_gates=512, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
if m.bias is not None:
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm1d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.InstanceNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.Linear):
nn.init.normal_(m.weight, 0, 0.01)
if m.bias is not None:
nn.init.constant_(m.bias, 0)
def bn_eval(self):
for m in self.modules():
if isinstance(m, nn.BatchNorm2d):
m.eval()
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride, downsample))
self.inplanes = planes * block.expansion
for i in range(1, blocks):
layers.append(block(self.inplanes, planes))
return nn.Sequential(*layers)
def forward(self, x):
end_points = {}
x = self.conv1(x)
x = self.bn1(x)
x = self.relu(x)
x = self.maxpool(x)
x_1 = self.layer1(x) # torch.Size([64, 256, 64, 32])
x_1_ori = x_1
x_IN_1 = self.IN1(x_1)
x_style_1 = x_1 - x_IN_1
x_style_1_reid_useful, x_style_1_reid_useless, selective_weight_useful_1 = self.style_reid_laye1(x_style_1)
x_1 = x_IN_1 + x_style_1_reid_useful
x_1_useless = x_IN_1 + x_style_1_reid_useless
x_2 = self.layer2(x_1) # torch.Size([64, 512, 32, 16])
x_2_ori = x_2
x_IN_2 = self.IN2(x_2)
x_style_2 = x_2 - x_IN_2
x_style_2_reid_useful, x_style_2_reid_useless, selective_weight_useful_2 = self.style_reid_laye2(x_style_2)
x_2 = x_IN_2 + x_style_2_reid_useful
x_2_useless = x_IN_2 + x_style_2_reid_useless
x_3 = self.layer3(x_2) # torch.Size([64, 1024, 16, 8])
x_3_ori = x_3
x_IN_3 = self.IN3(x_3)
x_style_3 = x_3 - x_IN_3
x_style_3_reid_useful, x_style_3_reid_useless, selective_weight_useful_3 = self.style_reid_laye3(x_style_3)
x_3 = x_IN_3 + x_style_3_reid_useful
x_3_useless = x_IN_3 + x_style_3_reid_useless
x_4 = self.layer4(x_3) # torch.Size([64, 2048, 16, 8])
x_4 = self.avgpool(x_4)
x_4 = x_4.view(x_4.size(0), -1)
end_points['Feature'] = x_4
x_4 = self.fc(x_4)
end_points['Predictions'] = F.softmax(input=x_4, dim=-1)
return x_4, end_points
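The style split in `forward` relies on instance normalization removing per-channel statistics, so `x = IN(x) + (x - IN(x))` reconstructs the feature exactly. A one-channel plain-Python sketch of that identity (simplified, without the affine parameters of `nn.InstanceNorm2d`):

```python
import math

def instance_norm(x, eps=1e-5):
    """Normalize one channel's values to zero mean and (near) unit variance."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]

feat = [1.0, 2.0, 3.0, 4.0]
content = instance_norm(feat)                    # x_IN: normalized "content"
style = [f - c for f, c in zip(feat, content)]   # x_style: removed statistics
# content + style recovers feat element-wise, mirroring x_1 = x_IN_1 + x_style_1
```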
class ResNet_SNR_NAS(nn.Module):
def __init__(self, block, layers, num_classes=1000):
self.inplanes = 64
super(ResNet_SNR_NAS, self).__init__()
self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.relu = nn.ReLU(inplace=True)
self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
self.layer1 = self._make_layer(block, 64, layers[0])
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
self.avgpool = nn.AvgPool2d(7, stride=1)
self.fc = nn.Linear(512 * block.expansion, num_classes)
# IN bridge:
self.IN1 = nn.InstanceNorm2d(64, affine=True)
self.IN2 = nn.InstanceNorm2d(128, affine=True)
self.IN3 = nn.InstanceNorm2d(256, affine=True)
self.IN4 = nn.InstanceNorm2d(512, affine=True)
# SE for selection:
self.style_reid_layer1 = ChannelGate_sub(64, num_gates=64, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_layer2 = ChannelGate_sub(128, num_gates=128, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_layer3 = ChannelGate_sub(256, num_gates=256, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_layer4 = ChannelGate_sub(512, num_gates=512, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
# For NAS: note that calling .cuda() on an nn.Parameter returns a plain
# (non-leaf) tensor that is NOT registered with the module, so the gammas
# would be invisible to model.parameters() and never trained. Keep them as
# Parameters here and let model.cuda()/model.to(device) move them instead.
self.gamma_layer1 = torch.nn.Parameter(torch.ones(1) * 0.5)
self.gamma_layer2 = torch.nn.Parameter(torch.ones(1) * 0.5)
self.gamma_layer3 = torch.nn.Parameter(torch.ones(1) * 0.5)
self.gamma_layer4 = torch.nn.Parameter(torch.ones(1) * 0.5)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
if m.bias is not None:
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm1d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.InstanceNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.Linear):
nn.init.normal_(m.weight, 0, 0.01)
if m.bias is not None:
nn.init.constant_(m.bias, 0)
def bn_eval(self):
for m in self.modules():
if isinstance(m, nn.BatchNorm2d):
m.eval()
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride, downsample))
self.inplanes = planes * block.expansion
for i in range(1, blocks):
layers.append(block(self.inplanes, planes))
return nn.Sequential(*layers)
def forward(self, x):
end_points = {}
x = self.conv1(x)
x = self.bn1(x)
x = self.relu(x)
x = self.maxpool(x)
x_1 = self.layer1(x) # torch.Size([64, 256, 64, 32])
x_1_ori = x_1
x_IN_1 = self.IN1(x_1)
x_style_1 = x_1 - x_IN_1
x_style_1_reid_useful, x_style_1_reid_useless, selective_weight_useful_1 = self.style_reid_layer1(x_style_1)
x_1 = x_IN_1 + torch.sigmoid(self.gamma_layer1) * x_style_1_reid_useful  # torch.sigmoid: F.sigmoid is deprecated
x_1_useless = x_IN_1 + x_style_1_reid_useless
x_2 = self.layer2(x_1) # torch.Size([64, 512, 32, 16])
x_2_ori = x_2
x_IN_2 = self.IN2(x_2)
x_style_2 = x_2 - x_IN_2
x_style_2_reid_useful, x_style_2_reid_useless, selective_weight_useful_2 = self.style_reid_layer2(x_style_2)
x_2 = x_IN_2 + torch.sigmoid(self.gamma_layer2) * x_style_2_reid_useful
x_2_useless = x_IN_2 + x_style_2_reid_useless
x_3 = self.layer3(x_2) # torch.Size([64, 1024, 16, 8])
x_3_ori = x_3
x_IN_3 = self.IN3(x_3)
x_style_3 = x_3 - x_IN_3
x_style_3_reid_useful, x_style_3_reid_useless, selective_weight_useful_3 = self.style_reid_layer3(x_style_3)
x_3 = x_IN_3 + torch.sigmoid(self.gamma_layer3) * x_style_3_reid_useful
x_3_useless = x_IN_3 + x_style_3_reid_useless
x_4 = self.layer4(x_3) # torch.Size([64, 2048, 16, 8])
x_4 = self.avgpool(x_4)
x_4 = x_4.view(x_4.size(0), -1)
end_points['Feature'] = x_4
x_4 = self.fc(x_4)
end_points['Predictions'] = F.softmax(input=x_4, dim=-1)
return x_4, end_points
class ResNet_SNR_Causality(nn.Module):
def __init__(self, block, layers, num_classes=1000):
self.inplanes = 64
super(ResNet_SNR_Causality, self).__init__()
self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.relu = nn.ReLU(inplace=True)
self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
self.layer1 = self._make_layer(block, 64, layers[0])
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
self.avgpool = nn.AvgPool2d(7, stride=1)
self.fc = nn.Linear(512 * block.expansion, num_classes)
# FC layers for stage-1-2-3:
self.global_avgpool = nn.AdaptiveAvgPool2d(1)
self.fc1 = nn.Linear(64, num_classes)
self.fc2 = nn.Linear(128, num_classes)
self.fc3 = nn.Linear(256, num_classes)
# IN bridge:
self.IN1 = nn.InstanceNorm2d(64, affine=True)
self.IN2 = nn.InstanceNorm2d(128, affine=True)
self.IN3 = nn.InstanceNorm2d(256, affine=True)
self.IN4 = nn.InstanceNorm2d(512, affine=True)
# SE for selection:
self.style_reid_laye1 = ChannelGate_sub(64, num_gates=64, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye2 = ChannelGate_sub(128, num_gates=128, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye3 = ChannelGate_sub(256, num_gates=256, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
self.style_reid_laye4 = ChannelGate_sub(512, num_gates=512, return_gates=False,
gate_activation='sigmoid', reduction=16, layer_norm=False)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
if m.bias is not None:
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm1d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.InstanceNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.Linear):
nn.init.normal_(m.weight, 0, 0.01)
if m.bias is not None:
nn.init.constant_(m.bias, 0)
def bn_eval(self):
for m in self.modules():
if isinstance(m, nn.BatchNorm2d):
m.eval()
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride, downsample))
self.inplanes = planes * block.expansion
for i in range(1, blocks):
layers.append(block(self.inplanes, planes))
return nn.Sequential(*layers)
def forward(self, x):
end_points = {}
x = self.conv1(x)
x = self.bn1(x)
x = self.relu(x)
x = self.maxpool(x)
x_1 = self.layer1(x) # torch.Size([64, 256, 64, 32])
x_1_ori = x_1
x_IN_1 = self.IN1(x_1)
x_style_1 = x_1 - x_IN_1
x_style_1_reid_useful, x_style_1_reid_useless, selective_weight_useful_1 = self.style_reid_laye1(x_style_1)
x_1 = x_IN_1 + x_style_1_reid_useful
x_1_useless = x_IN_1 + x_style_1_reid_useless
x_2 = self.layer2(x_1) # torch.Size([64, 512, 32, 16])
x_2_ori = x_2
x_IN_2 = self.IN2(x_2)
x_style_2 = x_2 - x_IN_2
x_style_2_reid_useful, x_style_2_reid_useless, selective_weight_useful_2 = self.style_reid_laye2(x_style_2)
x_2 = x_IN_2 + x_style_2_reid_useful
x_2_useless = x_IN_2 + x_style_2_reid_useless
x_3 = self.layer3(x_2) # torch.Size([64, 1024, 16, 8])
x_3_ori = x_3
x_IN_3 = self.IN3(x_3)
x_style_3 = x_3 - x_IN_3
x_style_3_reid_useful, x_style_3_reid_useless, selective_weight_useful_3 = self.style_reid_laye3(x_style_3)
x_3 = x_IN_3 + x_style_3_reid_useful
x_3_useless = x_IN_3 + x_style_3_reid_useless
x_4 = self.layer4(x_3) # torch.Size([64, 2048, 16, 8])
x_4_featmap = x_4
x_4 = self.avgpool(x_4)
x_4 = x_4.view(x_4.size(0), -1)
end_points['Feature'] = x_4
x_4 = self.fc(x_4)
end_points['Predictions'] = F.softmax(input=x_4, dim=-1)
if self.training:
return x_4, end_points, \
    F.softmax(self.fc1(self.global_avgpool(x_IN_1).view(x_IN_1.size(0), -1)), dim=1), \
    F.softmax(self.fc1(self.global_avgpool(x_1).view(x_1.size(0), -1)), dim=1), \
    F.softmax(self.fc1(self.global_avgpool(x_1_useless).view(x_1_useless.size(0), -1)), dim=1), \
    F.softmax(self.fc2(self.global_avgpool(x_IN_2).view(x_IN_2.size(0), -1)), dim=1), \
    F.softmax(self.fc2(self.global_avgpool(x_2).view(x_2.size(0), -1)), dim=1), \
    F.softmax(self.fc2(self.global_avgpool(x_2_useless).view(x_2_useless.size(0), -1)), dim=1), \
    F.softmax(self.fc3(self.global_avgpool(x_IN_3).view(x_IN_3.size(0), -1)), dim=1), \
    F.softmax(self.fc3(self.global_avgpool(x_3).view(x_3.size(0), -1)), dim=1), \
    F.softmax(self.fc3(self.global_avgpool(x_3_useless).view(x_3_useless.size(0), -1)), dim=1), \
    self.fc3(self.global_avgpool(x_IN_3).view(x_IN_3.size(0), -1)), \
    self.fc3(self.global_avgpool(x_3).view(x_3.size(0), -1)), \
    self.fc3(self.global_avgpool(x_3_useless).view(x_3_useless.size(0), -1))
else:
return x_4, end_points, \
x_4_featmap, \
x_IN_1, x_1, x_1_useless, \
x_IN_2, x_2, x_2_useless, \
x_IN_3, x_3, x_3_useless
class ResNet_IN(nn.Module):
def __init__(self, block, layers, num_classes=1000):
self.inplanes = 64
super(ResNet_IN, self).__init__()
self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.relu = nn.ReLU(inplace=True)
self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
self.layer1 = self._make_layer(block, 64, layers[0])
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
self.avgpool = nn.AvgPool2d(7, stride=1)
self.fc = nn.Linear(512 * block.expansion, num_classes)
# IN bridge:
self.IN1 = nn.InstanceNorm2d(64, affine=True)
self.IN2 = nn.InstanceNorm2d(128, affine=True)
self.IN3 = nn.InstanceNorm2d(256, affine=True)
self.IN4 = nn.InstanceNorm2d(512, affine=True)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
if m.bias is not None:
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.BatchNorm1d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.InstanceNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.Linear):
nn.init.normal_(m.weight, 0, 0.01)
if m.bias is not None:
nn.init.constant_(m.bias, 0)
def bn_eval(self):
for m in self.modules():
if isinstance(m, nn.BatchNorm2d):
m.eval()
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride, downsample))
self.inplanes = planes * block.expansion
for i in range(1, blocks):
layers.append(block(self.inplanes, planes))
return nn.Sequential(*layers)
def forward(self, x):
end_points = {}
x = self.conv1(x)
x = self.bn1(x)
x = self.relu(x)
x = self.maxpool(x)
x_1 = self.layer1(x) # torch.Size([64, 256, 64, 32])
x_1 = self.IN1(x_1)
x_2 = self.layer2(x_1) # torch.Size([64, 512, 32, 16])
x_2 = self.IN2(x_2)
x_3 = self.layer3(x_2) # torch.Size([64, 1024, 16, 8])
x_3 = self.IN3(x_3)
x_4 = self.layer4(x_3) # torch.Size([64, 2048, 16, 8])
x_4 = self.avgpool(x_4)
x_4 = x_4.view(x_4.size(0), -1)
end_points['Feature'] = x_4
x_4 = self.fc(x_4)
end_points['Predictions'] = F.softmax(input=x_4, dim=-1)
if self.training:
return x_4, end_points
else:
return x_4, end_points, x_3
def resnet18_snr(pretrained=False, **kwargs):
"""Constructs a ResNet-18 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet_SNR(BasicBlock, [2, 2, 2, 2], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet18']), strict=False)  # strict=False: the checkpoint lacks the IN bridges and channel gates
return model
def resnet18_snr_nas(pretrained=False, **kwargs):
"""Constructs a ResNet-18 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet_SNR_NAS(BasicBlock, [2, 2, 2, 2], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet18']), strict=False)  # strict=False: SNR-NAS adds modules absent from the ImageNet checkpoint
return model
def resnet18_snr_causality(pretrained=False, **kwargs):
"""Constructs a ResNet-18 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet_SNR_Causality(BasicBlock, [2, 2, 2, 2], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet18']), strict=False)  # strict=False: causality variant adds modules absent from the checkpoint
return model
def resnet18_in(pretrained=False, **kwargs):
"""Constructs a ResNet-18 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet_IN(BasicBlock, [2, 2, 2, 2], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet18']), strict=False)  # strict=False: the IN layers have no counterpart in the checkpoint
return model | 38.514205 | 116 | 0.599469 | 3,863 | 27,114 | 3.968936 | 0.059021 | 0.021132 | 0.02922 | 0.031307 | 0.860749 | 0.851096 | 0.832051 | 0.81881 | 0.814114 | 0.806027 | 0 | 0.058929 | 0.280888 | 27,114 | 704 | 117 | 38.514205 | 0.727408 | 0.04754 | 0 | 0.765799 | 0 | 0 | 0.016607 | 0.000856 | 0 | 0 | 0 | 0 | 0 | 1 | 0.057621 | false | 0 | 0.007435 | 0 | 0.124535 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c7272697bdfb1e871a06d71f13f522503e89ee9b | 8,451 | py | Python | bot.py | myaliasism/vore-tracker | 4fac291e368f1cad6aa40139ac9954f89046c445 | [
"MIT"
] | null | null | null | bot.py | myaliasism/vore-tracker | 4fac291e368f1cad6aa40139ac9954f89046c445 | [
"MIT"
] | null | null | null | bot.py | myaliasism/vore-tracker | 4fac291e368f1cad6aa40139ac9954f89046c445 | [
"MIT"
] | null | null | null | import discord
import asyncio
import re
import datetime
import math
# A pattern to match the single word "vore" (or "vores"), tolerating markdown
# decoration characters (* _ ~ ` - .) between letters and common accented or
# lookalike substitutions for the vowels. Note: inside a character class the
# characters are alternatives already, so no '|' separators are needed (they
# would otherwise match literal pipes).
pattern = re.compile(r'\b[*_~`.\-]*[Vv][*_~`.\-]*[OÒÓÔÕÖoòóôõöᴑо][*_~`.\-]*[Rr][*_~`.\-]*[EÈÉÊËЕeèéêëе][*_~`.\-]*[Ss]?\b')
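The pattern's structure — optional markdown-decoration characters between each letter — can be exercised in isolation. The sketch below uses a simplified local copy of the pattern (ASCII letters only, no accented lookalikes) so it stands alone:

```python
import re

word = re.compile(r'\b[*_~`.\-]*[Vv][*_~`.\-]*[Oo][*_~`.\-]*[Rr][*_~`.\-]*[Ee][*_~`.\-]*[Ss]?\b')

assert word.search('v*o*r*e')               # decorated spelling still matches
assert word.search('VORES')                 # plural, any case
assert word.search('velociraptor') is None  # no match inside other words
```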
serverAndDate = {}
botStartup = datetime.datetime.now()
lastMention = {}
awake = {}
client = discord.Client()
def readTimesFromFile():
global serverAndDate
with open("timeStamps.txt", "r") as target:
for line in target:
tmp = line.split(',')
tmp[1] = tmp[1][0:-1]
serverAndDate[tmp[0]] = datetime.datetime.strptime(tmp[1], "%Y-%m-%d %H:%M:%S")
if len(tmp) >= 3:
    # bool('False') is True (any non-empty string is truthy), so compare
    # against the literal text written by writeTimesToFile().
    awake[tmp[0]] = tmp[2].strip() == 'True'
else:
    awake[tmp[0]] = True
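The `len(tmp) >= 3` branch above deserializes a boolean that `writeTimesToFile` stored via `str.format`. Note that `bool(text)` cannot round-trip it — any non-empty string is truthy — so the stored text must be compared explicitly (a stand-alone illustration of the pitfall):

```python
def parse_flag(text):
    """Round-trip the output of str(bool): compare the text, don't call bool()."""
    return text.strip() == 'True'

assert bool('False') is True         # the trap: non-empty strings are truthy
assert parse_flag('False') is False  # correct round-trip
assert parse_flag('True\n') is True  # tolerates the trailing newline
```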
def writeTimesToFile():
with open('timeStamps.txt', 'w') as target:
for serverId in serverAndDate:
target.write('{},{},{}\n'.format(serverId, serverAndDate[serverId].strftime("%Y-%m-%d %H:%M:%S"), awake[serverId]))

@client.event
async def on_ready():
    print('Logged in as')
    print(client.user.name)
    print(client.user.id)
    print('------')

@client.event
async def on_message_edit(before, message):
    # Edited messages are scanned too, with the same logic as on_message.
    global botStartup
    global serverAndDate
    global lastMention
    global awake
    currentTime = datetime.datetime.now()
    serverId = message.server.id
    lastReferenced = botStartup
    if serverId in serverAndDate:
        lastReferenced = serverAndDate[serverId]
    if serverId not in lastMention:
        bot = None
        for x in message.server.members:
            if client.user.id == x.id:
                bot = x
                break
        lastMention[serverId] = bot.joined_at
    if serverId not in awake:
        awake[serverId] = True
    # Begin time formatting
    diff = currentTime - lastReferenced
    hours = math.floor(diff.seconds / 3600)
    minutes = math.floor((diff.seconds - hours * 3600) / 60)
    seconds = diff.seconds - hours * 3600 - minutes * 60
    dt = "{} days, ".format(diff.days)
    ht = "{} hours, ".format(hours)
    mt = "{} minutes, and ".format(minutes)
    st = "{} seconds".format(seconds)
    if diff.days == 1:
        dt = "1 day, "
    elif diff.days == 0:
        dt = ""
    if hours == 0:
        ht = ""
        mt = "{} minutes and ".format(minutes)
    if minutes == 0:
        mt = ""
    if hours == 1:
        ht = "1 hour, "
    if minutes == 1:
        if ht == "":
            mt = "1 minute and "
        else:
            mt = "1 minute, and "
    if seconds == 1:
        st = "1 second"
    # End time formatting
    if hasattr(message.author, 'server_permissions'):
        permission = message.author.server_permissions
        if message.content.startswith('!vtsilence') and permission.administrator:
            await client.send_message(message.channel, "Ok {}, I'll be quiet now. Use '!vtalert' to wake me back up!".format(message.author.mention))
            awake[serverId] = False
            writeTimesToFile()
        elif message.content.startswith('!vtalert') and permission.administrator:
            await client.send_message(message.channel, "Ok {}, I'm scanning now.".format(message.author.mention))
            awake[serverId] = True
            writeTimesToFile()
    if message.content.startswith('!vtsilence') or message.content.startswith('!vtalert'):
        pass
    elif message.content.startswith('!vthelp'):
        await client.send_message(message.channel, "You can ask me how long we've made it with '!vt'.\n If you're an admin you can silence me with '!vtsilence' and wake me back up with '!vtalert'")
    elif message.content.startswith('!vt'):
        await client.send_message(message.channel, 'The server has gone {}{}{}{} without mentioning the forbidden word.'.format(dt, ht, mt, st))
    if (pattern.search(message.content) is not None) and (message.author.id != client.user.id):
        serverAndDate[serverId] = currentTime
        writeTimesToFile()
        print("{}::: {} lasted {} seconds.".format(currentTime, serverId, (currentTime - lastMention[serverId]).total_seconds()))
        if awake[serverId] and (currentTime - lastMention[serverId]).total_seconds() >= 1800:
            await client.send_message(message.channel, '{} referenced the forbidden word, setting the counter back to 0. I\'ll wait a half hour before warning you again.\n The server went {}{}{}{} without mentioning it.'.format(message.author.mention, dt, ht, mt, st))
        lastMention[serverId] = currentTime

@client.event
async def on_message(message):
    global botStartup
    global serverAndDate
    global lastMention
    global awake
    currentTime = datetime.datetime.now()
    serverId = message.server.id
    lastReferenced = botStartup
    if serverId in serverAndDate:
        lastReferenced = serverAndDate[serverId]
    if serverId not in lastMention:
        bot = None
        for x in message.server.members:
            if client.user.id == x.id:
                bot = x
                break
        lastMention[serverId] = bot.joined_at
    if serverId not in awake:
        awake[serverId] = True
    # Begin time formatting
    diff = currentTime - lastReferenced
    hours = math.floor(diff.seconds / 3600)
    minutes = math.floor((diff.seconds - hours * 3600) / 60)
    seconds = diff.seconds - hours * 3600 - minutes * 60
    dt = "{} days, ".format(diff.days)
    ht = "{} hours, ".format(hours)
    mt = "{} minutes, and ".format(minutes)
    st = "{} seconds".format(seconds)
    if diff.days == 1:
        dt = "1 day, "
    elif diff.days == 0:
        dt = ""
    if hours == 0:
        ht = ""
        mt = "{} minutes and ".format(minutes)
    if minutes == 0:
        mt = ""
    if hours == 1:
        ht = "1 hour, "
    if minutes == 1:
        if ht == "":
            mt = "1 minute and "
        else:
            mt = "1 minute, and "
    if seconds == 1:
        st = "1 second"
    # End time formatting
    if hasattr(message.author, 'server_permissions'):
        permission = message.author.server_permissions
        if message.content.startswith('!vtsilence') and permission.administrator:
            await client.send_message(message.channel, "Ok {}, I'll be quiet now. Use '!vtalert' to wake me back up!".format(message.author.mention))
            awake[serverId] = False
            writeTimesToFile()
        elif message.content.startswith('!vtalert') and permission.administrator:
            await client.send_message(message.channel, "Ok {}, I'm scanning now.".format(message.author.mention))
            awake[serverId] = True
            writeTimesToFile()
    if message.content.startswith('!vtsilence') or message.content.startswith('!vtalert'):
        pass
    elif message.content.startswith('!vthelp'):
        await client.send_message(message.channel, "You can ask me how long we've made it with '!vt'.\n If you're an admin you can silence me with '!vtsilence' and wake me back up with '!vtalert'")
    elif message.content.startswith('!vt'):
        await client.send_message(message.channel, 'The server has gone {}{}{}{} without mentioning the forbidden word.'.format(dt, ht, mt, st))
    if (pattern.search(message.content) is not None) and (message.author.id != client.user.id):
        serverAndDate[serverId] = currentTime
        writeTimesToFile()
        print("{}::: {} lasted {} seconds.".format(currentTime, serverId, (currentTime - lastMention[serverId]).total_seconds()))
        if awake[serverId] and (currentTime - lastMention[serverId]).total_seconds() >= 1800:
            await client.send_message(message.channel, '{} referenced the forbidden word, setting the counter back to 0. I\'ll wait a half hour before warning you again.\n The server went {}{}{}{} without mentioning it.'.format(message.author.mention, dt, ht, mt, st))
        lastMention[serverId] = currentTime

readTimesFromFile()
print('Stored server info:')
count = 0
for id in serverAndDate:
    print("{}: id: {}, time: {}".format(count, id, serverAndDate[id]))
    count = count + 1

# Reconnect forever: read the bot token from key.txt (strip the trailing
# newline) and retry after a short pause on any failure.
while True:
    try:
        with open("key.txt", "r") as target:
            for line in target:
                client.loop.run_until_complete(client.start(line.strip()))
    except BaseException:
        time.sleep(5)
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import TYPE_CHECKING
import warnings

from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.paging import ItemPaged
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import HttpRequest, HttpResponse
from azure.core.polling import LROPoller, NoPolling, PollingMethod
from azure.mgmt.core.exceptions import ARMErrorFormat
from azure.mgmt.core.polling.arm_polling import ARMPolling

from .. import models as _models

if TYPE_CHECKING:
    # pylint: disable=unused-import,ungrouped-imports
    from typing import Any, Callable, Dict, Generic, Iterable, Optional, TypeVar, Union

    T = TypeVar('T')
    ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]]

class EdgeOrderManagementClientOperationsMixin(object):

    def list_operations(
        self,
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.OperationListResult"]
        """This method gets all the operations that are exposed to the customer.

        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either OperationListResult or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OperationListResult]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.OperationListResult"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_operations.metadata['url']  # type: ignore
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('OperationListResult', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_operations.metadata = {'url': '/providers/Microsoft.EdgeOrder/operations'}  # type: ignore
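# The get_next/extract_data pair above follows azure-core's ItemPaged contract:
# get_next(continuation) fetches a raw page, extract_data(page) returns a
# (next_token, item_iterator) pair, and iteration ends when the token is None.
# A dependency-free sketch of that contract (all names here are illustrative):

```python
# Two fake "pages": a continuation token of None keys the first page, and the
# second page carries no further token.
pages = {None: ([1, 2], 'page2'), 'page2': ([3], None)}

def get_next(token):
    # Stand-in for the HTTP round trip: return the raw page for this token.
    return pages[token]

def extract_data(page):
    # Split a raw page into (next_token, iterator-of-items).
    items, next_token = page
    return next_token, iter(items)

def iterate(get_next, extract_data):
    # Minimal re-implementation of how a pager drives the two callbacks.
    token = None
    while True:
        token, items = extract_data(get_next(token))
        for item in items:
            yield item
        if token is None:
            break

assert list(iterate(get_next, extract_data)) == [1, 2, 3]
```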

    def list_addresses_at_subscription_level(
        self,
        filter=None,  # type: Optional[str]
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.AddressResourceList"]
        """Lists all the addresses available under the subscription.

        :param filter: $filter is supported to filter based on shipping address properties. Filter
         supports only equals operation.
        :type filter: str
        :param skip_token: $skipToken is supported on Get list of addresses, which provides the next
         page in the list of addresses.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either AddressResourceList or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResourceList]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.AddressResourceList"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_addresses_at_subscription_level.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('AddressResourceList', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_addresses_at_subscription_level.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/addresses'}  # type: ignore

    def list_product_families(
        self,
        product_families_request,  # type: "_models.ProductFamiliesRequest"
        expand=None,  # type: Optional[str]
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.ProductFamilies"]
        """This method provides the list of product families for the given subscription.

        :param product_families_request: Filters for showing the product families.
        :type product_families_request: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.ProductFamiliesRequest
        :param expand: $expand is supported on configurations parameter for product, which provides
         details on the configurations for the product.
        :type expand: str
        :param skip_token: $skipToken is supported on list of product families, which provides the next
         page in the list of product families.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either ProductFamilies or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.ProductFamilies]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.ProductFamilies"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        content_type = "application/json"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_product_families.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if expand is not None:
                    query_parameters['$expand'] = self._serialize.query("expand", expand, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                body_content_kwargs = {}  # type: Dict[str, Any]
                body_content = self._serialize.body(product_families_request, 'ProductFamiliesRequest')
                body_content_kwargs['content'] = body_content
                request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                body_content_kwargs = {}  # type: Dict[str, Any]
                body_content = self._serialize.body(product_families_request, 'ProductFamiliesRequest')
                body_content_kwargs['content'] = body_content
                request = self._client.get(url, query_parameters, header_parameters, **body_content_kwargs)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('ProductFamilies', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_product_families.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/listProductFamilies'}  # type: ignore

    def list_configurations(
        self,
        configurations_request,  # type: "_models.ConfigurationsRequest"
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.Configurations"]
        """This method provides the list of configurations for the given product family, product line and
        product under subscription.

        :param configurations_request: Filters for showing the configurations.
        :type configurations_request: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.ConfigurationsRequest
        :param skip_token: $skipToken is supported on list of configurations, which provides the next
         page in the list of configurations.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either Configurations or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.Configurations]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.Configurations"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        content_type = "application/json"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_configurations.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                body_content_kwargs = {}  # type: Dict[str, Any]
                body_content = self._serialize.body(configurations_request, 'ConfigurationsRequest')
                body_content_kwargs['content'] = body_content
                request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                body_content_kwargs = {}  # type: Dict[str, Any]
                body_content = self._serialize.body(configurations_request, 'ConfigurationsRequest')
                body_content_kwargs['content'] = body_content
                request = self._client.get(url, query_parameters, header_parameters, **body_content_kwargs)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('Configurations', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_configurations.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/listConfigurations'}  # type: ignore

    def list_product_families_metadata(
        self,
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.ProductFamiliesMetadata"]
        """This method provides the list of product families metadata for the given subscription.

        :param skip_token: $skipToken is supported on list of product families metadata, which provides
         the next page in the list of product families metadata.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either ProductFamiliesMetadata or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.ProductFamiliesMetadata]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.ProductFamiliesMetadata"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_product_families_metadata.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                request = self._client.post(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('ProductFamiliesMetadata', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_product_families_metadata.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/productFamiliesMetadata'}  # type: ignore

    def list_order_at_subscription_level(
        self,
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.OrderResourceList"]
        """Lists order at subscription level.

        :param skip_token: $skipToken is supported on Get list of order, which provides the next page
         in the list of order.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either OrderResourceList or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderResourceList]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.OrderResourceList"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_order_at_subscription_level.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('OrderResourceList', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_order_at_subscription_level.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/orders'}  # type: ignore

    def list_order_items_at_subscription_level(
        self,
        filter=None,  # type: Optional[str]
        expand=None,  # type: Optional[str]
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.OrderItemResourceList"]
        """Lists order item at subscription level.

        :param filter: $filter is supported to filter based on order id. Filter supports only equals
         operation.
        :type filter: str
        :param expand: $expand is supported on device details, forward shipping details and reverse
         shipping details parameters. Each of these can be provided as a comma separated list. Device
         Details for order item provides details on the devices of the product, Forward and Reverse
         Shipping details provide forward and reverse shipping details respectively.
        :type expand: str
        :param skip_token: $skipToken is supported on Get list of order items, which provides the next
         page in the list of order items.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either OrderItemResourceList or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResourceList]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.OrderItemResourceList"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_order_items_at_subscription_level.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
                if expand is not None:
                    query_parameters['$expand'] = self._serialize.query("expand", expand, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('OrderItemResourceList', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_order_items_at_subscription_level.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.EdgeOrder/orderItems'}  # type: ignore

    def list_addresses_at_resource_group_level(
        self,
        resource_group_name,  # type: str
        filter=None,  # type: Optional[str]
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.AddressResourceList"]
        """Lists all the addresses available under the given resource group.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param filter: $filter is supported to filter based on shipping address properties. Filter
         supports only the equals operation.
        :type filter: str
        :param skip_token: $skipToken is supported on the Get list of addresses call, which provides
         the next page in the list of addresses.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator-like instance of either AddressResourceList or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResourceList]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.AddressResourceList"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_addresses_at_resource_group_level.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')
                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('AddressResourceList', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)
            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_addresses_at_resource_group_level.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses'}  # type: ignore
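The optional OData parameters (`$filter`, `$expand`, `$skipToken`) are only sent when the caller actually supplies them, while `api-version` is always required. A hedged sketch of that conditional assembly, with the serializer step omitted so it runs standalone (function name is illustrative):

```python
def build_query_parameters(api_version, filter=None, expand=None, skip_token=None):
    # Only the parameters actually supplied are added to the query
    # string; api-version is always required.
    query_parameters = {"api-version": api_version}
    if filter is not None:
        query_parameters["$filter"] = filter
    if expand is not None:
        query_parameters["$expand"] = expand
    if skip_token is not None:
        query_parameters["$skipToken"] = skip_token
    return query_parameters
```

This mirrors why the generated code wraps each optional parameter in an `if ... is not None` guard: sending an empty `$filter` or `$skipToken` would change the request semantics.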

    def get_address_by_name(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        **kwargs  # type: Any
    ):
        # type: (...) -> "_models.AddressResource"
        """Gets information about the specified address.

        :param address_name: The name of the address resource within the specified resource group.
         The address name must be between 3 and 24 characters in length and may use only alphanumeric
         characters and underscores.
        :type address_name: str
        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: AddressResource, or the result of cls(response)
        :rtype: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResource
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.AddressResource"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        # Construct URL
        url = self.get_address_by_name.metadata['url']  # type: ignore
        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
            raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

        deserialized = self._deserialize('AddressResource', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_address_by_name.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore
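The URL is produced by substituting the validated `path_format_arguments` into the template stored in the method's `metadata['url']`. The real `format_url` also URL-quotes each value; plain `str.format` is enough to illustrate the substitution step (the template below is copied from the metadata above; the helper name is illustrative):

```python
TEMPLATE = (
    "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}"
    "/providers/Microsoft.EdgeOrder/addresses/{addressName}"
)

def format_url(template, **path_format_arguments):
    # Fill each {placeholder} from the keyword arguments, as the
    # generated code does after serializing/validating each value.
    return template.format(**path_format_arguments)

url = format_url(
    TEMPLATE,
    subscriptionId="0000",
    resourceGroupName="rg1",
    addressName="addr1",
)
```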

    def _create_address_initial(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        address_resource,  # type: "_models.AddressResource"
        **kwargs  # type: Any
    ):
        # type: (...) -> Optional["_models.AddressResource"]
        cls = kwargs.pop('cls', None)  # type: ClsType[Optional["_models.AddressResource"]]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        content_type = kwargs.pop("content_type", "application/json")
        accept = "application/json"

        # Construct URL
        url = self._create_address_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        body_content_kwargs = {}  # type: Dict[str, Any]
        body_content = self._serialize.body(address_resource, 'AddressResource')
        body_content_kwargs['content'] = body_content
        request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
        pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 202]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
            raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('AddressResource', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    _create_address_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore
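The serializer call above enforces the `addressName` constraints client-side before any request is sent: `min_length=3`, `max_length=24`, and `pattern=r'^[-\w\.]+$'`. A small sketch of an equivalent check (note the regex admits hyphens and periods in addition to the alphanumerics and underscores the docstrings mention; the function name is illustrative):

```python
import re

# Constraints lifted from the serializer call above:
# min_length=3, max_length=24, pattern=r'^[-\w\.]+$'.
ADDRESS_NAME_PATTERN = re.compile(r'^[-\w\.]+$')

def is_valid_address_name(name):
    # Length bounds plus the character-class pattern; a failed check
    # in the SDK raises a validation error before the HTTP call.
    return 3 <= len(name) <= 24 and ADDRESS_NAME_PATTERN.match(name) is not None
```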

    def begin_create_address(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        address_resource,  # type: "_models.AddressResource"
        **kwargs  # type: Any
    ):
        # type: (...) -> LROPoller["_models.AddressResource"]
        """Creates a new address with the specified parameters. An existing address cannot be updated
        with this API and should instead be updated with the Update address API.

        :param address_name: The name of the address resource within the specified resource group.
         The address name must be between 3 and 24 characters in length and may use only alphanumeric
         characters and underscores.
        :type address_name: str
        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param address_resource: Address details from the request body.
        :type address_resource: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResource
        :keyword callable cls: A custom type or function that will be passed the direct response
        :keyword str continuation_token: A continuation token to restart a poller from a saved state.
        :keyword polling: By default, your polling method will be ARMPolling. Pass in False for this
         operation to not poll, or pass in your own initialized polling object for a personal polling
         strategy.
        :paramtype polling: bool or ~azure.core.polling.PollingMethod
        :keyword int polling_interval: Default waiting time between two polls for LRO operations if no
         Retry-After header is present.
        :return: An instance of LROPoller that returns either AddressResource or the result of
         cls(response)
        :rtype: ~azure.core.polling.LROPoller[~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResource]
        :raises ~azure.core.exceptions.HttpResponseError:
        """
        polling = kwargs.pop('polling', True)  # type: Union[bool, PollingMethod]
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.AddressResource"]
        lro_delay = kwargs.pop(
            'polling_interval',
            self._config.polling_interval
        )
        cont_token = kwargs.pop('continuation_token', None)  # type: Optional[str]
        if cont_token is None:
            raw_result = self._create_address_initial(
                address_name=address_name,
                resource_group_name=resource_group_name,
                address_resource=address_resource,
                cls=lambda x, y, z: x,
                **kwargs
            )

        kwargs.pop('error_map', None)
        kwargs.pop('content_type', None)

        def get_long_running_output(pipeline_response):
            deserialized = self._deserialize('AddressResource', pipeline_response)
            if cls:
                return cls(pipeline_response, deserialized, {})
            return deserialized

        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }

        if polling is True:
            polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = NoPolling()
        else:
            polling_method = polling
        if cont_token:
            return LROPoller.from_continuation_token(
                polling_method=polling_method,
                continuation_token=cont_token,
                client=self._client,
                deserialization_callback=get_long_running_output
            )
        else:
            return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    begin_create_address.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore
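Every `begin_*` method above selects its polling strategy with the same three-way dispatch on the `polling` keyword: `True` means the default ARM long-running-operation polling, `False` means return immediately without polling, and anything else is treated as a caller-supplied `PollingMethod`. A sketch with stub classes standing in for `azure.core`'s `ARMPolling`/`NoPolling` (the stub names are illustrative):

```python
class ARMPollingStub:
    # Stand-in for azure.core's ARMPolling; only records the delay here.
    def __init__(self, delay):
        self.delay = delay

class NoPollingStub:
    # Stand-in for azure.core's NoPolling: the poller returns at once.
    pass

def choose_polling_method(polling, lro_delay):
    # Same three-way dispatch as the generated begin_* methods:
    # True -> default ARM polling, False -> no polling,
    # anything else -> a caller-supplied polling object.
    if polling is True:
        return ARMPollingStub(lro_delay)
    if polling is False:
        return NoPollingStub()
    return polling
```

The identity checks (`is True` / `is False`) matter: a custom polling object must not be mistaken for a truthy flag.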

    def _delete_address_by_name_initial(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        **kwargs  # type: Any
    ):
        # type: (...) -> None
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        # Construct URL
        url = self._delete_address_by_name_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        request = self._client.delete(url, query_parameters, header_parameters)
        pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 202, 204]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
            raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

        if cls:
            return cls(pipeline_response, None, {})
    _delete_address_by_name_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore

    def begin_delete_address_by_name(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        **kwargs  # type: Any
    ):
        # type: (...) -> LROPoller[None]
        """Deletes an address.

        :param address_name: The name of the address resource within the specified resource group.
         The address name must be between 3 and 24 characters in length and may use only alphanumeric
         characters and underscores.
        :type address_name: str
        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :keyword str continuation_token: A continuation token to restart a poller from a saved state.
        :keyword polling: By default, your polling method will be ARMPolling. Pass in False for this
         operation to not poll, or pass in your own initialized polling object for a personal polling
         strategy.
        :paramtype polling: bool or ~azure.core.polling.PollingMethod
        :keyword int polling_interval: Default waiting time between two polls for LRO operations if no
         Retry-After header is present.
        :return: An instance of LROPoller that returns either None or the result of cls(response)
        :rtype: ~azure.core.polling.LROPoller[None]
        :raises ~azure.core.exceptions.HttpResponseError:
        """
        polling = kwargs.pop('polling', True)  # type: Union[bool, PollingMethod]
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        lro_delay = kwargs.pop(
            'polling_interval',
            self._config.polling_interval
        )
        cont_token = kwargs.pop('continuation_token', None)  # type: Optional[str]
        if cont_token is None:
            raw_result = self._delete_address_by_name_initial(
                address_name=address_name,
                resource_group_name=resource_group_name,
                cls=lambda x, y, z: x,
                **kwargs
            )

        kwargs.pop('error_map', None)
        kwargs.pop('content_type', None)

        def get_long_running_output(pipeline_response):
            if cls:
                return cls(pipeline_response, None, {})

        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }

        if polling is True:
            polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = NoPolling()
        else:
            polling_method = polling
        if cont_token:
            return LROPoller.from_continuation_token(
                polling_method=polling_method,
                continuation_token=cont_token,
                client=self._client,
                deserialization_callback=get_long_running_output
            )
        else:
            return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    begin_delete_address_by_name.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore

    def _update_address_initial(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        address_update_parameter,  # type: "_models.AddressUpdateParameter"
        if_match=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Optional["_models.AddressResource"]
        cls = kwargs.pop('cls', None)  # type: ClsType[Optional["_models.AddressResource"]]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        content_type = kwargs.pop("content_type", "application/json")
        accept = "application/json"

        # Construct URL
        url = self._update_address_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        if if_match is not None:
            header_parameters['If-Match'] = self._serialize.header("if_match", if_match, 'str')
        header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        body_content_kwargs = {}  # type: Dict[str, Any]
        body_content = self._serialize.body(address_update_parameter, 'AddressUpdateParameter')
        body_content_kwargs['content'] = body_content
        request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
        pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 202]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
            raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('AddressResource', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    _update_address_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore
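The PATCH above becomes a conditional update when the caller supplies an ETag: the `If-Match` header is added only if `if_match` is not `None`, so the service applies the change only when the stored resource still has that ETag. A hedged sketch of the header assembly (function name is illustrative; serialization omitted):

```python
def build_headers(content_type, accept, if_match=None):
    # If-Match is sent only when the caller passes an ETag, turning
    # the PATCH into a conditional update that fails on a stale ETag.
    headers = {}
    if if_match is not None:
        headers["If-Match"] = if_match
    headers["Content-Type"] = content_type
    headers["Accept"] = accept
    return headers
```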

    def begin_update_address(
        self,
        address_name,  # type: str
        resource_group_name,  # type: str
        address_update_parameter,  # type: "_models.AddressUpdateParameter"
        if_match=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> LROPoller["_models.AddressResource"]
        """Updates the properties of an existing address.

        :param address_name: The name of the address resource within the specified resource group.
         The address name must be between 3 and 24 characters in length and may use only alphanumeric
         characters and underscores.
        :type address_name: str
        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param address_update_parameter: Address update parameters from the request body.
        :type address_update_parameter: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressUpdateParameter
        :param if_match: Defines the If-Match condition. The patch will be performed only if the ETag
         of the job on the server matches this value.
        :type if_match: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :keyword str continuation_token: A continuation token to restart a poller from a saved state.
        :keyword polling: By default, your polling method will be ARMPolling. Pass in False for this
         operation to not poll, or pass in your own initialized polling object for a personal polling
         strategy.
        :paramtype polling: bool or ~azure.core.polling.PollingMethod
        :keyword int polling_interval: Default waiting time between two polls for LRO operations if no
         Retry-After header is present.
        :return: An instance of LROPoller that returns either AddressResource or the result of
         cls(response)
        :rtype: ~azure.core.polling.LROPoller[~azure.mgmt.edgeorder.v2020_12_01_preview.models.AddressResource]
        :raises ~azure.core.exceptions.HttpResponseError:
        """
        polling = kwargs.pop('polling', True)  # type: Union[bool, PollingMethod]
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.AddressResource"]
        lro_delay = kwargs.pop(
            'polling_interval',
            self._config.polling_interval
        )
        cont_token = kwargs.pop('continuation_token', None)  # type: Optional[str]
        if cont_token is None:
            raw_result = self._update_address_initial(
                address_name=address_name,
                resource_group_name=resource_group_name,
                address_update_parameter=address_update_parameter,
                if_match=if_match,
                cls=lambda x, y, z: x,
                **kwargs
            )

        kwargs.pop('error_map', None)
        kwargs.pop('content_type', None)

        def get_long_running_output(pipeline_response):
            deserialized = self._deserialize('AddressResource', pipeline_response)
            if cls:
                return cls(pipeline_response, deserialized, {})
            return deserialized

        path_format_arguments = {
            'addressName': self._serialize.url("address_name", address_name, 'str', max_length=24, min_length=3, pattern=r'^[-\w\.]+$'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
        }

        if polling is True:
            polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = NoPolling()
        else:
            polling_method = polling
        if cont_token:
            return LROPoller.from_continuation_token(
                polling_method=polling_method,
                continuation_token=cont_token,
                client=self._client,
                deserialization_callback=get_long_running_output
            )
        else:
            return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    begin_update_address.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/addresses/{addressName}'}  # type: ignore

    def list_order_at_resource_group_level(
        self,
        resource_group_name,  # type: str
        skip_token=None,  # type: Optional[str]
        **kwargs  # type: Any
    ):
        # type: (...) -> Iterable["_models.OrderResourceList"]
        """Lists orders at the resource group level.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param skip_token: $skipToken is supported on the Get list of orders call, which provides the
         next page in the list of orders.
        :type skip_token: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator-like instance of either OrderResourceList or the result of cls(response)
        :rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderResourceList]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.OrderResourceList"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_order_at_resource_group_level.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')
                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        def extract_data(pipeline_response):
            deserialized = self._deserialize('OrderResourceList', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, iter(list_of_elem)

        def get_next(next_link=None):
            request = prepare_request(next_link)
            pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

            return pipeline_response

        return ItemPaged(
            get_next, extract_data
        )
    list_order_at_resource_group_level.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orders'}  # type: ignore

    def get_order_by_name(
        self,
        order_name,  # type: str
        resource_group_name,  # type: str
        location,  # type: str
        **kwargs  # type: Any
    ):
        # type: (...) -> "_models.OrderResource"
        """Gets an order.

        :param order_name: The name of the order.
        :type order_name: str
        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param location: The name of the Azure region.
        :type location: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: OrderResource, or the result of cls(response)
        :rtype: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderResource
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.OrderResource"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-12-01-preview"
        accept = "application/json"

        # Construct URL
        url = self.get_order_by_name.metadata['url']  # type: ignore
        path_format_arguments = {
            'orderName': self._serialize.url("order_name", order_name, 'str'),
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'location': self._serialize.url("location", location, 'str', min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
            raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)

        deserialized = self._deserialize('OrderResource', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_order_by_name.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/locations/{location}/orders/{orderName}'}  # type: ignore
def list_order_items_at_resource_group_level(
self,
resource_group_name, # type: str
filter=None, # type: Optional[str]
expand=None, # type: Optional[str]
skip_token=None, # type: Optional[str]
**kwargs # type: Any
):
# type: (...) -> Iterable["_models.OrderItemResourceList"]
"""Lists order item at resource group level.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param filter: $filter is supported to filter based on order id. Filter supports only equals
operation.
:type filter: str
:param expand: $expand is supported on device details, forward shipping details and reverse
shipping details parameters. Each of these can be provided as a comma separated list. Device
Details for order item provides details on the devices of the product, Forward and Reverse
Shipping details provide forward and reverse shipping details respectively.
:type expand: str
:param skip_token: $skipToken is supported on Get list of order items, which provides the next
page in the list of order items.
:type skip_token: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either OrderItemResourceList or the result of cls(response)
:rtype: ~azure.core.paging.ItemPaged[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResourceList]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.OrderItemResourceList"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.list_order_items_at_resource_group_level.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
if filter is not None:
query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
if expand is not None:
query_parameters['$expand'] = self._serialize.query("expand", expand, 'str')
if skip_token is not None:
query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
request = self._client.get(url, query_parameters, header_parameters)
return request
def extract_data(pipeline_response):
deserialized = self._deserialize('OrderItemResourceList', pipeline_response)
list_of_elem = deserialized.value
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.next_link or None, iter(list_of_elem)
def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
return pipeline_response
return ItemPaged(
get_next, extract_data
)
list_order_items_at_resource_group_level.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems'} # type: ignore
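The `prepare_request`/`extract_data`/`get_next` trio above is the standard azure-core paging contract: issue a request, pull `value` and `next_link` out of the page, and keep following `next_link` until it is empty. A minimal, library-free sketch of that same continuation-link loop (the fake `PAGES` service and all names here are illustrative, not part of azure-mgmt-edgeorder):

```python
# Minimal sketch of the continuation-link paging pattern behind ItemPaged.
# fetch_page stands in for prepare_request + pipeline.run + deserialize.
from typing import Iterator, List, Optional, Tuple

PAGES = {
    None: (["item1", "item2"], "page2"),  # first request: no next_link yet
    "page2": (["item3"], None),           # last page: next_link is None
}

def fetch_page(next_link: Optional[str]) -> Tuple[List[str], Optional[str]]:
    """Return (items, next_link) for one page, like extract_data does."""
    return PAGES[next_link]

def iterate_items() -> Iterator[str]:
    """Follow next_link until exhausted, like ItemPaged(get_next, extract_data)."""
    next_link = None
    while True:
        items, next_link = fetch_page(next_link)
        yield from items
        if next_link is None:
            break

print(list(iterate_items()))  # -> ['item1', 'item2', 'item3']
```

The real pager differs only in that `fetch_page` performs an HTTP GET and `next_link` is a fully-qualified URL returned by the service.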
def get_order_item_by_name(
self,
order_item_name, # type: str
resource_group_name, # type: str
expand=None, # type: Optional[str]
**kwargs # type: Any
):
# type: (...) -> "_models.OrderItemResource"
"""Gets an order item.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param expand: $expand is supported on device details, forward shipping details and reverse
shipping details parameters. Each of these can be provided as a comma-separated list. Device
details for an order item provide details on the devices of the product; forward and reverse
shipping details provide the corresponding forward and reverse shipping information.
:type expand: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: OrderItemResource, or the result of cls(response)
:rtype: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResource
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.OrderItemResource"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
accept = "application/json"
# Construct URL
url = self.get_order_item_by_name.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
if expand is not None:
query_parameters['$expand'] = self._serialize.query("expand", expand, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
deserialized = self._deserialize('OrderItemResource', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_order_item_by_name.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
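Each operation above builds an `error_map` from status codes to exception types, lets callers merge in overrides via `kwargs.pop('error_map', {})`, and falls back to a generic `HttpResponseError` for unmapped failures. A self-contained sketch of that dispatch (the exception classes mirror the names imported from `azure.core.exceptions` but are defined locally here as stand-ins, not the real types):

```python
# Sketch of the status-code -> exception dispatch done by map_error.
class HttpResponseError(Exception):
    pass

class ClientAuthenticationError(HttpResponseError):
    pass

class ResourceNotFoundError(HttpResponseError):
    pass

class ResourceExistsError(HttpResponseError):
    pass

DEFAULT_ERROR_MAP = {
    401: ClientAuthenticationError,
    404: ResourceNotFoundError,
    409: ResourceExistsError,
}

def map_error(status_code, error_map):
    """Raise the mapped exception if one exists for this status code."""
    exc_type = error_map.get(status_code)
    if exc_type is not None:
        raise exc_type(f"HTTP {status_code}")
    # No specific mapping: the caller falls through and raises a generic
    # HttpResponseError after deserializing the error body.

# Per-call override, as kwargs.pop('error_map', {}) allows:
custom_map = dict(DEFAULT_ERROR_MAP)
custom_map[409] = HttpResponseError  # treat 409 as a plain HTTP failure
```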
def _create_order_item_initial(
self,
order_item_name, # type: str
resource_group_name, # type: str
order_item_resource, # type: "_models.OrderItemResource"
**kwargs # type: Any
):
# type: (...) -> Optional["_models.OrderItemResource"]
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.OrderItemResource"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._create_order_item_initial.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(order_item_resource, 'OrderItemResource')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('OrderItemResource', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_create_order_item_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
def begin_create_order_item(
self,
order_item_name, # type: str
resource_group_name, # type: str
order_item_resource, # type: "_models.OrderItemResource"
**kwargs # type: Any
):
# type: (...) -> LROPoller["_models.OrderItemResource"]
"""Creates an order item. Existing order item cannot be updated with this api and should instead
be updated with the Update order item API.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param order_item_resource: Order item details from request body.
:type order_item_resource: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResource
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be ARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either OrderItemResource or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResource]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.OrderItemResource"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._create_order_item_initial(
order_item_name=order_item_name,
resource_group_name=resource_group_name,
order_item_resource=order_item_resource,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('OrderItemResource', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
if polling is True: polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_create_order_item.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
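`begin_create_order_item` returns an `LROPoller`: the initial PUT may answer 200 or 202, and `ARMPolling` then polls until the operation reaches a terminal ARM status, sleeping `polling_interval` (or the service's `Retry-After`) between polls. A toy poller illustrating that loop under stated assumptions; all names here are illustrative, not azure-core APIs:

```python
import time

# Toy long-running-operation poller sketching what LROPoller/ARMPolling do:
# poll a status source until a terminal ARM state, sleeping between polls.
TERMINAL = {"Succeeded", "Failed", "Canceled"}

class ToyPoller:
    def __init__(self, poll_fn, delay=0.0):
        self._poll_fn = poll_fn  # stands in for a GET on the operation URL
        self._delay = delay      # polling_interval / Retry-After analogue

    def result(self):
        """Block until the operation finishes; raise on a failed outcome."""
        while True:
            status = self._poll_fn()
            if status in TERMINAL:
                if status != "Succeeded":
                    raise RuntimeError(f"operation ended as {status}")
                return status
            time.sleep(self._delay)

statuses = iter(["InProgress", "InProgress", "Succeeded"])
poller = ToyPoller(lambda: next(statuses))
print(poller.result())  # -> Succeeded
```

The real poller additionally deserializes the final body through `get_long_running_output`, which is why the initial call passes `cls=lambda x, y, z: x` to keep the raw pipeline response.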
def _delete_order_item_by_name_initial(
self,
order_item_name, # type: str
resource_group_name, # type: str
**kwargs # type: Any
):
# type: (...) -> None
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
accept = "application/json"
# Construct URL
url = self._delete_order_item_by_name_initial.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202, 204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
if cls:
return cls(pipeline_response, None, {})
_delete_order_item_by_name_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
def begin_delete_order_item_by_name(
self,
order_item_name, # type: str
resource_group_name, # type: str
**kwargs # type: Any
):
# type: (...) -> LROPoller[None]
"""Deletes an order item.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be ARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either None or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[None]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType[None]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._delete_order_item_by_name_initial(
order_item_name=order_item_name,
resource_group_name=resource_group_name,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
if cls:
return cls(pipeline_response, None, {})
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
if polling is True: polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_delete_order_item_by_name.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
def _update_order_item_initial(
self,
order_item_name, # type: str
resource_group_name, # type: str
order_item_update_parameter, # type: "_models.OrderItemUpdateParameter"
if_match=None, # type: Optional[str]
**kwargs # type: Any
):
# type: (...) -> Optional["_models.OrderItemResource"]
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.OrderItemResource"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._update_order_item_initial.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
if if_match is not None:
header_parameters['If-Match'] = self._serialize.header("if_match", if_match, 'str')
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(order_item_update_parameter, 'OrderItemUpdateParameter')
body_content_kwargs['content'] = body_content
request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('OrderItemResource', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_update_order_item_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
def begin_update_order_item(
self,
order_item_name, # type: str
resource_group_name, # type: str
order_item_update_parameter, # type: "_models.OrderItemUpdateParameter"
if_match=None, # type: Optional[str]
**kwargs # type: Any
):
# type: (...) -> LROPoller["_models.OrderItemResource"]
"""Updates the properties of an existing order item.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param order_item_update_parameter: Order item update parameters from request body.
:type order_item_update_parameter: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemUpdateParameter
:param if_match: Defines the If-Match condition. The patch will be performed only if the ETag
of the order on the server matches this value.
:type if_match: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be ARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either OrderItemResource or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[~azure.mgmt.edgeorder.v2020_12_01_preview.models.OrderItemResource]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.OrderItemResource"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._update_order_item_initial(
order_item_name=order_item_name,
resource_group_name=resource_group_name,
order_item_update_parameter=order_item_update_parameter,
if_match=if_match,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('OrderItemResource', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
if polling is True: polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_update_order_item.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}'} # type: ignore
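The `if_match` parameter above drives optimistic concurrency: `_update_order_item_initial` only sends an `If-Match` header when a value is given, and the service applies the PATCH only if that ETag still matches the resource. A small sketch of the server-side check this implies (the function and return shape are illustrative assumptions, not the ARM implementation):

```python
# Sketch of the optimistic-concurrency check the If-Match header drives.
def apply_patch(server_etag, if_match, patch):
    """Return (status_code, applied) the way an ETag-guarded endpoint would."""
    if if_match is not None and if_match != server_etag:
        return 412, False  # Precondition Failed: resource changed underneath us
    return 200, True

assert apply_patch("v2", "v2", {}) == (200, True)
assert apply_patch("v3", "v2", {}) == (412, False)
assert apply_patch("v3", None, {}) == (200, True)  # no If-Match: last write wins
```

On a 412 the client would typically re-read the resource, pick up the new ETag, and retry the update.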
def cancel_order_item(
self,
order_item_name, # type: str
resource_group_name, # type: str
cancellation_reason, # type: "_models.CancellationReason"
**kwargs # type: Any
):
# type: (...) -> None
"""Cancel order item.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param cancellation_reason: Reason for cancellation.
:type cancellation_reason: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.CancellationReason
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self.cancel_order_item.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(cancellation_reason, 'CancellationReason')
body_content_kwargs['content'] = body_content
request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
if cls:
return cls(pipeline_response, None, {})
cancel_order_item.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}/cancel'} # type: ignore
def _return_order_item_initial(
self,
order_item_name, # type: str
resource_group_name, # type: str
return_order_item_details, # type: "_models.ReturnOrderItemDetails"
**kwargs # type: Any
):
# type: (...) -> None
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-12-01-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._return_order_item_initial.metadata['url'] # type: ignore
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(return_order_item_details, 'ReturnOrderItemDetails')
body_content_kwargs['content'] = body_content
request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error, error_format=ARMErrorFormat)
if cls:
return cls(pipeline_response, None, {})
_return_order_item_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}/return'} # type: ignore
def begin_return_order_item(
self,
order_item_name, # type: str
resource_group_name, # type: str
return_order_item_details, # type: "_models.ReturnOrderItemDetails"
**kwargs # type: Any
):
# type: (...) -> LROPoller[None]
"""Return order item.
:param order_item_name: The name of the order item.
:type order_item_name: str
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param return_order_item_details: Details of the order item to be returned.
:type return_order_item_details: ~azure.mgmt.edgeorder.v2020_12_01_preview.models.ReturnOrderItemDetails
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be ARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either None or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[None]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType[None]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._return_order_item_initial(
order_item_name=order_item_name,
resource_group_name=resource_group_name,
return_order_item_details=return_order_item_details,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
if cls:
return cls(pipeline_response, None, {})
path_format_arguments = {
'orderItemName': self._serialize.url("order_item_name", order_item_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
}
if polling is True: polling_method = ARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_return_order_item.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EdgeOrder/orderItems/{orderItemName}/return'} # type: ignore
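Every `begin_*` method above also accepts a `continuation_token` keyword: when supplied, the initial request is skipped entirely and the poller is rebuilt from saved state via `LROPoller.from_continuation_token`. A toy sketch of that resume flow; here the token is just a dict key, whereas real tokens are opaque serialized poller state, and all names are illustrative:

```python
# Sketch of resuming a long-running operation from a saved token.
SAVED_STATE = {}

def begin_toy_operation(name):
    """Start an operation and return a resumable token."""
    token = f"op/{name}"
    SAVED_STATE[token] = ["InProgress", "Succeeded"]
    return token

def resume_toy_operation(token):
    """Drain the remaining statuses for the token until a terminal one."""
    statuses = SAVED_STATE[token]
    while statuses:
        if statuses.pop(0) == "Succeeded":
            return "Succeeded"
    raise RuntimeError("operation did not reach a terminal status")

token = begin_toy_operation("orderitem1")
# The process could exit and restart here; only the token must survive.
print(resume_toy_operation(token))  # -> Succeeded
```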
# icsdll/__init__.py (from cgohlke/icsdll, BSD-3-Clause)
from .icsdll import __doc__, __all__, __version__
from .icsdll import *
# tests/unit/action/test_bring_resources_on_build.py (from buxx/rolling, MIT)
# coding: utf-8
import pytest
from unittest import mock
from rolling.action.base import ActionDescriptionModel
from rolling.action.build import BringResourceModel
from rolling.action.build import BringResourcesOnBuild
from rolling.action.build_deposit import DepositToBuildAction
from rolling.action.build_deposit import DepositToModel
from rolling.action.build_take import TakeFromBuildAction
from rolling.action.build_take import TakeFromModel
from rolling.exception import ImpossibleAction
from rolling.kernel import Kernel
from rolling.model.character import CharacterModel
from rolling.model.stuff import StuffModel
from rolling.rolling_types import ActionType
from rolling.server.document.build import BuildDocument
from rolling.server.document.resource import ResourceDocument
@pytest.fixture
def worldmapc_mock_build_document(
worldmapc_kernel: Kernel,
) -> BuildDocument:
kernel = worldmapc_kernel
return kernel.build_lib.place_build(
world_col_i=0,
world_row_i=0,
zone_col_i=0,
zone_row_i=0,
build_id="TEST_BUILD_1",
under_construction=True,
)
@pytest.fixture
def build4(
worldmapc_kernel: Kernel,
) -> BuildDocument:
kernel = worldmapc_kernel
return kernel.build_lib.place_build(
world_col_i=0,
world_row_i=0,
zone_col_i=0,
zone_row_i=0,
build_id="TEST_BUILD_4",
under_construction=True,
)
@pytest.fixture
def build5(
worldmapc_kernel: Kernel,
) -> BuildDocument:
kernel = worldmapc_kernel
return kernel.build_lib.place_build(
world_col_i=0,
world_row_i=0,
zone_col_i=0,
zone_row_i=0,
build_id="TEST_BUILD_5",
under_construction=True,
)
@pytest.fixture
def build6(
worldmapc_kernel: Kernel,
) -> BuildDocument:
kernel = worldmapc_kernel
return kernel.build_lib.place_build(
world_col_i=0,
world_row_i=0,
zone_col_i=0,
zone_row_i=0,
build_id="TEST_BUILD_6",
under_construction=True,
)
@pytest.fixture
def action(worldmapc_kernel: Kernel) -> BringResourcesOnBuild:
action = BringResourcesOnBuild(
worldmapc_kernel,
description=ActionDescriptionModel(
id="ACTION_ID",
action_type=ActionType.BRING_RESOURCE_ON_BUILD,
base_cost=0.5,
properties={},
),
)
yield action
@pytest.fixture
def deposit_action(worldmapc_kernel: Kernel) -> DepositToBuildAction:
return DepositToBuildAction(
worldmapc_kernel,
description=ActionDescriptionModel(
id="DEPOSIT_ON_BUILD",
action_type=ActionType.DEPOSIT_ON_BUILD,
base_cost=1.0,
properties={},
),
)
@pytest.fixture
def take_action(worldmapc_kernel: Kernel) -> TakeFromBuildAction:
return TakeFromBuildAction(
worldmapc_kernel,
description=ActionDescriptionModel(
id="TAKE_FROM_BUILD",
action_type=ActionType.TAKE_FROM_BUILD,
base_cost=1.0,
properties={},
),
)
class TestBringResourcesOnBuild:
def test_unit__get_character_actions__nothing_on_place_and_no_progress(
self,
action: BringResourcesOnBuild,
worldmapc_xena_model: CharacterModel,
worldmapc_mock_build_document: BuildDocument,
) -> None:
build = worldmapc_mock_build_document
xena = worldmapc_xena_model
character_actions = action.get_character_actions(xena, build.id)
assert character_actions
assert 1 == len(character_actions)
character_action = character_actions.pop()
assert (
"/character/xena/with-build-action/"
f"BRING_RESOURCE_ON_BUILD/{build.id}/ACTION_ID"
"?resource_id=BRANCHES" == character_action.link
)
assert (
"Apporter Petit bois pour la construction "
"(manque 0.001 mètre cubes soit 100%)" == character_action.name
)
def test_unit__get_character_actions__something_on_place_and_no_progress(
self,
action: BringResourcesOnBuild,
worldmapc_kernel: Kernel,
worldmapc_xena_model: CharacterModel,
worldmapc_mock_build_document: BuildDocument,
) -> None:
kernel = worldmapc_kernel
build = worldmapc_mock_build_document
xena = worldmapc_xena_model
# Add some resources in building
kernel.server_db_session.add(
ResourceDocument(
resource_id="BRANCHES", # see src/game1/game.toml
                quantity=0.00075, # 75% of the required 0.001, see src/game1/game.toml
in_built_id=build.id,
)
)
kernel.server_db_session.commit()
character_actions = action.get_character_actions(xena, build.id)
assert character_actions
assert 1 == len(character_actions)
character_action = character_actions.pop()
assert (
"/character/xena/with-build-action/"
f"BRING_RESOURCE_ON_BUILD/{build.id}/ACTION_ID"
"?resource_id=BRANCHES" == character_action.link
)
assert (
"Apporter Petit bois pour la construction "
"(manque 0.00025 mètre cubes soit 25%)" == character_action.name
)
def test_unit__get_character_actions__something_on_place_and_have_progress(
self,
action: BringResourcesOnBuild,
worldmapc_kernel: Kernel,
worldmapc_xena_model: CharacterModel,
worldmapc_mock_build_document: BuildDocument,
) -> None:
kernel = worldmapc_kernel
build = worldmapc_mock_build_document
xena = worldmapc_xena_model
# Add some resources in building
kernel.server_db_session.add(
ResourceDocument(
resource_id="BRANCHES", # see src/game1/game.toml
quantity=0.00025, # 50%, see src/game1/game.toml
in_built_id=build.id,
)
)
build.ap_spent = 1.0 # 50%, see src/game1/game.toml
kernel.server_db_session.add(build)
kernel.server_db_session.commit()
character_actions = action.get_character_actions(xena, build.id)
assert character_actions
assert 1 == len(character_actions)
character_action = character_actions.pop()
assert (
"/character/xena/with-build-action/"
f"BRING_RESOURCE_ON_BUILD/{build.id}/ACTION_ID"
"?resource_id=BRANCHES" == character_action.link
)
assert (
"Apporter Petit bois pour la construction "
"(manque 0.00025 mètre cubes soit 25%)" == character_action.name
)
def test_unit__get_character_actions__all_on_place_and_have_progress(
self,
action: BringResourcesOnBuild,
worldmapc_kernel: Kernel,
worldmapc_xena_model: CharacterModel,
worldmapc_mock_build_document: BuildDocument,
) -> None:
kernel = worldmapc_kernel
build = worldmapc_mock_build_document
xena = worldmapc_xena_model
# Add some resources in building
kernel.server_db_session.add(
ResourceDocument(
resource_id="BRANCHES", # see src/game1/game.toml
quantity=0.0005, # 50%, see src/game1/game.toml
in_built_id=build.id,
)
)
build.ap_spent = 1.0 # 50%, see src/game1/game.toml
kernel.server_db_session.add(build)
kernel.server_db_session.commit()
character_actions = action.get_character_actions(xena, build.id)
assert not character_actions
async def test_unit__perform__nothing_on_place_and_no_progress(
self,
action: BringResourcesOnBuild,
worldmapc_kernel: Kernel,
worldmapc_xena_model: CharacterModel,
worldmapc_mock_build_document: BuildDocument,
) -> None:
kernel = worldmapc_kernel
build = worldmapc_mock_build_document
xena = worldmapc_xena_model
kernel.resource_lib.add_resource_to(
character_id=xena.id, resource_id="BRANCHES", quantity=0.00025
)
assert not kernel.resource_lib.get_stored_in_build(build.id)
await action.perform(
xena,
build_id=build.id,
input_=BringResourceModel(
resource_id="BRANCHES", quantity="0.00025" # see src/game1/game.toml
),
)
resources = kernel.resource_lib.get_stored_in_build(build.id)
assert resources
resource = resources.pop()
assert resource.id == "BRANCHES"
assert 0.00025 == resource.quantity
@pytest.mark.usefixtures("worldmapc_xena_wood")
async def test_deposit_resource_on_build_allowing_it_because_allow_all(
self,
worldmapc_kernel: Kernel,
build5: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
# Given
assert not kernel.resource_lib.get_stored_in_build(build5.id)
# When
await deposit_action.perform(
character=xena,
build_id=build5.id,
input_=DepositToModel(
deposit_resource_id="WOOD", deposit_resource_quantity="0.2"
),
)
# Then
assert kernel.resource_lib.get_stored_in_build(build5.id)
assert 1 == len(kernel.resource_lib.get_stored_in_build(build5.id))
assert kernel.resource_lib.get_stored_in_build(build5.id)[0].id == "WOOD"
@pytest.mark.usefixtures("worldmapc_xena_wood")
@pytest.mark.usefixtures("worldmapc_xena_stone")
async def test_deposit_resource_on_build_allowing_it_because_allow_limit(
self,
worldmapc_kernel: Kernel,
build6: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
# Given
assert not kernel.resource_lib.get_stored_in_build(build6.id)
# When
with pytest.raises(ImpossibleAction):
await deposit_action.perform(
character=xena,
build_id=build6.id,
input_=DepositToModel(
deposit_resource_id="WOOD", deposit_resource_quantity="0.2"
),
)
await deposit_action.perform(
character=xena,
build_id=build6.id,
input_=DepositToModel(
deposit_resource_id="STONE", deposit_resource_quantity="2"
),
)
# Then
assert kernel.resource_lib.get_stored_in_build(build6.id)
assert 1 == len(kernel.resource_lib.get_stored_in_build(build6.id))
assert kernel.resource_lib.get_stored_in_build(build6.id)[0].id == "STONE"
@pytest.mark.usefixtures("worldmapc_xena_stone")
async def test_deposit_resource_on_build_refusing_it_because_not_allow(
self,
worldmapc_kernel: Kernel,
build4: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
# Given
assert not kernel.resource_lib.get_stored_in_build(build4.id)
# When
with pytest.raises(ImpossibleAction):
await deposit_action.perform(
character=xena,
build_id=build4.id,
input_=DepositToModel(
deposit_resource_id="STONE", deposit_resource_quantity="2"
),
)
# Then
assert not kernel.resource_lib.get_stored_in_build(build4.id)
async def test_deposit_stuff_on_build_allowing_it_because_allow_all(
self,
worldmapc_kernel: Kernel,
build5: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
worldmapc_xena_haxe_weapon: StuffModel,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
haxe = worldmapc_xena_haxe_weapon
# Given
assert not kernel.resource_lib.get_stored_in_build(build5.id)
# When
await deposit_action.perform(
character=xena,
build_id=build5.id,
input_=DepositToModel(
                deposit_stuff_id=haxe.id, deposit_stuff_quantity=1
),
)
# Then
assert kernel.stuff_lib.get_from_build(build5.id)
assert 1 == len(kernel.stuff_lib.get_from_build(build5.id))
assert kernel.stuff_lib.get_from_build(build5.id)[0].id == haxe.id
async def test_deposit_stuff_on_build_refusing_it_because_allow_limit(
self,
worldmapc_kernel: Kernel,
build6: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
worldmapc_xena_haxe_weapon: StuffModel,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
haxe = worldmapc_xena_haxe_weapon
# Given
assert not kernel.resource_lib.get_stored_in_build(build6.id)
# When
with pytest.raises(ImpossibleAction):
await deposit_action.perform(
character=xena,
build_id=build6.id,
input_=DepositToModel(
deposit_stuff_id=haxe.id, deposit_stuff_quantity=1
),
)
# Then
assert not kernel.stuff_lib.get_from_build(build6.id)
async def test_deposit_stuff_on_build_refusing_it_because_not_allow(
self,
worldmapc_kernel: Kernel,
build4: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
worldmapc_xena_haxe_weapon: StuffModel,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
haxe = worldmapc_xena_haxe_weapon
# Given
assert not kernel.resource_lib.get_stored_in_build(build4.id)
# When
with pytest.raises(ImpossibleAction):
await deposit_action.perform(
character=xena,
build_id=build4.id,
input_=DepositToModel(
deposit_stuff_id=haxe.id, deposit_stuff_quantity=1
),
)
# Then
assert not kernel.stuff_lib.get_from_build(build4.id)
@pytest.mark.usefixtures("worldmapc_xena_wood")
async def test_take_resource_from_build(
self,
worldmapc_kernel: Kernel,
build5: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
take_action: TakeFromBuildAction,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
# Given
await deposit_action.perform(
character=xena,
build_id=build5.id,
input_=DepositToModel(
deposit_resource_id="WOOD", deposit_resource_quantity="0.2"
),
)
assert kernel.resource_lib.get_stored_in_build(build5.id)
# When
await take_action.perform(
character=xena,
build_id=build5.id,
input_=TakeFromModel(take_resource_id="WOOD", take_resource_quantity="0.2"),
)
# Then
assert not kernel.resource_lib.get_stored_in_build(build5.id)
async def test_take_stuff_from_build(
self,
worldmapc_kernel: Kernel,
build5: BuildDocument,
worldmapc_xena_model: CharacterModel,
deposit_action: DepositToBuildAction,
take_action: TakeFromBuildAction,
worldmapc_xena_haxe_weapon: StuffModel,
) -> None:
kernel = worldmapc_kernel
xena = worldmapc_xena_model
haxe = worldmapc_xena_haxe_weapon
# Given
await deposit_action.perform(
character=xena,
build_id=build5.id,
input_=DepositToModel(deposit_stuff_id=haxe.id, deposit_stuff_quantity=1),
)
assert kernel.stuff_lib.get_from_build(build5.id)
# When
await take_action.perform(
character=xena,
build_id=build5.id,
input_=TakeFromModel(take_stuff_id=haxe.id, take_stuff_quantity=1),
)
# Then
assert not kernel.stuff_lib.get_from_build(build5.id)
| 32.606654 | 88 | 0.646141 | 1,802 | 16,662 | 5.633185 | 0.081576 | 0.049946 | 0.046104 | 0.033494 | 0.861196 | 0.83844 | 0.790661 | 0.790267 | 0.787115 | 0.745345 | 0 | 0.014694 | 0.281119 | 16,662 | 510 | 89 | 32.670588 | 0.832777 | 0.028508 | 0 | 0.72381 | 0 | 0 | 0.050777 | 0.018391 | 0 | 0 | 0 | 0 | 0.092857 | 1 | 0.02619 | false | 0 | 0.038095 | 0.004762 | 0.080952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c7a334157090708f2ac29b1773d5e308550dd903 | 2,529 | py | Python | backend/tradersplatform/wallet/migrations/0001_initial.py | ybedirhanpak/bounswe2019group1 | 9572fd307345b3f842c2c2ff4426857086484ed5 | [
"MIT"
] | 10 | 2019-02-14T14:53:49.000Z | 2019-10-23T08:03:39.000Z | backend/tradersplatform/wallet/migrations/0001_initial.py | ybedirhanpak/bounswe2019group1 | 9572fd307345b3f842c2c2ff4426857086484ed5 | [
"MIT"
] | 364 | 2019-02-14T14:50:12.000Z | 2022-02-10T13:43:09.000Z | backend/tradersplatform/wallet/migrations/0001_initial.py | bounswe/bounswe2019group1 | 9572fd307345b3f842c2c2ff4426857086484ed5 | [
"MIT"
] | 8 | 2019-05-05T20:04:31.000Z | 2020-12-24T16:44:54.000Z | # Generated by Django 2.2.6 on 2019-11-23 12:08
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('myuser', '0005_templateuser_is_public'),
]
operations = [
migrations.CreateModel(
name='Wallet',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('USD', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('Sent_USD', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('Wealth_USD', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('BTC', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('ETH', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('LTC', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('XAG', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('XAU', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('XRH', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('GOOGL', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('AAPL', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('GM', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('EUR', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('GBP', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('TRY', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('SPY', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('IVV', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('VTI', models.DecimalField(blank=True, decimal_places=10, max_digits=20, null=True)),
('owner', models.ForeignKey(default='', on_delete=django.db.models.deletion.CASCADE, related_name='owner', to='myuser.TemplateUser')),
],
),
]
| 60.214286 | 150 | 0.637802 | 308 | 2,529 | 5.087662 | 0.243506 | 0.206765 | 0.264199 | 0.310147 | 0.706445 | 0.706445 | 0.706445 | 0.706445 | 0.706445 | 0.706445 | 0 | 0.045775 | 0.213919 | 2,529 | 41 | 151 | 61.682927 | 0.742455 | 0.017794 | 0 | 0 | 1 | 0 | 0.056406 | 0.010878 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
c7b8282296dc817aa28d6bf1624de767d654ad67 | 3,635 | py | Python | tests/test_push.py | ProzorroUKR/prozorro_chronograph | f8a560322259b5bb07035b133f545a614130de73 | [
"Apache-2.0"
] | null | null | null | tests/test_push.py | ProzorroUKR/prozorro_chronograph | f8a560322259b5bb07035b133f545a614130de73 | [
"Apache-2.0"
] | null | null | null | tests/test_push.py | ProzorroUKR/prozorro_chronograph | f8a560322259b5bb07035b133f545a614130de73 | [
"Apache-2.0"
] | null | null | null | from uuid import uuid4
from datetime import timedelta
from unittest.mock import patch, AsyncMock
from prozorro_chronograph.utils import get_now
from prozorro_chronograph.scheduler import push
from .base import BaseTenderTest
class TestTenderPush(BaseTenderTest):
@patch("prozorro_chronograph.scheduler.asyncio.sleep")
@patch("prozorro_chronograph.scheduler.get_feed_position", AsyncMock(return_value={"server_id": "value"}))
@patch("prozorro_chronograph.scheduler.LOGGER.error")
async def test_push_recheck_mode(self, mock_logger_error, *args):
mode = "recheck"
tender_id = uuid4().hex
server_id = "value"
return_recheck_tender = (get_now() + timedelta(minutes=1)).isoformat()
with patch("prozorro_chronograph.scheduler.recheck_tender",
AsyncMock(side_effect=[Exception(), return_recheck_tender])) as mock_recheck_tender:
await push(mode, tender_id, server_id)
mock_logger_error.assert_called_once_with(f"Error on {mode} tender {tender_id}: {repr(Exception())}")
mock_recheck_tender.assert_called_with(tender_id)
@patch("prozorro_chronograph.scheduler.asyncio.sleep")
@patch("prozorro_chronograph.scheduler.get_feed_position", AsyncMock(return_value={"server_id": "value"}))
@patch("prozorro_chronograph.scheduler.LOGGER.error")
async def test_push_resync_mode(self, mock_logger_error, *args):
mode = "resync"
tender_id = uuid4().hex
server_id = "value"
return_resync_tender = (get_now() + timedelta(minutes=1)).isoformat()
with patch("prozorro_chronograph.scheduler.resync_tender",
AsyncMock(side_effect=[Exception(), return_resync_tender])) as mock_resync_tender:
await push(mode, tender_id, server_id)
mock_logger_error.assert_called_once_with(f"Error on {mode} tender {tender_id}: {repr(Exception())}")
mock_resync_tender.assert_called_with(tender_id)
@patch("prozorro_chronograph.scheduler.asyncio.sleep")
@patch("prozorro_chronograph.scheduler.get_feed_position", AsyncMock(return_value={"server_id": "value"}))
@patch("prozorro_chronograph.scheduler.SESSION.cookie_jar.update_cookies")
async def test_push_resync_server_id_none_with_feed_position(self, mock_update_cookies, *args):
mode = "resync"
tender_id = uuid4().hex
server_id = None
return_resync_tender = (get_now() + timedelta(minutes=1)).isoformat()
with patch("prozorro_chronograph.scheduler.resync_tender",
AsyncMock(return_value=return_resync_tender)) as mock_resync_tender:
await push(mode, tender_id, server_id)
mock_update_cookies.assert_called_with({"SERVER_ID": "value"})
mock_resync_tender.assert_called_with(tender_id)
@patch("prozorro_chronograph.scheduler.asyncio.sleep")
@patch("prozorro_chronograph.scheduler.get_feed_position", AsyncMock(return_value=None))
@patch("prozorro_chronograph.scheduler.SESSION.cookie_jar.update_cookies")
async def test_push_resync_server_id_none_without_feed_position(self, mock_update_cookies, *args):
mode = "resync"
tender_id = uuid4().hex
server_id = None
return_resync_tender = (get_now() + timedelta(minutes=1)).isoformat()
with patch("prozorro_chronograph.scheduler.resync_tender",
AsyncMock(return_value=return_resync_tender)) as mock_resync_tender:
await push(mode, tender_id, server_id)
mock_update_cookies.assert_called_with({"SERVER_ID": None})
mock_resync_tender.assert_called_with(tender_id)
| 54.253731 | 110 | 0.730949 | 444 | 3,635 | 5.619369 | 0.141892 | 0.137074 | 0.190782 | 0.211623 | 0.875752 | 0.873347 | 0.849699 | 0.828056 | 0.795992 | 0.780762 | 0 | 0.002964 | 0.164787 | 3,635 | 66 | 111 | 55.075758 | 0.818841 | 0 | 0 | 0.661017 | 0 | 0 | 0.266575 | 0.208803 | 0 | 0 | 0 | 0 | 0.135593 | 1 | 0 | false | 0 | 0.101695 | 0 | 0.118644 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c7edf043979693a438ba428e3a66ba9eac283aae | 852 | py | Python | Unit1_hw_assignments/chapter3_assignment.py | drewb101/LC101 | 9c6d71a9127f9e568f7c85d5c8f7824bff19c34f | [
"Unlicense"
] | null | null | null | Unit1_hw_assignments/chapter3_assignment.py | drewb101/LC101 | 9c6d71a9127f9e568f7c85d5c8f7824bff19c34f | [
"Unlicense"
] | null | null | null | Unit1_hw_assignments/chapter3_assignment.py | drewb101/LC101 | 9c6d71a9127f9e568f7c85d5c8f7824bff19c34f | [
"Unlicense"
] | null | null | null | # For each click variable, calculate the temperature and print it as shown in the instructions
click_1 = 0
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_1 % 50) + 40)
click_2 = 49
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_2 % 50) + 40)
click_3 = 74
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_3 % 50) + 40)
click_4 = 51
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_4 % 50) + 40)
click_5 = -1
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_5 % 50) + 40)
click_6 = 200
# TODO calculate the temperature, and report it back to the user
print("The temperature is", (click_6 + 40) % 50) | 34.08 | 94 | 0.732394 | 145 | 852 | 4.22069 | 0.234483 | 0.297386 | 0.263072 | 0.297386 | 0.754902 | 0.754902 | 0.754902 | 0.754902 | 0.754902 | 0.754902 | 0 | 0.061693 | 0.181925 | 852 | 25 | 95 | 34.08 | 0.816356 | 0.551643 | 0 | 0 | 0 | 0 | 0.288 | 0 | 0 | 0 | 0 | 0.04 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
403cec6f3e6e912017b6f0dc684eabe3092845ea | 3,786 | py | Python | tests/unit_tests/homeassistant/test_battery_sensor.py | ehendrix23/teslajsonpy | 7e5a86acc053df4e990bfece4db37d2cbb6ac5e0 | [
"Apache-2.0"
] | null | null | null | tests/unit_tests/homeassistant/test_battery_sensor.py | ehendrix23/teslajsonpy | 7e5a86acc053df4e990bfece4db37d2cbb6ac5e0 | [
"Apache-2.0"
] | null | null | null | tests/unit_tests/homeassistant/test_battery_sensor.py | ehendrix23/teslajsonpy | 7e5a86acc053df4e990bfece4db37d2cbb6ac5e0 | [
"Apache-2.0"
] | null | null | null | """Test battery sensor."""
import pytest
from teslajsonpy.controller import Controller
from teslajsonpy.homeassistant.battery_sensor import Battery
from tests.tesla_mock import TeslaMock, VIN, CAR_ID
def test_has_battery(monkeypatch):
"""Test has_battery()."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_data = _mock.data_request_vehicle()
_sensor = Battery(_data, _controller)
assert _sensor.has_battery()
def test_device_class(monkeypatch):
"""Test device_class()."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_data = _mock.data_request_vehicle()
_sensor = Battery(_data, _controller)
assert _sensor.device_class == "battery"
def test_get_value_on_init(monkeypatch):
"""Test get_value() after initialization."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_data = _mock.data_request_vehicle()
_sensor = Battery(_data, _controller)
assert _sensor is not None
assert _sensor.get_value() is None
@pytest.mark.asyncio
async def test_get_value_after_update(monkeypatch):
"""Test get_value() after an update."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_controller.set_id_vin(CAR_ID, VIN)
_data = _mock.data_request_vehicle()
_sensor = Battery(_data, _controller)
_controller.set_charging_params(vin=VIN, params=_data["charge_state"])
await _sensor.async_update()
assert _sensor is not None
    assert _sensor.get_value() is not None
assert _sensor.get_value() == 64
@pytest.mark.asyncio
async def test_battery_level(monkeypatch):
"""Test battery_level()."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_controller.set_id_vin(CAR_ID, VIN)
_data = _mock.data_request_vehicle()
_sensor = Battery(_data, _controller)
_controller.set_charging_params(vin=VIN, params=_data["charge_state"])
await _sensor.async_update()
assert _sensor is not None
    assert _sensor.get_value() is not None
assert _sensor.battery_level() == 64
@pytest.mark.asyncio
async def test_battery_charging_off(monkeypatch):
"""Test battery_charging() when not charging."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_controller.set_id_vin(CAR_ID, VIN)
_data = _mock.data_request_vehicle()
_data["charge_state"]["charging_state"] = "Disconnected"
_sensor = Battery(_data, _controller)
_controller.set_charging_params(vin=VIN, params=_data["charge_state"])
await _sensor.async_update()
assert _sensor is not None
assert not _sensor.battery_charging()
@pytest.mark.asyncio
async def test_battery_charging_on(monkeypatch):
"""Test battery_charging() when charging."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_controller.set_id_vin(CAR_ID, VIN)
_data = _mock.data_request_vehicle()
_data["charge_state"]["charging_state"] = "Charging"
_sensor = Battery(_data, _controller)
_controller.set_charging_params(vin=VIN, params=_data["charge_state"])
await _sensor.async_update()
assert _sensor is not None
assert _sensor.battery_charging()
@pytest.mark.asyncio
async def test_async_update(monkeypatch):
"""Test async_update()."""
_mock = TeslaMock(monkeypatch)
_controller = Controller(None)
_controller.set_id_vin(CAR_ID, VIN)
_data = _mock.data_request_vehicle()
_data["charge_state"]["battery_level"] = 12.3
_sensor = Battery(_data, _controller)
_controller.set_charging_params(vin=VIN, params=_data["charge_state"])
await _sensor.async_update()
assert _sensor is not None
    assert _sensor.get_value() is not None
assert _sensor.get_value() == 12.3
| 25.755102 | 74 | 0.726889 | 461 | 3,786 | 5.542299 | 0.112798 | 0.101761 | 0.075147 | 0.106458 | 0.821526 | 0.767515 | 0.756164 | 0.747945 | 0.709198 | 0.672407 | 0 | 0.003178 | 0.16878 | 3,786 | 146 | 75 | 25.931507 | 0.808707 | 0.026413 | 0 | 0.72619 | 0 | 0 | 0.047113 | 0 | 0 | 0 | 0 | 0 | 0.202381 | 1 | 0.035714 | false | 0 | 0.047619 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4042bf9886b4b0edcbd2ea163dd166d977d866a1 | 1,528 | py | Python | coop_cms/migrations/0002_auto_20160108_1628.py | ljean/coop_cms | 531f65ceb9ad82c113597d15b764dbcf51264794 | [
"BSD-3-Clause"
] | 3 | 2016-01-29T10:55:09.000Z | 2022-03-08T16:02:12.000Z | coop_cms/migrations/0002_auto_20160108_1628.py | ljean/coop_cms | 531f65ceb9ad82c113597d15b764dbcf51264794 | [
"BSD-3-Clause"
] | 11 | 2015-03-07T17:30:24.000Z | 2016-07-13T09:40:43.000Z | coop_cms/migrations/0002_auto_20160108_1628.py | ljean/coop_cms | 531f65ceb9ad82c113597d15b764dbcf51264794 | [
"BSD-3-Clause"
] | 5 | 2018-08-30T09:03:22.000Z | 2019-09-10T13:01:56.000Z | # -*- coding: utf-8 -*-
from django.db import migrations, models
import django_extensions.db.fields
class Migration(migrations.Migration):
dependencies = [
('coop_cms', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='document',
name='created',
field=django_extensions.db.fields.CreationDateTimeField(auto_now_add=True, verbose_name='created'),
),
migrations.AlterField(
model_name='document',
name='modified',
field=django_extensions.db.fields.ModificationDateTimeField(auto_now=True, verbose_name='modified'),
),
migrations.AlterField(
model_name='image',
name='created',
field=django_extensions.db.fields.CreationDateTimeField(auto_now_add=True, verbose_name='created'),
),
migrations.AlterField(
model_name='image',
name='modified',
field=django_extensions.db.fields.ModificationDateTimeField(auto_now=True, verbose_name='modified'),
),
migrations.AlterField(
model_name='link',
name='created',
field=django_extensions.db.fields.CreationDateTimeField(auto_now_add=True, verbose_name='created'),
),
migrations.AlterField(
model_name='link',
name='modified',
field=django_extensions.db.fields.ModificationDateTimeField(auto_now=True, verbose_name='modified'),
),
]
| 33.955556 | 112 | 0.623037 | 141 | 1,528 | 6.539007 | 0.248227 | 0.121475 | 0.136659 | 0.182213 | 0.840564 | 0.840564 | 0.772234 | 0.772234 | 0.772234 | 0.772234 | 0 | 0.004437 | 0.262435 | 1,528 | 44 | 113 | 34.727273 | 0.813665 | 0.013743 | 0 | 0.789474 | 0 | 0 | 0.095681 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.131579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
405736648051e2ab206f4c9302972d42481e323c | 5,179 | py | Python | tests/opkg_conf.py | francois-berder/boardfarm | 7c0c8397e4e0f4ccd97ec88e7d737086bdde14d1 | [
"BSD-3-Clause-Clear"
] | null | null | null | tests/opkg_conf.py | francois-berder/boardfarm | 7c0c8397e4e0f4ccd97ec88e7d737086bdde14d1 | [
"BSD-3-Clause-Clear"
] | null | null | null | tests/opkg_conf.py | francois-berder/boardfarm | 7c0c8397e4e0f4ccd97ec88e7d737086bdde14d1 | [
"BSD-3-Clause-Clear"
] | null | null | null | # Copyright (c) 2015
#
# All rights reserved.
#
# This file is distributed under the Clear BSD license.
# The full text can be found in LICENSE in the root directory.
import time
import rootfs_boot
from devices import board, wan, lan, wlan, prompt
class OpkgConfUpdateMD5(rootfs_boot.RootFSBootTest):
'''Check that opkg will overwrite old configuration files with known MD5.'''
def runTest(self):
board.sendline('\ncp /etc/ulogd.conf /etc/ulogd.conf.bak')
board.expect(prompt)
board.sendline('echo boardfarmteststring > /etc/ulogd.conf')
board.expect(prompt)
board.sendline('touch -r /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
board.sendline('sed -i "s|/etc/ulogd.conf .*|/etc/ulogd.conf 48b1215c8d419a33818fc1f42c118aed|" /usr/lib/opkg/status')
board.expect(prompt)
board.sendline('opkg install --force-reinstall ulogd')
board.expect(prompt)
board.sendline('grep boardfarmteststring /etc/ulogd.conf || uname')
board.expect('Linux')
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
def recover(self):
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
class OpkgConfNotUpdateMD5(rootfs_boot.RootFSBootTest):
    '''Check that opkg will not overwrite old modified configuration files with unknown MD5.'''
def runTest(self):
board.sendline('\ncp /etc/ulogd.conf /etc/ulogd.conf.bak')
board.expect(prompt)
board.sendline('echo boardfarmteststring > /etc/ulogd.conf')
board.expect(prompt)
board.sendline('touch -r /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
board.sendline('sed -i "s|/etc/ulogd.conf .*|/etc/ulogd.conf aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa|" /usr/lib/opkg/status')
board.expect(prompt)
board.sendline('opkg install --force-reinstall ulogd')
board.expect('Existing conffile /etc/ulogd.conf is different from the conffile in the new package')
board.expect(prompt)
board.sendline('grep boardfarmteststring /etc/ulogd.conf && uname')
#board.sendline('grep boardfarmteststring /etc/ulogd.conf && uname')
board.expect('Linux')
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
def recover(self):
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
class OpkgConfUpdateSHA256(rootfs_boot.RootFSBootTest):
'''Check that opkg will overwrite old configuration files with known MD5.'''
def runTest(self):
board.sendline('\ncp /etc/ulogd.conf /etc/ulogd.conf.bak')
board.expect(prompt)
board.sendline('echo boardfarmteststring > /etc/ulogd.conf')
board.expect(prompt)
board.sendline('touch -r /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
board.sendline('sed -i "s|/etc/ulogd.conf .*|/etc/ulogd.conf 4212ae0a86f553b7aac741a734a0b973193a9fbe179b28a5d8c2a50cc51e25f0|" /usr/lib/opkg/status')
board.expect(prompt)
board.sendline('opkg install --force-reinstall ulogd')
board.expect(prompt)
#board.sendline('grep boardfarmteststring /etc/ulogd.conf || uname')
board.sendline('grep boardfarmteststring /etc/ulogd.conf || uname')
board.expect('Linux')
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
def recover(self):
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
class OpkgConfNotUpdateSHA256(rootfs_boot.RootFSBootTest):
'''Check that opkg will not overwrite old modified configuration files with known MD5.'''
def runTest(self):
board.sendline('\ncp /etc/ulogd.conf /etc/ulogd.conf.bak')
board.expect(prompt)
board.sendline('echo boardfarmteststring > /etc/ulogd.conf')
board.expect(prompt)
board.sendline('touch -r /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
board.sendline('sed -i "s|/etc/ulogd.conf .*|/etc/ulogd.conf aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa|" /usr/lib/opkg/status')
board.expect(prompt)
board.sendline('opkg install --force-reinstall ulogd')
#board.expect('Existing conffile /etc/ulogd.conf is different from the conffile in the new package')
board.expect('Existing conffile /etc/ulogd.conf is different from the conffile in the new package')
board.expect(prompt)
board.sendline('grep boardfarmteststring /etc/ulogd.conf && uname')
#board.sendline('grep boardfarmteststring /etc/ulogd.conf && uname')
board.expect('Linux')
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
def recover(self):
board.sendline('\nmv /etc/ulogd.conf.bak /etc/ulogd.conf')
board.expect(prompt)
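All four tests above exercise the same opkg rule: a conffile is only overwritten on reinstall when its on-disk checksum still matches the one recorded in /usr/lib/opkg/status. A minimal host-side sketch of that check (the function name is ours, and `hashlib` stands in for opkg's internal conffile comparison):

```python
import hashlib
import tempfile

def conffile_unchanged(path, recorded_hexdigest, algo="md5"):
    """Return True when the conffile on disk still matches the checksum
    recorded in the opkg status file, i.e. opkg may safely overwrite it."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        h.update(f.read())
    return h.hexdigest() == recorded_hexdigest

# A pristine file matches its recorded digest; an edited one does not.
with tempfile.NamedTemporaryFile(mode="w", suffix=".conf", delete=False) as f:
    f.write("original contents\n")
    path = f.name

recorded = hashlib.md5(b"original contents\n").hexdigest()
assert conffile_unchanged(path, recorded)
assert not conffile_unchanged(path, "a" * 32)
```

The tests fake both sides of this comparison: they edit the file (changing the real digest) and `sed` the status file (changing the recorded digest).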
# --- next file: sdk/python/pulumi_oci/kms/key_version.py (repo: EladGabay/pulumi-oci, licenses: ECL-2.0, Apache-2.0) ---
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['KeyVersionArgs', 'KeyVersion']
@pulumi.input_type
class KeyVersionArgs:
    def __init__(__self__, *,
                 key_id: pulumi.Input[str],
                 management_endpoint: pulumi.Input[str],
                 time_of_deletion: Optional[pulumi.Input[str]] = None):
        """
        The set of arguments for constructing a KeyVersion resource.
        :param pulumi.Input[str] key_id: The OCID of the key.
        :param pulumi.Input[str] management_endpoint: The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        :param pulumi.Input[str] time_of_deletion: (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        """
        pulumi.set(__self__, "key_id", key_id)
        pulumi.set(__self__, "management_endpoint", management_endpoint)
        if time_of_deletion is not None:
            pulumi.set(__self__, "time_of_deletion", time_of_deletion)

    @property
    @pulumi.getter(name="keyId")
    def key_id(self) -> pulumi.Input[str]:
        """
        The OCID of the key.
        """
        return pulumi.get(self, "key_id")

    @key_id.setter
    def key_id(self, value: pulumi.Input[str]):
        pulumi.set(self, "key_id", value)

    @property
    @pulumi.getter(name="managementEndpoint")
    def management_endpoint(self) -> pulumi.Input[str]:
        """
        The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        """
        return pulumi.get(self, "management_endpoint")

    @management_endpoint.setter
    def management_endpoint(self, value: pulumi.Input[str]):
        pulumi.set(self, "management_endpoint", value)

    @property
    @pulumi.getter(name="timeOfDeletion")
    def time_of_deletion(self) -> Optional[pulumi.Input[str]]:
        """
        (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        """
        return pulumi.get(self, "time_of_deletion")

    @time_of_deletion.setter
    def time_of_deletion(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "time_of_deletion", value)
@pulumi.input_type
class _KeyVersionState:
    def __init__(__self__, *,
                 compartment_id: Optional[pulumi.Input[str]] = None,
                 is_primary: Optional[pulumi.Input[bool]] = None,
                 key_id: Optional[pulumi.Input[str]] = None,
                 key_version_id: Optional[pulumi.Input[str]] = None,
                 management_endpoint: Optional[pulumi.Input[str]] = None,
                 public_key: Optional[pulumi.Input[str]] = None,
                 replica_details: Optional[pulumi.Input['KeyVersionReplicaDetailsArgs']] = None,
                 restored_from_key_id: Optional[pulumi.Input[str]] = None,
                 restored_from_key_version_id: Optional[pulumi.Input[str]] = None,
                 state: Optional[pulumi.Input[str]] = None,
                 time_created: Optional[pulumi.Input[str]] = None,
                 time_of_deletion: Optional[pulumi.Input[str]] = None,
                 vault_id: Optional[pulumi.Input[str]] = None):
        """
        Input properties used for looking up and filtering KeyVersion resources.
        :param pulumi.Input[str] compartment_id: The OCID of the compartment that contains this key version.
        :param pulumi.Input[bool] is_primary: A boolean that will be true when key version is primary, and will be false when key version is a replica from a primary key version.
        :param pulumi.Input[str] key_id: The OCID of the key.
        :param pulumi.Input[str] management_endpoint: The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        :param pulumi.Input[str] public_key: The public key in PEM format. (This value pertains only to RSA and ECDSA keys.)
        :param pulumi.Input['KeyVersionReplicaDetailsArgs'] replica_details: KeyVersion replica details
        :param pulumi.Input[str] restored_from_key_version_id: The OCID of the key version from which this key version was restored.
        :param pulumi.Input[str] state: The key version's current lifecycle state. Example: `ENABLED`
        :param pulumi.Input[str] time_created: The date and time this key version was created, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: "2018-04-03T21:10:29.600Z"
        :param pulumi.Input[str] time_of_deletion: (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        :param pulumi.Input[str] vault_id: The OCID of the vault that contains this key version.
        """
        if compartment_id is not None:
            pulumi.set(__self__, "compartment_id", compartment_id)
        if is_primary is not None:
            pulumi.set(__self__, "is_primary", is_primary)
        if key_id is not None:
            pulumi.set(__self__, "key_id", key_id)
        if key_version_id is not None:
            pulumi.set(__self__, "key_version_id", key_version_id)
        if management_endpoint is not None:
            pulumi.set(__self__, "management_endpoint", management_endpoint)
        if public_key is not None:
            pulumi.set(__self__, "public_key", public_key)
        if replica_details is not None:
            pulumi.set(__self__, "replica_details", replica_details)
        if restored_from_key_id is not None:
            pulumi.set(__self__, "restored_from_key_id", restored_from_key_id)
        if restored_from_key_version_id is not None:
            pulumi.set(__self__, "restored_from_key_version_id", restored_from_key_version_id)
        if state is not None:
            pulumi.set(__self__, "state", state)
        if time_created is not None:
            pulumi.set(__self__, "time_created", time_created)
        if time_of_deletion is not None:
            pulumi.set(__self__, "time_of_deletion", time_of_deletion)
        if vault_id is not None:
            pulumi.set(__self__, "vault_id", vault_id)

    @property
    @pulumi.getter(name="compartmentId")
    def compartment_id(self) -> Optional[pulumi.Input[str]]:
        """
        The OCID of the compartment that contains this key version.
        """
        return pulumi.get(self, "compartment_id")

    @compartment_id.setter
    def compartment_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "compartment_id", value)

    @property
    @pulumi.getter(name="isPrimary")
    def is_primary(self) -> Optional[pulumi.Input[bool]]:
        """
        A boolean that will be true when key version is primary, and will be false when key version is a replica from a primary key version.
        """
        return pulumi.get(self, "is_primary")

    @is_primary.setter
    def is_primary(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "is_primary", value)

    @property
    @pulumi.getter(name="keyId")
    def key_id(self) -> Optional[pulumi.Input[str]]:
        """
        The OCID of the key.
        """
        return pulumi.get(self, "key_id")

    @key_id.setter
    def key_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "key_id", value)

    @property
    @pulumi.getter(name="keyVersionId")
    def key_version_id(self) -> Optional[pulumi.Input[str]]:
        return pulumi.get(self, "key_version_id")

    @key_version_id.setter
    def key_version_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "key_version_id", value)

    @property
    @pulumi.getter(name="managementEndpoint")
    def management_endpoint(self) -> Optional[pulumi.Input[str]]:
        """
        The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        """
        return pulumi.get(self, "management_endpoint")

    @management_endpoint.setter
    def management_endpoint(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "management_endpoint", value)

    @property
    @pulumi.getter(name="publicKey")
    def public_key(self) -> Optional[pulumi.Input[str]]:
        """
        The public key in PEM format. (This value pertains only to RSA and ECDSA keys.)
        """
        return pulumi.get(self, "public_key")

    @public_key.setter
    def public_key(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "public_key", value)

    @property
    @pulumi.getter(name="replicaDetails")
    def replica_details(self) -> Optional[pulumi.Input['KeyVersionReplicaDetailsArgs']]:
        """
        KeyVersion replica details
        """
        return pulumi.get(self, "replica_details")

    @replica_details.setter
    def replica_details(self, value: Optional[pulumi.Input['KeyVersionReplicaDetailsArgs']]):
        pulumi.set(self, "replica_details", value)

    @property
    @pulumi.getter(name="restoredFromKeyId")
    def restored_from_key_id(self) -> Optional[pulumi.Input[str]]:
        return pulumi.get(self, "restored_from_key_id")

    @restored_from_key_id.setter
    def restored_from_key_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "restored_from_key_id", value)

    @property
    @pulumi.getter(name="restoredFromKeyVersionId")
    def restored_from_key_version_id(self) -> Optional[pulumi.Input[str]]:
        """
        The OCID of the key version from which this key version was restored.
        """
        return pulumi.get(self, "restored_from_key_version_id")

    @restored_from_key_version_id.setter
    def restored_from_key_version_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "restored_from_key_version_id", value)

    @property
    @pulumi.getter
    def state(self) -> Optional[pulumi.Input[str]]:
        """
        The key version's current lifecycle state. Example: `ENABLED`
        """
        return pulumi.get(self, "state")

    @state.setter
    def state(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "state", value)

    @property
    @pulumi.getter(name="timeCreated")
    def time_created(self) -> Optional[pulumi.Input[str]]:
        """
        The date and time this key version was created, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: "2018-04-03T21:10:29.600Z"
        """
        return pulumi.get(self, "time_created")

    @time_created.setter
    def time_created(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "time_created", value)

    @property
    @pulumi.getter(name="timeOfDeletion")
    def time_of_deletion(self) -> Optional[pulumi.Input[str]]:
        """
        (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        """
        return pulumi.get(self, "time_of_deletion")

    @time_of_deletion.setter
    def time_of_deletion(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "time_of_deletion", value)

    @property
    @pulumi.getter(name="vaultId")
    def vault_id(self) -> Optional[pulumi.Input[str]]:
        """
        The OCID of the vault that contains this key version.
        """
        return pulumi.get(self, "vault_id")

    @vault_id.setter
    def vault_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "vault_id", value)
class KeyVersion(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 key_id: Optional[pulumi.Input[str]] = None,
                 management_endpoint: Optional[pulumi.Input[str]] = None,
                 time_of_deletion: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        """
        This resource provides the Key Version resource in Oracle Cloud Infrastructure Kms service.

        Generates a new [KeyVersion](https://docs.cloud.oracle.com/iaas/api/#/en/key/latest/KeyVersion/) resource that provides new cryptographic
        material for a master encryption key. The key must be in an `ENABLED` state to be rotated.

        As a management operation, this call is subject to a Key Management limit that applies to the total number
        of requests across all management write operations. Key Management might throttle this call to reject an
        otherwise valid request when the total rate of management write operations exceeds 10 requests per second
        for a given tenancy.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_oci as oci

        test_key_version = oci.kms.KeyVersion("testKeyVersion",
            key_id=oci_kms_key["test_key"]["id"],
            management_endpoint=var["key_version_management_endpoint"])
        ```

        ## Import

        KeyVersions can be imported using the `id`, e.g.

        ```sh
        $ pulumi import oci:kms/keyVersion:KeyVersion test_key_version "managementEndpoint/{managementEndpoint}/keys/{keyId}/keyVersions/{keyVersionId}"
        ```

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] key_id: The OCID of the key.
        :param pulumi.Input[str] management_endpoint: The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        :param pulumi.Input[str] time_of_deletion: (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        """
        ...

    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: KeyVersionArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        This resource provides the Key Version resource in Oracle Cloud Infrastructure Kms service.

        Generates a new [KeyVersion](https://docs.cloud.oracle.com/iaas/api/#/en/key/latest/KeyVersion/) resource that provides new cryptographic
        material for a master encryption key. The key must be in an `ENABLED` state to be rotated.

        As a management operation, this call is subject to a Key Management limit that applies to the total number
        of requests across all management write operations. Key Management might throttle this call to reject an
        otherwise valid request when the total rate of management write operations exceeds 10 requests per second
        for a given tenancy.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_oci as oci

        test_key_version = oci.kms.KeyVersion("testKeyVersion",
            key_id=oci_kms_key["test_key"]["id"],
            management_endpoint=var["key_version_management_endpoint"])
        ```

        ## Import

        KeyVersions can be imported using the `id`, e.g.

        ```sh
        $ pulumi import oci:kms/keyVersion:KeyVersion test_key_version "managementEndpoint/{managementEndpoint}/keys/{keyId}/keyVersions/{keyVersionId}"
        ```

        :param str resource_name: The name of the resource.
        :param KeyVersionArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...

    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(KeyVersionArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 key_id: Optional[pulumi.Input[str]] = None,
                 management_endpoint: Optional[pulumi.Input[str]] = None,
                 time_of_deletion: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = KeyVersionArgs.__new__(KeyVersionArgs)

            if key_id is None and not opts.urn:
                raise TypeError("Missing required property 'key_id'")
            __props__.__dict__["key_id"] = key_id
            if management_endpoint is None and not opts.urn:
                raise TypeError("Missing required property 'management_endpoint'")
            __props__.__dict__["management_endpoint"] = management_endpoint
            __props__.__dict__["time_of_deletion"] = time_of_deletion
            __props__.__dict__["compartment_id"] = None
            __props__.__dict__["is_primary"] = None
            __props__.__dict__["key_version_id"] = None
            __props__.__dict__["public_key"] = None
            __props__.__dict__["replica_details"] = None
            __props__.__dict__["restored_from_key_id"] = None
            __props__.__dict__["restored_from_key_version_id"] = None
            __props__.__dict__["state"] = None
            __props__.__dict__["time_created"] = None
            __props__.__dict__["vault_id"] = None
        super(KeyVersion, __self__).__init__(
            'oci:kms/keyVersion:KeyVersion',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            compartment_id: Optional[pulumi.Input[str]] = None,
            is_primary: Optional[pulumi.Input[bool]] = None,
            key_id: Optional[pulumi.Input[str]] = None,
            key_version_id: Optional[pulumi.Input[str]] = None,
            management_endpoint: Optional[pulumi.Input[str]] = None,
            public_key: Optional[pulumi.Input[str]] = None,
            replica_details: Optional[pulumi.Input[pulumi.InputType['KeyVersionReplicaDetailsArgs']]] = None,
            restored_from_key_id: Optional[pulumi.Input[str]] = None,
            restored_from_key_version_id: Optional[pulumi.Input[str]] = None,
            state: Optional[pulumi.Input[str]] = None,
            time_created: Optional[pulumi.Input[str]] = None,
            time_of_deletion: Optional[pulumi.Input[str]] = None,
            vault_id: Optional[pulumi.Input[str]] = None) -> 'KeyVersion':
        """
        Get an existing KeyVersion resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] compartment_id: The OCID of the compartment that contains this key version.
        :param pulumi.Input[bool] is_primary: A boolean that will be true when key version is primary, and will be false when key version is a replica from a primary key version.
        :param pulumi.Input[str] key_id: The OCID of the key.
        :param pulumi.Input[str] management_endpoint: The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        :param pulumi.Input[str] public_key: The public key in PEM format. (This value pertains only to RSA and ECDSA keys.)
        :param pulumi.Input[pulumi.InputType['KeyVersionReplicaDetailsArgs']] replica_details: KeyVersion replica details
        :param pulumi.Input[str] restored_from_key_version_id: The OCID of the key version from which this key version was restored.
        :param pulumi.Input[str] state: The key version's current lifecycle state. Example: `ENABLED`
        :param pulumi.Input[str] time_created: The date and time this key version was created, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: "2018-04-03T21:10:29.600Z"
        :param pulumi.Input[str] time_of_deletion: (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        :param pulumi.Input[str] vault_id: The OCID of the vault that contains this key version.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _KeyVersionState.__new__(_KeyVersionState)

        __props__.__dict__["compartment_id"] = compartment_id
        __props__.__dict__["is_primary"] = is_primary
        __props__.__dict__["key_id"] = key_id
        __props__.__dict__["key_version_id"] = key_version_id
        __props__.__dict__["management_endpoint"] = management_endpoint
        __props__.__dict__["public_key"] = public_key
        __props__.__dict__["replica_details"] = replica_details
        __props__.__dict__["restored_from_key_id"] = restored_from_key_id
        __props__.__dict__["restored_from_key_version_id"] = restored_from_key_version_id
        __props__.__dict__["state"] = state
        __props__.__dict__["time_created"] = time_created
        __props__.__dict__["time_of_deletion"] = time_of_deletion
        __props__.__dict__["vault_id"] = vault_id
        return KeyVersion(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter(name="compartmentId")
    def compartment_id(self) -> pulumi.Output[str]:
        """
        The OCID of the compartment that contains this key version.
        """
        return pulumi.get(self, "compartment_id")

    @property
    @pulumi.getter(name="isPrimary")
    def is_primary(self) -> pulumi.Output[bool]:
        """
        A boolean that will be true when key version is primary, and will be false when key version is a replica from a primary key version.
        """
        return pulumi.get(self, "is_primary")

    @property
    @pulumi.getter(name="keyId")
    def key_id(self) -> pulumi.Output[str]:
        """
        The OCID of the key.
        """
        return pulumi.get(self, "key_id")

    @property
    @pulumi.getter(name="keyVersionId")
    def key_version_id(self) -> pulumi.Output[str]:
        return pulumi.get(self, "key_version_id")

    @property
    @pulumi.getter(name="managementEndpoint")
    def management_endpoint(self) -> pulumi.Output[str]:
        """
        The service endpoint to perform management operations against. Management operations include 'Create,' 'Update,' 'List,' 'Get,' and 'Delete' operations. See Vault Management endpoint.
        """
        return pulumi.get(self, "management_endpoint")

    @property
    @pulumi.getter(name="publicKey")
    def public_key(self) -> pulumi.Output[str]:
        """
        The public key in PEM format. (This value pertains only to RSA and ECDSA keys.)
        """
        return pulumi.get(self, "public_key")

    @property
    @pulumi.getter(name="replicaDetails")
    def replica_details(self) -> pulumi.Output['outputs.KeyVersionReplicaDetails']:
        """
        KeyVersion replica details
        """
        return pulumi.get(self, "replica_details")

    @property
    @pulumi.getter(name="restoredFromKeyId")
    def restored_from_key_id(self) -> pulumi.Output[str]:
        return pulumi.get(self, "restored_from_key_id")

    @property
    @pulumi.getter(name="restoredFromKeyVersionId")
    def restored_from_key_version_id(self) -> pulumi.Output[str]:
        """
        The OCID of the key version from which this key version was restored.
        """
        return pulumi.get(self, "restored_from_key_version_id")

    @property
    @pulumi.getter
    def state(self) -> pulumi.Output[str]:
        """
        The key version's current lifecycle state. Example: `ENABLED`
        """
        return pulumi.get(self, "state")

    @property
    @pulumi.getter(name="timeCreated")
    def time_created(self) -> pulumi.Output[str]:
        """
        The date and time this key version was created, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: "2018-04-03T21:10:29.600Z"
        """
        return pulumi.get(self, "time_created")

    @property
    @pulumi.getter(name="timeOfDeletion")
    def time_of_deletion(self) -> pulumi.Output[str]:
        """
        (Updatable) An optional property for the deletion time of the key version, expressed in [RFC 3339](https://tools.ietf.org/html/rfc3339) timestamp format. Example: `2019-04-03T21:10:29.600Z`
        """
        return pulumi.get(self, "time_of_deletion")

    @property
    @pulumi.getter(name="vaultId")
    def vault_id(self) -> pulumi.Output[str]:
        """
        The OCID of the vault that contains this key version.
        """
        return pulumi.get(self, "vault_id")
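Every property in the generated classes above follows one mechanical pattern: a `@property` getter delegating to `pulumi.get` and a setter delegating to `pulumi.set`, keyed by the snake_case field name. Stripped of pulumi, the shape reduces to this sketch (a plain dict stands in for pulumi's internal property store; the class name and example value are ours):

```python
class StateSketch:
    """Dict-backed stand-in for the property store that the generated
    KeyVersionArgs/_KeyVersionState classes delegate to via pulumi.get/set."""

    def __init__(self):
        self._props = {}

    @property
    def key_id(self):
        # mirrors: return pulumi.get(self, "key_id")
        return self._props.get("key_id")

    @key_id.setter
    def key_id(self, value):
        # mirrors: pulumi.set(self, "key_id", value)
        self._props["key_id"] = value

s = StateSketch()
s.key_id = "example-key-id"
```

The codegen repeats this block once per resource field, which is why the file is long but entirely uniform.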
# --- next file: boto3_type_annotations_with_docs/boto3_type_annotations/greengrass/client.py (repo: cowboygneox/boto3_type_annotations, license: MIT) ---
from typing import Optional
from botocore.client import BaseClient
from typing import Dict
from botocore.paginate import Paginator
from botocore.waiter import Waiter
from typing import Union
from typing import List


class Client(BaseClient):
    def associate_role_to_group(self, GroupId: str, RoleArn: str = None) -> Dict:
        """
        Associates a role with a group. Your Greengrass core will use the role to access AWS cloud services. The role's permissions should allow Greengrass core Lambda functions to perform actions against the cloud.
        See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/AssociateRoleToGroup>`_

        **Request Syntax**
        ::

          response = client.associate_role_to_group(
              GroupId='string',
              RoleArn='string'
          )

        **Response Syntax**
        ::

            {
                'AssociatedAt': 'string'
            }

        **Response Structure**
        - *(dict) --* success
          - **AssociatedAt** *(string) --* The time, in milliseconds since the epoch, when the role ARN was associated with the group.
        :type GroupId: string
        :param GroupId: **[REQUIRED]** The ID of the Greengrass group.
        :type RoleArn: string
        :param RoleArn: The ARN of the role you wish to associate with this group.
        :rtype: dict
        :returns:
        """
        pass

    def associate_service_role_to_account(self, RoleArn: str = None) -> Dict:
        """
        Associates a role with your account. AWS IoT Greengrass will use the role to access your Lambda functions and AWS IoT resources. This is necessary for deployments to succeed. The role must have at least minimum permissions in the policy ''AWSGreengrassResourceAccessRolePolicy''.
        See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/AssociateServiceRoleToAccount>`_

        **Request Syntax**
        ::

          response = client.associate_service_role_to_account(
              RoleArn='string'
          )

        **Response Syntax**
        ::

            {
                'AssociatedAt': 'string'
            }

        **Response Structure**
        - *(dict) --* success
          - **AssociatedAt** *(string) --* The time when the service role was associated with the account.
        :type RoleArn: string
        :param RoleArn: The ARN of the service role you wish to associate with your account.
        :rtype: dict
        :returns:
        """
        pass

    def can_paginate(self, operation_name: str = None):
        """
        Check if an operation can be paginated.
        :type operation_name: string
        :param operation_name: The operation name. This is the same name
            as the method name on the client. For example, if the
            method name is ``create_foo``, and you\'d normally invoke the
            operation as ``client.create_foo(**kwargs)``, if the
            ``create_foo`` operation can be paginated, you can use the
            call ``client.get_paginator(\"create_foo\")``.
        :return: ``True`` if the operation can be paginated,
            ``False`` otherwise.
        """
        pass
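`can_paginate` only answers whether an operation supports paging; the paginator itself just loops on the service's continuation token until the service stops returning one. A stdlib-only sketch of that loop (the `list_fn` callable, the fake pages, and the `NextToken` key are illustrative, modeled on the `list_*` operations of clients like this one):

```python
def paginate(list_fn, **kwargs):
    """Yield every page from a NextToken-style list operation until the
    service stops returning a continuation token."""
    token = None
    while True:
        call_kwargs = dict(kwargs)
        if token:
            call_kwargs["NextToken"] = token
        page = list_fn(**call_kwargs)
        yield page
        token = page.get("NextToken")
        if not token:
            break

# Fake three-page service response standing in for a real list_* call.
pages = [
    {"Definitions": [1], "NextToken": "t1"},
    {"Definitions": [2], "NextToken": "t2"},
    {"Definitions": [3]},
]
fake = iter(pages)
result = list(paginate(lambda **kw: next(fake)))
```

In real code you would use `client.get_paginator(...)` instead of hand-rolling this loop; the sketch only shows what the paginator does underneath.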
def create_connector_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a connector definition. You may provide the initial version of the connector definition now or use ''CreateConnectorDefinitionVersion'' at a later time.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateConnectorDefinition>`_
**Request Syntax**
::
response = client.create_connector_definition(
AmznClientToken='string',
InitialVersion={
'Connectors': [
{
'ConnectorArn': 'string',
'Id': 'string',
'Parameters': {
'string': 'string'
}
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the connector definition.
- **Connectors** *(list) --* A list of references to connectors in this version, with their corresponding configuration settings.
- *(dict) --* Information about a connector. Connectors run on the Greengrass core and contain built-in integration with local infrastructure, device protocols, AWS, and other cloud services.
- **ConnectorArn** *(string) --* The ARN of the connector.
- **Id** *(string) --* A descriptive or arbitrary ID for the connector. This value must be unique within the connector definition version. Max length is 128 characters with pattern [a-zA-Z0-9:_-]+.
- **Parameters** *(dict) --* The parameters or configuration that the connector uses.
- *(string) --*
- *(string) --*
:type Name: string
:param Name: The name of the connector definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
def create_connector_definition_version(self, ConnectorDefinitionId: str, AmznClientToken: str = None, Connectors: List = None) -> Dict:
"""
Creates a version of a connector definition which has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateConnectorDefinitionVersion>`_
**Request Syntax**
::
response = client.create_connector_definition_version(
AmznClientToken='string',
ConnectorDefinitionId='string',
Connectors=[
{
'ConnectorArn': 'string',
'Id': 'string',
'Parameters': {
'string': 'string'
}
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:type Connectors: list
:param Connectors: A list of references to connectors in this version, with their corresponding configuration settings.
- *(dict) --* Information about a connector. Connectors run on the Greengrass core and contain built-in integration with local infrastructure, device protocols, AWS, and other cloud services.
- **ConnectorArn** *(string) --* The ARN of the connector.
- **Id** *(string) --* A descriptive or arbitrary ID for the connector. This value must be unique within the connector definition version. Max length is 128 characters with pattern [a-zA-Z0-9:_-]+.
- **Parameters** *(dict) --* The parameters or configuration that the connector uses.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
def create_core_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a core definition. You may provide the initial version of the core definition now or use ''CreateCoreDefinitionVersion'' at a later time. Greengrass groups must each contain exactly one Greengrass core.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateCoreDefinition>`_
**Request Syntax**
::
response = client.create_core_definition(
AmznClientToken='string',
InitialVersion={
'Cores': [
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the core definition.
- **Cores** *(list) --* A list of cores in the core definition version.
- *(dict) --* Information about a core.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the core.
- **Id** *(string) --* A descriptive or arbitrary ID for the core. This value must be unique within the core definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the core's local shadow is automatically synced with the cloud.
- **ThingArn** *(string) --* The ARN of the thing which is the core.
:type Name: string
:param Name: The name of the core definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
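Because each Greengrass group must contain exactly one core, the ''Cores'' list in an initial version typically holds a single entry. A minimal sketch of assembling the request payload follows; the certificate and thing ARNs are hypothetical placeholders, not real resources.

```python
# Sketch: assemble the InitialVersion payload for create_core_definition.
# The ARNs below are hypothetical placeholders.
initial_version = {
    'Cores': [
        {
            'CertificateArn': 'arn:aws:iot:us-west-2:123456789012:cert/abc123',
            'Id': 'example-core-1',
            'SyncShadow': True,  # keep the core's local shadow synced with the cloud
            'ThingArn': 'arn:aws:iot:us-west-2:123456789012:thing/ExampleCore',
        },
    ]
}

# A group may contain exactly one core, so the list should hold one entry.
assert len(initial_version['Cores']) == 1
```

The same ''Cores'' structure is accepted by ''CreateCoreDefinitionVersion'' when adding a version later.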
def create_core_definition_version(self, CoreDefinitionId: str, AmznClientToken: str = None, Cores: List = None) -> Dict:
"""
Creates a version of a core definition that has already been defined. Greengrass groups must each contain exactly one Greengrass core.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateCoreDefinitionVersion>`_
**Request Syntax**
::
response = client.create_core_definition_version(
AmznClientToken='string',
CoreDefinitionId='string',
Cores=[
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:type Cores: list
:param Cores: A list of cores in the core definition version.
- *(dict) --* Information about a core.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the core.
- **Id** *(string) --* A descriptive or arbitrary ID for the core. This value must be unique within the core definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the core's local shadow is automatically synced with the cloud.
- **ThingArn** *(string) --* The ARN of the thing which is the core.
:rtype: dict
:returns:
"""
pass
def create_deployment(self, GroupId: str, AmznClientToken: str = None, DeploymentId: str = None, DeploymentType: str = None, GroupVersionId: str = None) -> Dict:
"""
Creates a deployment. ''CreateDeployment'' requests are idempotent with respect to the ''X-Amzn-Client-Token'' token and the request parameters.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateDeployment>`_
**Request Syntax**
::
response = client.create_deployment(
AmznClientToken='string',
DeploymentId='string',
DeploymentType='NewDeployment'|'Redeployment'|'ResetDeployment'|'ForceResetDeployment',
GroupId='string',
GroupVersionId='string'
)
**Response Syntax**
::
{
'DeploymentArn': 'string',
'DeploymentId': 'string'
}
**Response Structure**
- *(dict) --* Success. The group was deployed.
- **DeploymentArn** *(string) --* The ARN of the deployment.
- **DeploymentId** *(string) --* The ID of the deployment.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type DeploymentId: string
:param DeploymentId: The ID of the deployment if you wish to redeploy a previous deployment.
:type DeploymentType: string
:param DeploymentType: The type of deployment. When used for ''CreateDeployment'', only ''NewDeployment'' and ''Redeployment'' are valid.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type GroupVersionId: string
:param GroupVersionId: The ID of the group version to be deployed.
:rtype: dict
:returns:
"""
pass
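Note that although the request syntax lists four deployment types, only two of them may be passed to ''CreateDeployment''. A small sketch of validating the type before making the call (the helper name is an illustration, not part of the client API):

```python
# Sketch: validate a deployment type before calling create_deployment.
# For ''CreateDeployment'', only 'NewDeployment' and 'Redeployment' are valid;
# 'ResetDeployment' and 'ForceResetDeployment' belong to other operations.
VALID_CREATE_TYPES = {'NewDeployment', 'Redeployment'}
ALL_TYPES = VALID_CREATE_TYPES | {'ResetDeployment', 'ForceResetDeployment'}

def check_deployment_type(deployment_type):
    """Return True if the type may be passed to create_deployment."""
    if deployment_type not in ALL_TYPES:
        raise ValueError('unknown deployment type: %s' % deployment_type)
    return deployment_type in VALID_CREATE_TYPES

assert check_deployment_type('NewDeployment')
assert not check_deployment_type('ResetDeployment')
```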
def create_device_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a device definition. You may provide the initial version of the device definition now or use ''CreateDeviceDefinitionVersion'' at a later time.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateDeviceDefinition>`_
**Request Syntax**
::
response = client.create_device_definition(
AmznClientToken='string',
InitialVersion={
'Devices': [
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the device definition.
- **Devices** *(list) --* A list of devices in the definition version.
- *(dict) --* Information about a device.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the device.
- **Id** *(string) --* A descriptive or arbitrary ID for the device. This value must be unique within the device definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the device's local shadow will be automatically synced with the cloud.
- **ThingArn** *(string) --* The thing ARN of the device.
:type Name: string
:param Name: The name of the device definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
def create_device_definition_version(self, DeviceDefinitionId: str, AmznClientToken: str = None, Devices: List = None) -> Dict:
"""
Creates a version of a device definition that has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateDeviceDefinitionVersion>`_
**Request Syntax**
::
response = client.create_device_definition_version(
AmznClientToken='string',
DeviceDefinitionId='string',
Devices=[
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:type Devices: list
:param Devices: A list of devices in the definition version.
- *(dict) --* Information about a device.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the device.
- **Id** *(string) --* A descriptive or arbitrary ID for the device. This value must be unique within the device definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the device's local shadow will be automatically synced with the cloud.
- **ThingArn** *(string) --* The thing ARN of the device.
:rtype: dict
:returns:
"""
pass
def create_function_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a Lambda function definition which contains a list of Lambda functions and their configurations to be used in a group. You can create an initial version of the definition by providing a list of Lambda functions and their configurations now, or use ''CreateFunctionDefinitionVersion'' later.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateFunctionDefinition>`_
**Request Syntax**
::
response = client.create_function_definition(
AmznClientToken='string',
InitialVersion={
'DefaultConfig': {
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
}
},
'Functions': [
{
'FunctionArn': 'string',
'FunctionConfiguration': {
'EncodingType': 'binary'|'json',
'Environment': {
'AccessSysfs': True|False,
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
},
'ResourceAccessPolicies': [
{
'Permission': 'ro'|'rw',
'ResourceId': 'string'
},
],
'Variables': {
'string': 'string'
}
},
'ExecArgs': 'string',
'Executable': 'string',
'MemorySize': 123,
'Pinned': True|False,
'Timeout': 123
},
'Id': 'string'
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the function definition.
- **DefaultConfig** *(dict) --* The default configuration that applies to all Lambda functions in this function definition version. Individual Lambda functions can override these settings.
- **Execution** *(dict) --* Configuration information that specifies how a Lambda function runs.
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
- **Functions** *(list) --* A list of Lambda functions in this function definition version.
- *(dict) --* Information about a Lambda function.
- **FunctionArn** *(string) --* The ARN of the Lambda function.
- **FunctionConfiguration** *(dict) --* The configuration of the Lambda function.
- **EncodingType** *(string) --* The expected encoding type of the input payload for the function. The default is ''json''.
- **Environment** *(dict) --* The environment configuration of the function.
- **AccessSysfs** *(boolean) --* If true, the Lambda function is allowed to access the host's /sys folder. Use this when the Lambda function needs to read device information from /sys. This setting applies only when you run the Lambda function in a Greengrass container.
- **Execution** *(dict) --* Configuration related to executing the Lambda function.
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
- **ResourceAccessPolicies** *(list) --* A list of the resources, with their permissions, to which the Lambda function will be granted access. A Lambda function can have at most 10 resources. ResourceAccessPolicies apply only when you run the Lambda function in a Greengrass container.
- *(dict) --* A policy used by the function to access a resource.
- **Permission** *(string) --* The permissions that the Lambda function has to the resource. Can be one of ''rw'' (read/write) or ''ro'' (read-only).
- **ResourceId** *(string) --* The ID of the resource. (This ID is assigned to the resource when you create the resource definition.)
- **Variables** *(dict) --* Environment variables for the Lambda function's configuration.
- *(string) --*
- *(string) --*
- **ExecArgs** *(string) --* The execution arguments.
- **Executable** *(string) --* The name of the function executable.
- **MemorySize** *(integer) --* The memory size, in KB, which the function requires. This setting is not applicable and should be cleared when you run the Lambda function without containerization.
- **Pinned** *(boolean) --* True if the function is pinned. Pinned means the function is long-lived and starts when the core starts.
- **Timeout** *(integer) --* The allowed function execution time, after which Lambda should terminate the function. This timeout still applies to pinned Lambda functions for each request.
- **Id** *(string) --* A descriptive or arbitrary ID for the function. This value must be unique within the function definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
:type Name: string
:param Name: The name of the function definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
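A minimal sketch of the ''InitialVersion'' payload for a function definition: one pinned function that inherits the container defaults. The Lambda ARN and IDs below are hypothetical placeholders.

```python
# Sketch: a minimal InitialVersion for create_function_definition.
# The Lambda ARN is a hypothetical placeholder.
initial_version = {
    'DefaultConfig': {
        'Execution': {
            'IsolationMode': 'GreengrassContainer',
            'RunAs': {'Gid': 1000, 'Uid': 1000},  # avoid running as root
        }
    },
    'Functions': [
        {
            'FunctionArn': 'arn:aws:lambda:us-west-2:123456789012:function:Example:1',
            'FunctionConfiguration': {
                'EncodingType': 'json',  # the default encoding
                'MemorySize': 16384,     # in KB; clear this when running without containerization
                'Pinned': True,          # long-lived, starts when the core starts
                'Timeout': 30,
            },
            'Id': 'example-function-1',
        },
    ],
}
assert initial_version['Functions'][0]['FunctionConfiguration']['Pinned'] is True
```

The same ''DefaultConfig'' and ''Functions'' structures are accepted by ''CreateFunctionDefinitionVersion''.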
def create_function_definition_version(self, FunctionDefinitionId: str, AmznClientToken: str = None, DefaultConfig: Dict = None, Functions: List = None) -> Dict:
"""
Creates a version of a Lambda function definition that has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateFunctionDefinitionVersion>`_
**Request Syntax**
::
response = client.create_function_definition_version(
AmznClientToken='string',
DefaultConfig={
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
}
},
FunctionDefinitionId='string',
Functions=[
{
'FunctionArn': 'string',
'FunctionConfiguration': {
'EncodingType': 'binary'|'json',
'Environment': {
'AccessSysfs': True|False,
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
},
'ResourceAccessPolicies': [
{
'Permission': 'ro'|'rw',
'ResourceId': 'string'
},
],
'Variables': {
'string': 'string'
}
},
'ExecArgs': 'string',
'Executable': 'string',
'MemorySize': 123,
'Pinned': True|False,
'Timeout': 123
},
'Id': 'string'
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type DefaultConfig: dict
:param DefaultConfig: The default configuration that applies to all Lambda functions in this function definition version. Individual Lambda functions can override these settings.
- **Execution** *(dict) --* Configuration information that specifies how a Lambda function runs.
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:type Functions: list
:param Functions: A list of Lambda functions in this function definition version.
- *(dict) --* Information about a Lambda function.
- **FunctionArn** *(string) --* The ARN of the Lambda function.
- **FunctionConfiguration** *(dict) --* The configuration of the Lambda function.
- **EncodingType** *(string) --* The expected encoding type of the input payload for the function. The default is ''json''.
- **Environment** *(dict) --* The environment configuration of the function.
- **AccessSysfs** *(boolean) --* If true, the Lambda function is allowed to access the host's /sys folder. Use this when the Lambda function needs to read device information from /sys. This setting applies only when you run the Lambda function in a Greengrass container.
- **Execution** *(dict) --* Configuration related to executing the Lambda function.
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
- **ResourceAccessPolicies** *(list) --* A list of the resources, with their permissions, to which the Lambda function will be granted access. A Lambda function can have at most 10 resources. ResourceAccessPolicies apply only when you run the Lambda function in a Greengrass container.
- *(dict) --* A policy used by the function to access a resource.
- **Permission** *(string) --* The permissions that the Lambda function has to the resource. Can be one of ''rw'' (read/write) or ''ro'' (read-only).
- **ResourceId** *(string) --* The ID of the resource. (This ID is assigned to the resource when you create the resource definition.)
- **Variables** *(dict) --* Environment variables for the Lambda function's configuration.
- *(string) --*
- *(string) --*
- **ExecArgs** *(string) --* The execution arguments.
- **Executable** *(string) --* The name of the function executable.
- **MemorySize** *(integer) --* The memory size, in KB, which the function requires. This setting is not applicable and should be cleared when you run the Lambda function without containerization.
- **Pinned** *(boolean) --* True if the function is pinned. Pinned means the function is long-lived and starts when the core starts.
- **Timeout** *(integer) --* The allowed function execution time, after which Lambda should terminate the function. This timeout still applies to pinned Lambda functions for each request.
- **Id** *(string) --* A descriptive or arbitrary ID for the function. This value must be unique within the function definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
:rtype: dict
:returns:
"""
pass
def create_group(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a group. You may provide the initial version of the group or use ''CreateGroupVersion'' at a later time. Tip: You can use the ''gg_group_setup'' package (https://github.com/awslabs/aws-greengrass-group-setup) as a library or command-line application to create and deploy Greengrass groups.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateGroup>`_
**Request Syntax**
::
response = client.create_group(
AmznClientToken='string',
InitialVersion={
'ConnectorDefinitionVersionArn': 'string',
'CoreDefinitionVersionArn': 'string',
'DeviceDefinitionVersionArn': 'string',
'FunctionDefinitionVersionArn': 'string',
'LoggerDefinitionVersionArn': 'string',
'ResourceDefinitionVersionArn': 'string',
'SubscriptionDefinitionVersionArn': 'string'
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --* Success. The group was created.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the group.
- **ConnectorDefinitionVersionArn** *(string) --* The ARN of the connector definition version for this group.
- **CoreDefinitionVersionArn** *(string) --* The ARN of the core definition version for this group.
- **DeviceDefinitionVersionArn** *(string) --* The ARN of the device definition version for this group.
- **FunctionDefinitionVersionArn** *(string) --* The ARN of the function definition version for this group.
- **LoggerDefinitionVersionArn** *(string) --* The ARN of the logger definition version for this group.
- **ResourceDefinitionVersionArn** *(string) --* The ARN of the resource definition version for this group.
- **SubscriptionDefinitionVersionArn** *(string) --* The ARN of the subscription definition version for this group.
:type Name: string
:param Name: The name of the group.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
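A group's ''InitialVersion'' simply wires together the definition-version ARNs created by the ''Create*Definition'' operations above; every key is optional. A sketch under that assumption (the helper name and ARNs are hypothetical):

```python
# Sketch: compose a group's InitialVersion from definition-version ARNs.
# Every key is optional; include only the definitions the group uses.
# The ARNs passed in are hypothetical placeholders.
def build_group_initial_version(core_arn, function_arn=None, subscription_arn=None):
    version = {'CoreDefinitionVersionArn': core_arn}  # a group needs its single core
    if function_arn:
        version['FunctionDefinitionVersionArn'] = function_arn
    if subscription_arn:
        version['SubscriptionDefinitionVersionArn'] = subscription_arn
    return version

iv = build_group_initial_version(
    'arn:aws:greengrass:us-west-2:123456789012:'
    '/greengrass/definition/cores/abc/versions/def'
)
assert list(iv) == ['CoreDefinitionVersionArn']
```

The same set of ARN parameters is accepted directly by ''CreateGroupVersion'' when adding a version to an existing group.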
def create_group_certificate_authority(self, GroupId: str, AmznClientToken: str = None) -> Dict:
"""
Creates a CA for the group. If a CA already exists, it will rotate the existing CA.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateGroupCertificateAuthority>`_
**Request Syntax**
::
response = client.create_group_certificate_authority(
AmznClientToken='string',
GroupId='string'
)
**Response Syntax**
::
{
'GroupCertificateAuthorityArn': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the new active CA ARN.
- **GroupCertificateAuthorityArn** *(string) --* The ARN of the group certificate authority.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def create_group_version(self, GroupId: str, AmznClientToken: str = None, ConnectorDefinitionVersionArn: str = None, CoreDefinitionVersionArn: str = None, DeviceDefinitionVersionArn: str = None, FunctionDefinitionVersionArn: str = None, LoggerDefinitionVersionArn: str = None, ResourceDefinitionVersionArn: str = None, SubscriptionDefinitionVersionArn: str = None) -> Dict:
"""
Creates a version of a group which has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateGroupVersion>`_
**Request Syntax**
::
response = client.create_group_version(
AmznClientToken='string',
ConnectorDefinitionVersionArn='string',
CoreDefinitionVersionArn='string',
DeviceDefinitionVersionArn='string',
FunctionDefinitionVersionArn='string',
GroupId='string',
LoggerDefinitionVersionArn='string',
ResourceDefinitionVersionArn='string',
SubscriptionDefinitionVersionArn='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* Success. The response contains information about the group version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type ConnectorDefinitionVersionArn: string
:param ConnectorDefinitionVersionArn: The ARN of the connector definition version for this group.
:type CoreDefinitionVersionArn: string
:param CoreDefinitionVersionArn: The ARN of the core definition version for this group.
:type DeviceDefinitionVersionArn: string
:param DeviceDefinitionVersionArn: The ARN of the device definition version for this group.
:type FunctionDefinitionVersionArn: string
:param FunctionDefinitionVersionArn: The ARN of the function definition version for this group.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type LoggerDefinitionVersionArn: string
:param LoggerDefinitionVersionArn: The ARN of the logger definition version for this group.
:type ResourceDefinitionVersionArn: string
:param ResourceDefinitionVersionArn: The ARN of the resource definition version for this group.
:type SubscriptionDefinitionVersionArn: string
:param SubscriptionDefinitionVersionArn: The ARN of the subscription definition version for this group.
:rtype: dict
:returns:
"""
pass
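All of the definition-version ARNs are optional, so callers typically assemble the request from whichever versions they have on hand. A sketch of that pattern with placeholder ARNs; the filtering matters because boto3 validates parameter types and rejects an explicit ``None`` for an optional string parameter:

```python
def group_version_request(group_id, **version_arns):
    # Keep only the definition-version ARNs that were actually supplied.
    request = {"GroupId": group_id}
    request.update({k: v for k, v in version_arns.items() if v is not None})
    return request

request = group_version_request(
    "example-group-id",
    CoreDefinitionVersionArn="arn:aws:greengrass:region:acct:example-core-version",  # placeholder
    FunctionDefinitionVersionArn=None,  # not part of this group version
)
# client = boto3.client("greengrass")
# response = client.create_group_version(**request)
```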
def create_logger_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a logger definition. You may provide the initial version of the logger definition now or use ''CreateLoggerDefinitionVersion'' at a later time.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateLoggerDefinition>`_
**Request Syntax**
::
response = client.create_logger_definition(
AmznClientToken='string',
InitialVersion={
'Loggers': [
{
'Component': 'GreengrassSystem'|'Lambda',
'Id': 'string',
'Level': 'DEBUG'|'INFO'|'WARN'|'ERROR'|'FATAL',
'Space': 123,
'Type': 'FileSystem'|'AWSCloudWatch'
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the logger definition.
- **Loggers** *(list) --* A list of loggers.
- *(dict) --* Information about a logger
- **Component** *(string) --* The component that will be subject to logging.
- **Id** *(string) --* A descriptive or arbitrary ID for the logger. This value must be unique within the logger definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **Level** *(string) --* The level of the logs.
- **Space** *(integer) --* The amount of file space, in KB, to use if the local file system is used for logging purposes.
- **Type** *(string) --* The type of log output which will be used.
:type Name: string
:param Name: The name of the logger definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
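The ``Space`` field applies only to ``FileSystem`` loggers, which makes a small helper convenient when assembling the ``InitialVersion`` payload. A sketch with hypothetical IDs and sizes:

```python
def build_logger(component, logger_id, level, log_type, space_kb=None):
    # Space (in KB) is meaningful only when logs go to the local file system.
    logger = {"Component": component, "Id": logger_id,
              "Level": level, "Type": log_type}
    if log_type == "FileSystem":
        logger["Space"] = space_kb
    return logger

loggers = [
    build_logger("GreengrassSystem", "system-fs", "INFO", "FileSystem", space_kb=25600),
    build_logger("Lambda", "lambda-cw", "WARN", "AWSCloudWatch"),
]
# response = client.create_logger_definition(
#     Name="example-loggers", InitialVersion={"Loggers": loggers}
# )
```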
def create_logger_definition_version(self, LoggerDefinitionId: str, AmznClientToken: str = None, Loggers: List = None) -> Dict:
"""
Creates a version of a logger definition that has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateLoggerDefinitionVersion>`_
**Request Syntax**
::
response = client.create_logger_definition_version(
AmznClientToken='string',
LoggerDefinitionId='string',
Loggers=[
{
'Component': 'GreengrassSystem'|'Lambda',
'Id': 'string',
'Level': 'DEBUG'|'INFO'|'WARN'|'ERROR'|'FATAL',
'Space': 123,
'Type': 'FileSystem'|'AWSCloudWatch'
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:type Loggers: list
:param Loggers: A list of loggers.
- *(dict) --* Information about a logger
- **Component** *(string) --* The component that will be subject to logging.
- **Id** *(string) --* A descriptive or arbitrary ID for the logger. This value must be unique within the logger definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **Level** *(string) --* The level of the logs.
- **Space** *(integer) --* The amount of file space, in KB, to use if the local file system is used for logging purposes.
- **Type** *(string) --* The type of log output which will be used.
:rtype: dict
:returns:
"""
pass

def create_resource_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a resource definition which contains a list of resources to be used in a group. You can create an initial version of the definition by providing a list of resources now, or use ''CreateResourceDefinitionVersion'' later.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateResourceDefinition>`_
**Request Syntax**
::
response = client.create_resource_definition(
AmznClientToken='string',
InitialVersion={
'Resources': [
{
'Id': 'string',
'Name': 'string',
'ResourceDataContainer': {
'LocalDeviceResourceData': {
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'LocalVolumeResourceData': {
'DestinationPath': 'string',
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'S3MachineLearningModelResourceData': {
'DestinationPath': 'string',
'S3Uri': 'string'
},
'SageMakerMachineLearningModelResourceData': {
'DestinationPath': 'string',
'SageMakerJobArn': 'string'
},
'SecretsManagerSecretResourceData': {
'ARN': 'string',
'AdditionalStagingLabelsToDownload': [
'string',
]
}
}
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the resource definition.
- **Resources** *(list) --* A list of resources.
- *(dict) --* Information about a resource.
- **Id** *(string) --* The resource ID, used to refer to a resource in the Lambda function configuration. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''. This must be unique within a Greengrass group.
- **Name** *(string) --* The descriptive resource name, which is displayed on the AWS IoT Greengrass console. Max length 128 characters with pattern ''[a-zA-Z0-9:_-]+''. This must be unique within a Greengrass group.
- **ResourceDataContainer** *(dict) --* A container of data for all resource types.
- **LocalDeviceResourceData** *(dict) --* Attributes that define the local device resource.
- **GroupOwnerSetting** *(dict) --* Group/owner related settings for local resources.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the device resource. The source path for a device resource can refer only to a character device or block device under ''/dev''.
- **LocalVolumeResourceData** *(dict) --* Attributes that define the local volume resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **GroupOwnerSetting** *(dict) --* Allows you to configure additional group privileges for the Lambda process. This field is optional.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the volume resource on the host. The source path for a volume resource type cannot start with ''/sys''.
- **S3MachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon S3 machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **S3Uri** *(string) --* The URI of the source model in an S3 bucket. The model package must be in tar.gz or .zip format.
- **SageMakerMachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon SageMaker machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **SageMakerJobArn** *(string) --* The ARN of the Amazon SageMaker training job that represents the source model.
- **SecretsManagerSecretResourceData** *(dict) --* Attributes that define a secret resource, which references a secret from AWS Secrets Manager.
- **ARN** *(string) --* The ARN of the Secrets Manager secret to make available on the core. The value of the secret's latest version (represented by the ''AWSCURRENT'' staging label) is included by default.
- **AdditionalStagingLabelsToDownload** *(list) --* Optional. The staging labels whose values you want to make available on the core, in addition to ''AWSCURRENT''.
- *(string) --*
:type Name: string
:param Name: The name of the resource definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
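Each entry in ``Resources`` carries exactly one resource-type key inside its ``ResourceDataContainer``. A sketch that builds a local volume resource (paths and names are hypothetical; auto-adding the group owner only when no explicit ``GroupOwner`` is given is a design choice for this helper, not a service requirement):

```python
def local_volume_resource(resource_id, name, source_path, dest_path, group_owner=None):
    # GroupOwnerSetting controls which Linux OS group's privileges are
    # added to the Lambda process that uses this resource.
    setting = {"AutoAddGroupOwner": group_owner is None}
    if group_owner is not None:
        setting["GroupOwner"] = group_owner
    return {
        "Id": resource_id,
        "Name": name,
        "ResourceDataContainer": {
            "LocalVolumeResourceData": {
                "SourcePath": source_path,
                "DestinationPath": dest_path,
                "GroupOwnerSetting": setting,
            }
        },
    }

resource = local_volume_resource("data-vol", "DataVolume", "/srv/data", "/data")
# response = client.create_resource_definition(
#     Name="example-resources", InitialVersion={"Resources": [resource]}
# )
```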
def create_resource_definition_version(self, ResourceDefinitionId: str, AmznClientToken: str = None, Resources: List = None) -> Dict:
"""
Creates a version of a resource definition that has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateResourceDefinitionVersion>`_
**Request Syntax**
::
response = client.create_resource_definition_version(
AmznClientToken='string',
ResourceDefinitionId='string',
Resources=[
{
'Id': 'string',
'Name': 'string',
'ResourceDataContainer': {
'LocalDeviceResourceData': {
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'LocalVolumeResourceData': {
'DestinationPath': 'string',
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'S3MachineLearningModelResourceData': {
'DestinationPath': 'string',
'S3Uri': 'string'
},
'SageMakerMachineLearningModelResourceData': {
'DestinationPath': 'string',
'SageMakerJobArn': 'string'
},
'SecretsManagerSecretResourceData': {
'ARN': 'string',
'AdditionalStagingLabelsToDownload': [
'string',
]
}
}
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:type Resources: list
:param Resources: A list of resources.
- *(dict) --* Information about a resource.
- **Id** *(string) --* The resource ID, used to refer to a resource in the Lambda function configuration. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''. This must be unique within a Greengrass group.
- **Name** *(string) --* The descriptive resource name, which is displayed on the AWS IoT Greengrass console. Max length 128 characters with pattern ''[a-zA-Z0-9:_-]+''. This must be unique within a Greengrass group.
- **ResourceDataContainer** *(dict) --* A container of data for all resource types.
- **LocalDeviceResourceData** *(dict) --* Attributes that define the local device resource.
- **GroupOwnerSetting** *(dict) --* Group/owner related settings for local resources.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the device resource. The source path for a device resource can refer only to a character device or block device under ''/dev''.
- **LocalVolumeResourceData** *(dict) --* Attributes that define the local volume resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **GroupOwnerSetting** *(dict) --* Allows you to configure additional group privileges for the Lambda process. This field is optional.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the volume resource on the host. The source path for a volume resource type cannot start with ''/sys''.
- **S3MachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon S3 machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **S3Uri** *(string) --* The URI of the source model in an S3 bucket. The model package must be in tar.gz or .zip format.
- **SageMakerMachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon SageMaker machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **SageMakerJobArn** *(string) --* The ARN of the Amazon SageMaker training job that represents the source model.
- **SecretsManagerSecretResourceData** *(dict) --* Attributes that define a secret resource, which references a secret from AWS Secrets Manager.
- **ARN** *(string) --* The ARN of the Secrets Manager secret to make available on the core. The value of the secret's latest version (represented by the ''AWSCURRENT'' staging label) is included by default.
- **AdditionalStagingLabelsToDownload** *(list) --* Optional. The staging labels whose values you want to make available on the core, in addition to ''AWSCURRENT''.
- *(string) --*
:rtype: dict
:returns:
"""
pass
def create_software_update_job(self, AmznClientToken: str = None, S3UrlSignerRole: str = None, SoftwareToUpdate: str = None, UpdateAgentLogLevel: str = None, UpdateTargets: List = None, UpdateTargetsArchitecture: str = None, UpdateTargetsOperatingSystem: str = None) -> Dict:
"""
Creates a software update for a core or group of cores (specified as an IoT thing group). Use this to update the OTA Agent as well as the Greengrass core software. It makes use of the IoT Jobs feature which provides additional commands to manage a Greengrass core software update job.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateSoftwareUpdateJob>`_
**Request Syntax**
::
response = client.create_software_update_job(
AmznClientToken='string',
S3UrlSignerRole='string',
SoftwareToUpdate='core'|'ota_agent',
UpdateAgentLogLevel='NONE'|'TRACE'|'DEBUG'|'VERBOSE'|'INFO'|'WARN'|'ERROR'|'FATAL',
UpdateTargets=[
'string',
],
UpdateTargetsArchitecture='armv7l'|'x86_64'|'aarch64',
UpdateTargetsOperatingSystem='ubuntu'|'raspbian'|'amazon_linux'
)
**Response Syntax**
::
{
'IotJobArn': 'string',
'IotJobId': 'string'
}
**Response Structure**
- *(dict) --* success
- **IotJobArn** *(string) --* The IoT Job ARN corresponding to this update.
- **IotJobId** *(string) --* The IoT Job Id corresponding to this update.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type S3UrlSignerRole: string
:param S3UrlSignerRole: The IAM Role that Greengrass will use to create pre-signed URLs pointing towards the update artifact.
:type SoftwareToUpdate: string
:param SoftwareToUpdate: The piece of software on the Greengrass core that will be updated.
:type UpdateAgentLogLevel: string
:param UpdateAgentLogLevel: The minimum level of log statements that should be logged by the OTA Agent during an update.
:type UpdateTargets: list
:param UpdateTargets: The ARNs of the targets (IoT things or IoT thing groups) that this update will be applied to.
- *(string) --*
:type UpdateTargetsArchitecture: string
:param UpdateTargetsArchitecture: The architecture of the cores which are the targets of an update.
:type UpdateTargetsOperatingSystem: string
:param UpdateTargetsOperatingSystem: The operating system of the cores which are the targets of an update.
:rtype: dict
:returns:
"""
pass
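The architecture and operating-system parameters accept only the enumerated values shown in the request syntax, so it can help to validate them locally before the call round-trips to the service. A sketch with placeholder ARNs:

```python
ARCHITECTURES = {"armv7l", "x86_64", "aarch64"}
OPERATING_SYSTEMS = {"ubuntu", "raspbian", "amazon_linux"}

def ota_job_request(role_arn, target_arns, software, arch, os_name):
    # Fail fast on enum typos instead of waiting for an API error.
    if arch not in ARCHITECTURES or os_name not in OPERATING_SYSTEMS:
        raise ValueError(f"unsupported target platform: {arch}/{os_name}")
    return {
        "S3UrlSignerRole": role_arn,
        "SoftwareToUpdate": software,          # 'core' or 'ota_agent'
        "UpdateTargets": list(target_arns),
        "UpdateTargetsArchitecture": arch,
        "UpdateTargetsOperatingSystem": os_name,
    }

request = ota_job_request(
    "arn:aws:iam::123456789012:role/example-signer",       # placeholder role
    ["arn:aws:iot:region:acct:thing/example-core"],        # placeholder target
    "core", "armv7l", "raspbian",
)
# response = client.create_software_update_job(**request)
# response["IotJobId"] identifies the underlying IoT job.
```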
def create_subscription_definition(self, AmznClientToken: str = None, InitialVersion: Dict = None, Name: str = None, tags: Dict = None) -> Dict:
"""
Creates a subscription definition. You may provide the initial version of the subscription definition now or use ''CreateSubscriptionDefinitionVersion'' at a later time.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateSubscriptionDefinition>`_
**Request Syntax**
::
response = client.create_subscription_definition(
AmznClientToken='string',
InitialVersion={
'Subscriptions': [
{
'Id': 'string',
'Source': 'string',
'Subject': 'string',
'Target': 'string'
},
]
},
Name='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type InitialVersion: dict
:param InitialVersion: Information about the initial version of the subscription definition.
- **Subscriptions** *(list) --* A list of subscriptions.
- *(dict) --* Information about a subscription.
- **Id** *(string) --* A descriptive or arbitrary ID for the subscription. This value must be unique within the subscription definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **Source** *(string) --* The source of the subscription. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
- **Subject** *(string) --* The MQTT topic used to route the message.
- **Target** *(string) --* Where the message is sent to. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
:type Name: string
:param Name: The name of the subscription definition.
:type tags: dict
:param tags: Tag(s) to add to the new resource
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
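Subscription IDs must match the documented constraint (at most 128 characters, pattern ``[a-zA-Z0-9:_-]+``), which is easy to enforce locally when building the ``InitialVersion`` payload. A sketch routing a hypothetical device's telemetry topic to the AWS IoT cloud:

```python
import re

_ID_PATTERN = re.compile(r"^[a-zA-Z0-9:_-]+$")

def subscription(sub_id, source, subject, target):
    # Reject invalid IDs up front instead of waiting for a BadRequest.
    if len(sub_id) > 128 or not _ID_PATTERN.match(sub_id):
        raise ValueError(f"invalid subscription id: {sub_id!r}")
    return {"Id": sub_id, "Source": source, "Subject": subject, "Target": target}

subs = [
    subscription("sensor-to-cloud",
                 "arn:aws:iot:region:acct:thing/example-sensor",  # placeholder
                 "sensors/temperature", "cloud"),
]
# response = client.create_subscription_definition(
#     Name="example-subscriptions", InitialVersion={"Subscriptions": subs}
# )
```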
def create_subscription_definition_version(self, SubscriptionDefinitionId: str, AmznClientToken: str = None, Subscriptions: List = None) -> Dict:
"""
Creates a version of a subscription definition which has already been defined.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/CreateSubscriptionDefinitionVersion>`_
**Request Syntax**
::
response = client.create_subscription_definition_version(
AmznClientToken='string',
SubscriptionDefinitionId='string',
Subscriptions=[
{
'Id': 'string',
'Source': 'string',
'Subject': 'string',
'Target': 'string'
},
]
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:type Subscriptions: list
:param Subscriptions: A list of subscriptions.
- *(dict) --* Information about a subscription.
- **Id** *(string) --* A descriptive or arbitrary ID for the subscription. This value must be unique within the subscription definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **Source** *(string) --* The source of the subscription. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
- **Subject** *(string) --* The MQTT topic used to route the message.
- **Target** *(string) --* Where the message is sent to. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
:rtype: dict
:returns:
"""
pass
def delete_connector_definition(self, ConnectorDefinitionId: str) -> Dict:
"""
Deletes a connector definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteConnectorDefinition>`_
**Request Syntax**
::
response = client.delete_connector_definition(
ConnectorDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:rtype: dict
:returns:
"""
pass
def delete_core_definition(self, CoreDefinitionId: str) -> Dict:
"""
Deletes a core definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteCoreDefinition>`_
**Request Syntax**
::
response = client.delete_core_definition(
CoreDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:rtype: dict
:returns:
"""
pass
def delete_device_definition(self, DeviceDefinitionId: str) -> Dict:
"""
Deletes a device definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteDeviceDefinition>`_
**Request Syntax**
::
response = client.delete_device_definition(
DeviceDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:rtype: dict
:returns:
"""
pass
def delete_function_definition(self, FunctionDefinitionId: str) -> Dict:
"""
Deletes a Lambda function definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteFunctionDefinition>`_
**Request Syntax**
::
response = client.delete_function_definition(
FunctionDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:rtype: dict
:returns:
"""
pass
def delete_group(self, GroupId: str) -> Dict:
"""
Deletes a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteGroup>`_
**Request Syntax**
::
response = client.delete_group(
GroupId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def delete_logger_definition(self, LoggerDefinitionId: str) -> Dict:
"""
Deletes a logger definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteLoggerDefinition>`_
**Request Syntax**
::
response = client.delete_logger_definition(
LoggerDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:rtype: dict
:returns:
"""
pass
def delete_resource_definition(self, ResourceDefinitionId: str) -> Dict:
"""
Deletes a resource definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteResourceDefinition>`_
**Request Syntax**
::
response = client.delete_resource_definition(
ResourceDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:rtype: dict
:returns:
"""
pass
def delete_subscription_definition(self, SubscriptionDefinitionId: str) -> Dict:
"""
Deletes a subscription definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DeleteSubscriptionDefinition>`_
**Request Syntax**
::
response = client.delete_subscription_definition(
SubscriptionDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:rtype: dict
:returns:
"""
pass
def disassociate_role_from_group(self, GroupId: str) -> Dict:
"""
Disassociates the role from a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DisassociateRoleFromGroup>`_
**Request Syntax**
::
response = client.disassociate_role_from_group(
GroupId='string'
)
**Response Syntax**
::
{
'DisassociatedAt': 'string'
}
**Response Structure**
- *(dict) --* success
- **DisassociatedAt** *(string) --* The time, in milliseconds since the epoch, when the role was disassociated from the group.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def disassociate_service_role_from_account(self) -> Dict:
"""
Disassociates the service role from your account. Without a service role, deployments will not work.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/DisassociateServiceRoleFromAccount>`_
**Request Syntax**
::
response = client.disassociate_service_role_from_account()
**Response Syntax**
::
{
'DisassociatedAt': 'string'
}
**Response Structure**
- *(dict) --* success
- **DisassociatedAt** *(string) --* The time when the service role was disassociated from the account.
:rtype: dict
:returns:
"""
pass
def generate_presigned_url(self, ClientMethod: str = None, Params: Dict = None, ExpiresIn: int = None, HttpMethod: str = None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
``ClientMethod``.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds).
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method's model.
:returns: The presigned url
"""
pass
def get_associated_role(self, GroupId: str) -> Dict:
"""
Retrieves the role associated with a particular group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetAssociatedRole>`_
**Request Syntax**
::
response = client.get_associated_role(
GroupId='string'
)
**Response Syntax**
::
{
'AssociatedAt': 'string',
'RoleArn': 'string'
}
**Response Structure**
- *(dict) --* success
- **AssociatedAt** *(string) --* The time when the role was associated with the group.
- **RoleArn** *(string) --* The ARN of the role that is associated with the group.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def get_bulk_deployment_status(self, BulkDeploymentId: str) -> Dict:
"""
Returns the status of a bulk deployment.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetBulkDeploymentStatus>`_
**Request Syntax**
::
response = client.get_bulk_deployment_status(
BulkDeploymentId='string'
)
**Response Syntax**
::
{
'BulkDeploymentMetrics': {
'InvalidInputRecords': 123,
'RecordsProcessed': 123,
'RetryAttempts': 123
},
'BulkDeploymentStatus': 'Initializing'|'Running'|'Completed'|'Stopping'|'Stopped'|'Failed',
'CreatedAt': 'string',
'ErrorDetails': [
{
'DetailedErrorCode': 'string',
'DetailedErrorMessage': 'string'
},
],
'ErrorMessage': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --* Success. The response body contains the status of the bulk deployment.
- **BulkDeploymentMetrics** *(dict) --* Relevant metrics on input records processed during bulk deployment.
- **InvalidInputRecords** *(integer) --* The total number of records that returned a non-retryable error. For example, this can occur if a group record from the input file uses an invalid format or specifies a nonexistent group version, or if the execution role doesn't grant permission to deploy a group or group version.
- **RecordsProcessed** *(integer) --* The total number of group records from the input file that have been processed so far, or attempted.
- **RetryAttempts** *(integer) --* The total number of deployment attempts that returned a retryable error. For example, a retry is triggered if the attempt to deploy a group returns a throttling error. ''StartBulkDeployment'' retries a group deployment up to five times.
- **BulkDeploymentStatus** *(string) --* The status of the bulk deployment.
- **CreatedAt** *(string) --* The time, in ISO format, when the deployment was created.
- **ErrorDetails** *(list) --* Error details.
- *(dict) --* Details about the error.
- **DetailedErrorCode** *(string) --* A detailed error code.
- **DetailedErrorMessage** *(string) --* A detailed error message.
- **ErrorMessage** *(string) --* Error message.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type BulkDeploymentId: string
:param BulkDeploymentId: **[REQUIRED]** The ID of the bulk deployment.
:rtype: dict
:returns:
"""
pass
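The metrics in the response are often reduced to a quick pass/fail summary. A minimal sketch, using a hypothetical response dict rather than live output (the ``valid_records`` arithmetic is an illustrative convention, not part of the API):

```python
# Hypothetical response from client.get_bulk_deployment_status(...)
response = {
    'BulkDeploymentMetrics': {
        'InvalidInputRecords': 2,
        'RecordsProcessed': 50,
        'RetryAttempts': 5,
    },
    'BulkDeploymentStatus': 'Completed',
}

def summarize(response):
    """Collapse bulk-deployment metrics into a small report dict."""
    metrics = response['BulkDeploymentMetrics']
    return {
        # Stopped/Failed/Completed are the states with no further progress.
        'done': response['BulkDeploymentStatus'] in ('Completed', 'Stopped', 'Failed'),
        'valid_records': metrics['RecordsProcessed'] - metrics['InvalidInputRecords'],
    }

summary = summarize(response)
```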
def get_connectivity_info(self, ThingName: str) -> Dict:
"""
Retrieves the connectivity information for a core.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetConnectivityInfo>`_
**Request Syntax**
::
response = client.get_connectivity_info(
ThingName='string'
)
**Response Syntax**
::
{
'ConnectivityInfo': [
{
'HostAddress': 'string',
'Id': 'string',
'Metadata': 'string',
'PortNumber': 123
},
],
'Message': 'string'
}
**Response Structure**
- *(dict) --* success
- **ConnectivityInfo** *(list) --* Connectivity info list.
- *(dict) --* Information about a Greengrass core's connectivity.
- **HostAddress** *(string) --* The endpoint for the Greengrass core. Can be an IP address or DNS.
- **Id** *(string) --* The ID of the connectivity information.
- **Metadata** *(string) --* Metadata for this endpoint.
- **PortNumber** *(integer) --* The port of the Greengrass core. Usually 8883.
- **Message** *(string) --* A message about the connectivity info request.
:type ThingName: string
:param ThingName: **[REQUIRED]** The thing name.
:rtype: dict
:returns:
"""
pass
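Callers typically scan the ``ConnectivityInfo`` list for a host/port pair to connect to. A small sketch over a hypothetical response (addresses below are made up):

```python
# Hypothetical response from client.get_connectivity_info(ThingName='MyCore')
response = {
    'ConnectivityInfo': [
        {'HostAddress': '192.168.1.10', 'Id': 'lan', 'Metadata': '', 'PortNumber': 8883},
        {'HostAddress': 'core.example.com', 'Id': 'dns', 'Metadata': '', 'PortNumber': 8883},
    ],
    'Message': 'ok',
}

def endpoints(response):
    """Yield (host, port) pairs from a GetConnectivityInfo response."""
    for info in response.get('ConnectivityInfo', []):
        yield info['HostAddress'], info['PortNumber']

targets = list(endpoints(response))
```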
def get_connector_definition(self, ConnectorDefinitionId: str) -> Dict:
"""
Retrieves information about a connector definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetConnectorDefinition>`_
**Request Syntax**
::
response = client.get_connector_definition(
ConnectorDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:rtype: dict
:returns:
"""
pass
def get_connector_definition_version(self, ConnectorDefinitionId: str, ConnectorDefinitionVersionId: str, NextToken: str = None) -> Dict:
"""
Retrieves information about a connector definition version, including the connectors that the version contains. Connectors are prebuilt modules that interact with local infrastructure, device protocols, AWS, and other cloud services.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetConnectorDefinitionVersion>`_
**Request Syntax**
::
response = client.get_connector_definition_version(
ConnectorDefinitionId='string',
ConnectorDefinitionVersionId='string',
NextToken='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Connectors': [
{
'ConnectorArn': 'string',
'Id': 'string',
'Parameters': {
'string': 'string'
}
},
]
},
'Id': 'string',
'NextToken': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the connector definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the connector definition version was created.
- **Definition** *(dict) --* Information about the connector definition version.
- **Connectors** *(list) --* A list of references to connectors in this version, with their corresponding configuration settings.
- *(dict) --* Information about a connector. Connectors run on the Greengrass core and contain built-in integration with local infrastructure, device protocols, AWS, and other cloud services.
- **ConnectorArn** *(string) --* The ARN of the connector.
- **Id** *(string) --* A descriptive or arbitrary ID for the connector. This value must be unique within the connector definition version. Max length is 128 characters with pattern [a-zA-Z0-9:_-]+.
- **Parameters** *(dict) --* The parameters or configuration that the connector uses.
- *(string) --*
- *(string) --*
- **Id** *(string) --* The ID of the connector definition version.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Version** *(string) --* The version of the connector definition version.
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:type ConnectorDefinitionVersionId: string
:param ConnectorDefinitionVersionId: **[REQUIRED]** The ID of the connector definition version.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
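Because the connector list can span pages, callers follow ``NextToken`` until it comes back null. The client below is a stand-in that serves two canned pages; a real boto3 Greengrass client would take its place:

```python
class FakeClient:
    """Stub that pages connector results the way NextToken-based calls do."""
    PAGES = {
        None: {'Definition': {'Connectors': [{'Id': 'c1'}]}, 'NextToken': 't1'},
        't1': {'Definition': {'Connectors': [{'Id': 'c2'}]}, 'NextToken': None},
    }

    def get_connector_definition_version(self, ConnectorDefinitionId,
                                         ConnectorDefinitionVersionId,
                                         NextToken=None):
        return self.PAGES[NextToken]

def all_connectors(client, definition_id, version_id):
    """Accumulate connectors across pages until NextToken is exhausted."""
    token, out = None, []
    while True:
        page = client.get_connector_definition_version(
            ConnectorDefinitionId=definition_id,
            ConnectorDefinitionVersionId=version_id,
            NextToken=token)
        out.extend(page['Definition']['Connectors'])
        token = page.get('NextToken')
        if not token:
            return out

connectors = all_connectors(FakeClient(), 'def-id', 'ver-id')
```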
def get_core_definition(self, CoreDefinitionId: str) -> Dict:
"""
Retrieves information about a core definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetCoreDefinition>`_
**Request Syntax**
::
response = client.get_core_definition(
CoreDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:rtype: dict
:returns:
"""
pass
def get_core_definition_version(self, CoreDefinitionId: str, CoreDefinitionVersionId: str) -> Dict:
"""
Retrieves information about a core definition version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetCoreDefinitionVersion>`_
**Request Syntax**
::
response = client.get_core_definition_version(
CoreDefinitionId='string',
CoreDefinitionVersionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Cores': [
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
},
'Id': 'string',
'NextToken': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the core definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the core definition version was created.
- **Definition** *(dict) --* Information about the core definition version.
- **Cores** *(list) --* A list of cores in the core definition version.
- *(dict) --* Information about a core.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the core.
- **Id** *(string) --* A descriptive or arbitrary ID for the core. This value must be unique within the core definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the core's local shadow is automatically synced with the cloud.
- **ThingArn** *(string) --* The ARN of the thing which is the core.
- **Id** *(string) --* The ID of the core definition version.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Version** *(string) --* The version of the core definition version.
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:type CoreDefinitionVersionId: string
:param CoreDefinitionVersionId: **[REQUIRED]** The ID of the core definition version.
:rtype: dict
:returns:
"""
pass
def get_deployment_status(self, DeploymentId: str, GroupId: str) -> Dict:
"""
Returns the status of a deployment.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetDeploymentStatus>`_
**Request Syntax**
::
response = client.get_deployment_status(
DeploymentId='string',
GroupId='string'
)
**Response Syntax**
::
{
'DeploymentStatus': 'string',
'DeploymentType': 'NewDeployment'|'Redeployment'|'ResetDeployment'|'ForceResetDeployment',
'ErrorDetails': [
{
'DetailedErrorCode': 'string',
'DetailedErrorMessage': 'string'
},
],
'ErrorMessage': 'string',
'UpdatedAt': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the status of the deployment for the group.
- **DeploymentStatus** *(string) --* The status of the deployment: ''InProgress'', ''Building'', ''Success'', or ''Failure''.
- **DeploymentType** *(string) --* The type of the deployment.
- **ErrorDetails** *(list) --* Error details.
- *(dict) --* Details about the error.
- **DetailedErrorCode** *(string) --* A detailed error code.
- **DetailedErrorMessage** *(string) --* A detailed error message.
- **ErrorMessage** *(string) --* Error message.
- **UpdatedAt** *(string) --* The time, in milliseconds since the epoch, when the deployment status was updated.
:type DeploymentId: string
:param DeploymentId: **[REQUIRED]** The ID of the deployment.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
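A common pattern is to poll this call until the deployment reaches a terminal status. The client below is a stub that reports two in-progress states before ``'Success'``; with a real boto3 client you would also pace the loop with ``time.sleep``:

```python
class FakeClient:
    """Stub whose deployment status advances on each poll."""
    def __init__(self):
        self._statuses = iter(['InProgress', 'Building', 'Success'])

    def get_deployment_status(self, DeploymentId, GroupId):
        return {'DeploymentStatus': next(self._statuses)}

def wait_for_deployment(client, deployment_id, group_id, max_polls=10):
    """Poll until the deployment reports Success or Failure."""
    for _ in range(max_polls):
        status = client.get_deployment_status(
            DeploymentId=deployment_id, GroupId=group_id)['DeploymentStatus']
        if status in ('Success', 'Failure'):
            return status
    raise TimeoutError('deployment did not finish')

final = wait_for_deployment(FakeClient(), 'dep-id', 'group-id')
```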
def get_device_definition(self, DeviceDefinitionId: str) -> Dict:
"""
Retrieves information about a device definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetDeviceDefinition>`_
**Request Syntax**
::
response = client.get_device_definition(
DeviceDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:rtype: dict
:returns:
"""
pass
def get_device_definition_version(self, DeviceDefinitionId: str, DeviceDefinitionVersionId: str, NextToken: str = None) -> Dict:
"""
Retrieves information about a device definition version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetDeviceDefinitionVersion>`_
**Request Syntax**
::
response = client.get_device_definition_version(
DeviceDefinitionId='string',
DeviceDefinitionVersionId='string',
NextToken='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Devices': [
{
'CertificateArn': 'string',
'Id': 'string',
'SyncShadow': True|False,
'ThingArn': 'string'
},
]
},
'Id': 'string',
'NextToken': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the device definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the device definition version was created.
- **Definition** *(dict) --* Information about the device definition version.
- **Devices** *(list) --* A list of devices in the definition version.
- *(dict) --* Information about a device.
- **CertificateArn** *(string) --* The ARN of the certificate associated with the device.
- **Id** *(string) --* A descriptive or arbitrary ID for the device. This value must be unique within the device definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **SyncShadow** *(boolean) --* If true, the device's local shadow will be automatically synced with the cloud.
- **ThingArn** *(string) --* The thing ARN of the device.
- **Id** *(string) --* The ID of the device definition version.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Version** *(string) --* The version of the device definition version.
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:type DeviceDefinitionVersionId: string
:param DeviceDefinitionVersionId: **[REQUIRED]** The ID of the device definition version.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def get_function_definition(self, FunctionDefinitionId: str) -> Dict:
"""
Retrieves information about a Lambda function definition, including its creation time and latest version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetFunctionDefinition>`_
**Request Syntax**
::
response = client.get_function_definition(
FunctionDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:rtype: dict
:returns:
"""
pass
def get_function_definition_version(self, FunctionDefinitionId: str, FunctionDefinitionVersionId: str, NextToken: str = None) -> Dict:
"""
Retrieves information about a Lambda function definition version, including which Lambda functions are included in the version and their configurations.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetFunctionDefinitionVersion>`_
**Request Syntax**
::
response = client.get_function_definition_version(
FunctionDefinitionId='string',
FunctionDefinitionVersionId='string',
NextToken='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'DefaultConfig': {
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
}
},
'Functions': [
{
'FunctionArn': 'string',
'FunctionConfiguration': {
'EncodingType': 'binary'|'json',
'Environment': {
'AccessSysfs': True|False,
'Execution': {
'IsolationMode': 'GreengrassContainer'|'NoContainer',
'RunAs': {
'Gid': 123,
'Uid': 123
}
},
'ResourceAccessPolicies': [
{
'Permission': 'ro'|'rw',
'ResourceId': 'string'
},
],
'Variables': {
'string': 'string'
}
},
'ExecArgs': 'string',
'Executable': 'string',
'MemorySize': 123,
'Pinned': True|False,
'Timeout': 123
},
'Id': 'string'
},
]
},
'Id': 'string',
'NextToken': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the function definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the function definition version was created.
- **Definition** *(dict) --* Information on the definition.
- **DefaultConfig** *(dict) --* The default configuration that applies to all Lambda functions in this function definition version. Individual Lambda functions can override these settings.
- **Execution** *(dict) --* Configuration information that specifies how a Lambda function runs.
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
- **Functions** *(list) --* A list of Lambda functions in this function definition version.
- *(dict) --* Information about a Lambda function.
- **FunctionArn** *(string) --* The ARN of the Lambda function.
- **FunctionConfiguration** *(dict) --* The configuration of the Lambda function.
- **EncodingType** *(string) --* The expected encoding type of the input payload for the function. The default is ''json''.
- **Environment** *(dict) --* The environment configuration of the function.
- **AccessSysfs** *(boolean) --* If true, the Lambda function is allowed to access the host's /sys folder. Use this when the Lambda function needs to read device information from /sys. This setting applies only when you run the Lambda function in a Greengrass container.
- **Execution** *(dict) --* Configuration related to executing the Lambda function
- **IsolationMode** *(string) --* Specifies whether the Lambda function runs in a Greengrass container (default) or without containerization. Unless your scenario requires that you run without containerization, we recommend that you run in a Greengrass container. Omit this value to run the Lambda function with the default containerization for the group.
- **RunAs** *(dict) --* Specifies the user and group whose permissions are used when running the Lambda function. You can specify one or both values to override the default values. We recommend that you avoid running as root unless absolutely necessary to minimize the risk of unintended changes or malicious attacks. To run as root, you must set ''IsolationMode'' to ''NoContainer'' and update config.json in ''greengrass-root/config'' to set ''allowFunctionsToRunAsRoot'' to ''yes''.
- **Gid** *(integer) --* The group ID whose permissions are used to run a Lambda function.
- **Uid** *(integer) --* The user ID whose permissions are used to run a Lambda function.
- **ResourceAccessPolicies** *(list) --* A list of the resources, with their permissions, to which the Lambda function will be granted access. A Lambda function can have at most 10 resources. ResourceAccessPolicies apply only when you run the Lambda function in a Greengrass container.
- *(dict) --* A policy used by the function to access a resource.
- **Permission** *(string) --* The permissions that the Lambda function has to the resource. Can be one of ''rw'' (read/write) or ''ro'' (read-only).
- **ResourceId** *(string) --* The ID of the resource. (This ID is assigned to the resource when you create the resource definition.)
- **Variables** *(dict) --* Environment variables for the Lambda function's configuration.
- *(string) --*
- *(string) --*
- **ExecArgs** *(string) --* The execution arguments.
- **Executable** *(string) --* The name of the function executable.
- **MemorySize** *(integer) --* The memory size, in KB, which the function requires. This setting is not applicable and should be cleared when you run the Lambda function without containerization.
- **Pinned** *(boolean) --* True if the function is pinned. Pinned means the function is long-lived and starts when the core starts.
- **Timeout** *(integer) --* The allowed function execution time, after which Lambda should terminate the function. This timeout still applies to pinned Lambda functions for each request.
- **Id** *(string) --* A descriptive or arbitrary ID for the function. This value must be unique within the function definition version. Max length is 128 characters with pattern ''[a-zA-Z0-9:_-]+''.
- **Id** *(string) --* The ID of the function definition version.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Version** *(string) --* The version of the function definition version.
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:type FunctionDefinitionVersionId: string
:param FunctionDefinitionVersionId: **[REQUIRED]** The ID of the function definition version.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
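The per-function ``Execution`` settings override the version's ``DefaultConfig``, so resolving the effective ``IsolationMode`` for each function means falling back through those layers. A sketch over a trimmed, hypothetical response:

```python
# Hypothetical, trimmed response from client.get_function_definition_version(...)
response = {
    'Definition': {
        'DefaultConfig': {'Execution': {'IsolationMode': 'GreengrassContainer'}},
        'Functions': [
            {'Id': 'f1', 'FunctionConfiguration': {'Environment': {}}},
            {'Id': 'f2', 'FunctionConfiguration': {'Environment': {
                'Execution': {'IsolationMode': 'NoContainer'}}}},
        ],
    },
}

def effective_isolation(response):
    """Map each function ID to its effective IsolationMode."""
    default = (response['Definition'].get('DefaultConfig', {})
               .get('Execution', {})
               .get('IsolationMode', 'GreengrassContainer'))
    modes = {}
    for fn in response['Definition']['Functions']:
        env = fn.get('FunctionConfiguration', {}).get('Environment', {})
        # A function-level Execution block wins over the default config.
        modes[fn['Id']] = env.get('Execution', {}).get('IsolationMode', default)
    return modes

modes = effective_isolation(response)
```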
def get_group(self, GroupId: str) -> Dict:
"""
Retrieves information about a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetGroup>`_
**Request Syntax**
::
response = client.get_group(
GroupId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def get_group_certificate_authority(self, CertificateAuthorityId: str, GroupId: str) -> Dict:
"""
Retrieves the CA associated with a group. Returns the public key of the CA.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetGroupCertificateAuthority>`_
**Request Syntax**
::
response = client.get_group_certificate_authority(
CertificateAuthorityId='string',
GroupId='string'
)
**Response Syntax**
::
{
'GroupCertificateAuthorityArn': 'string',
'GroupCertificateAuthorityId': 'string',
'PemEncodedCertificate': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the PKI Configuration.
- **GroupCertificateAuthorityArn** *(string) --* The ARN of the certificate authority for the group.
- **GroupCertificateAuthorityId** *(string) --* The ID of the certificate authority for the group.
- **PemEncodedCertificate** *(string) --* The PEM encoded certificate for the group.
:type CertificateAuthorityId: string
:param CertificateAuthorityId: **[REQUIRED]** The ID of the certificate authority.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def get_group_certificate_configuration(self, GroupId: str) -> Dict:
"""
Retrieves the current configuration for the CA used by the group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetGroupCertificateConfiguration>`_
**Request Syntax**
::
response = client.get_group_certificate_configuration(
GroupId='string'
)
**Response Syntax**
::
{
'CertificateAuthorityExpiryInMilliseconds': 'string',
'CertificateExpiryInMilliseconds': 'string',
'GroupId': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the PKI Configuration.
- **CertificateAuthorityExpiryInMilliseconds** *(string) --* The amount of time remaining before the certificate authority expires, in milliseconds.
- **CertificateExpiryInMilliseconds** *(string) --* The amount of time remaining before the certificate expires, in milliseconds.
- **GroupId** *(string) --* The ID of the group certificate configuration.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
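Note that both expiry fields are returned as strings of milliseconds, so they usually need converting before display. A sketch with hypothetical values:

```python
# Hypothetical response from client.get_group_certificate_configuration(...)
response = {
    'CertificateAuthorityExpiryInMilliseconds': '7776000000',  # 90 days
    'CertificateExpiryInMilliseconds': '604800000',            # 7 days
    'GroupId': 'group-id',
}

MS_PER_DAY = 24 * 60 * 60 * 1000

# The fields are strings, so cast to int before doing arithmetic.
ca_days = int(response['CertificateAuthorityExpiryInMilliseconds']) // MS_PER_DAY
cert_days = int(response['CertificateExpiryInMilliseconds']) // MS_PER_DAY
```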
def get_group_version(self, GroupId: str, GroupVersionId: str) -> Dict:
"""
Retrieves information about a group version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetGroupVersion>`_
**Request Syntax**
::
response = client.get_group_version(
GroupId='string',
GroupVersionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'ConnectorDefinitionVersionArn': 'string',
'CoreDefinitionVersionArn': 'string',
'DeviceDefinitionVersionArn': 'string',
'FunctionDefinitionVersionArn': 'string',
'LoggerDefinitionVersionArn': 'string',
'ResourceDefinitionVersionArn': 'string',
'SubscriptionDefinitionVersionArn': 'string'
},
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the group version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the group version was created.
- **Definition** *(dict) --* Information about the group version definition.
- **ConnectorDefinitionVersionArn** *(string) --* The ARN of the connector definition version for this group.
- **CoreDefinitionVersionArn** *(string) --* The ARN of the core definition version for this group.
- **DeviceDefinitionVersionArn** *(string) --* The ARN of the device definition version for this group.
- **FunctionDefinitionVersionArn** *(string) --* The ARN of the function definition version for this group.
- **LoggerDefinitionVersionArn** *(string) --* The ARN of the logger definition version for this group.
- **ResourceDefinitionVersionArn** *(string) --* The ARN of the resource definition version for this group.
- **SubscriptionDefinitionVersionArn** *(string) --* The ARN of the subscription definition version for this group.
- **Id** *(string) --* The ID of the group version.
- **Version** *(string) --* The unique ID for the version of the group.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type GroupVersionId: string
:param GroupVersionId: **[REQUIRED]** The ID of the group version.
:rtype: dict
:returns:
"""
pass
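A group version only carries the definition-version ARNs that are actually configured, so absent keys mean the group has no definition of that type. A sketch that collects whatever ARNs are present (the response and ARN strings are hypothetical):

```python
# Hypothetical, trimmed response from client.get_group_version(...)
response = {
    'Definition': {
        'CoreDefinitionVersionArn': 'arn:example:core-version',
        'FunctionDefinitionVersionArn': 'arn:example:function-version',
    },
}

def referenced_arns(response):
    """Return only the definition-version ARNs this group version sets."""
    return {key: arn
            for key, arn in response.get('Definition', {}).items()
            if key.endswith('VersionArn') and arn}

arns = referenced_arns(response)
```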
def get_logger_definition(self, LoggerDefinitionId: str) -> Dict:
"""
Retrieves information about a logger definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetLoggerDefinition>`_
**Request Syntax**
::
response = client.get_logger_definition(
LoggerDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:rtype: dict
:returns:
"""
pass
def get_logger_definition_version(self, LoggerDefinitionId: str, LoggerDefinitionVersionId: str, NextToken: str = None) -> Dict:
"""
Retrieves information about a logger definition version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetLoggerDefinitionVersion>`_
**Request Syntax**
::
response = client.get_logger_definition_version(
LoggerDefinitionId='string',
LoggerDefinitionVersionId='string',
NextToken='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Loggers': [
{
'Component': 'GreengrassSystem'|'Lambda',
'Id': 'string',
'Level': 'DEBUG'|'INFO'|'WARN'|'ERROR'|'FATAL',
'Space': 123,
'Type': 'FileSystem'|'AWSCloudWatch'
},
]
},
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the logger definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the logger definition version was created.
- **Definition** *(dict) --* Information about the logger definition version.
- **Loggers** *(list) --* A list of loggers.
- *(dict) --* Information about a logger.
- **Component** *(string) --* The component that will be subject to logging.
- **Id** *(string) --* A descriptive or arbitrary ID for the logger. This value must be unique within the logger definition version. Max length is 128 characters with pattern '[a-zA-Z0-9:_-]+'.
- **Level** *(string) --* The level of the logs.
- **Space** *(integer) --* The amount of file space, in KB, to use if the local file system is used for logging purposes.
- **Type** *(string) --* The type of log output which will be used.
- **Id** *(string) --* The ID of the logger definition version.
- **Version** *(string) --* The version of the logger definition version.
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:type LoggerDefinitionVersionId: string
:param LoggerDefinitionVersionId: **[REQUIRED]** The ID of the logger definition version.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
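**Example**
A sketch of working with the ``Definition`` payload (the sample below is made up; a real call would be ``client.get_logger_definition_version(...)['Definition']``, and the helper is an illustration, not part of the SDK).

```python
# Hypothetical 'Definition' payload matching the response syntax above.
definition = {
    'Loggers': [
        {'Component': 'GreengrassSystem', 'Id': 'sys-log', 'Level': 'INFO',
         'Space': 128, 'Type': 'FileSystem'},
        {'Component': 'Lambda', 'Id': 'fn-log', 'Level': 'DEBUG',
         'Type': 'AWSCloudWatch'},  # CloudWatch loggers carry no 'Space'
    ],
}

def filesystem_space_kb(definition):
    # Total file space (KB) reserved by loggers writing to the local file system.
    return sum(
        logger.get('Space', 0)
        for logger in definition.get('Loggers', [])
        if logger.get('Type') == 'FileSystem'
    )

print(filesystem_space_kb(definition))  # 128
```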
"""
pass
def get_paginator(self, operation_name: str = None) -> Paginator:
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you normally invoke the
operation as ``client.create_foo(**kwargs)``, then, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator('create_foo')``.
:raise OperationNotPageableError: Raised if the operation is not
pageable. You can use the ``client.can_paginate`` method to
check if an operation is pageable.
:rtype: L{botocore.paginate.Paginator}
:return: A paginator object.
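A paginator automates the ``NextToken`` loop shown below. This sketch uses a stub in place of a real client call (a real paginator is obtained with ``client.get_paginator('operation_name')`` and iterated via ``.paginate(**kwargs)``).

```python
def iterate_pages(fetch_page):
    # Yield pages until the service stops returning a NextToken.
    token = None
    while True:
        page = fetch_page(NextToken=token) if token else fetch_page()
        yield page
        token = page.get('NextToken')
        if not token:
            break

# Stub standing in for a client list_* call; returns two pages.
_pages = [{'Items': [1, 2], 'NextToken': 'p2'}, {'Items': [3]}]
def fetch_page(NextToken=None):
    return _pages[1] if NextToken == 'p2' else _pages[0]

items = [i for page in iterate_pages(fetch_page) for i in page['Items']]
print(items)  # [1, 2, 3]
```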
"""
pass
def get_resource_definition(self, ResourceDefinitionId: str) -> Dict:
"""
Retrieves information about a resource definition, including its creation time and latest version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetResourceDefinition>`_
**Request Syntax**
::
response = client.get_resource_definition(
ResourceDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:rtype: dict
:returns:
"""
pass
def get_resource_definition_version(self, ResourceDefinitionId: str, ResourceDefinitionVersionId: str) -> Dict:
"""
Retrieves information about a resource definition version, including which resources are included in the version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetResourceDefinitionVersion>`_
**Request Syntax**
::
response = client.get_resource_definition_version(
ResourceDefinitionId='string',
ResourceDefinitionVersionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Resources': [
{
'Id': 'string',
'Name': 'string',
'ResourceDataContainer': {
'LocalDeviceResourceData': {
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'LocalVolumeResourceData': {
'DestinationPath': 'string',
'GroupOwnerSetting': {
'AutoAddGroupOwner': True|False,
'GroupOwner': 'string'
},
'SourcePath': 'string'
},
'S3MachineLearningModelResourceData': {
'DestinationPath': 'string',
'S3Uri': 'string'
},
'SageMakerMachineLearningModelResourceData': {
'DestinationPath': 'string',
'SageMakerJobArn': 'string'
},
'SecretsManagerSecretResourceData': {
'ARN': 'string',
'AdditionalStagingLabelsToDownload': [
'string',
]
}
}
},
]
},
'Id': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Arn** *(string) --* The ARN of the resource definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the resource definition version was created.
- **Definition** *(dict) --* Information about the definition.
- **Resources** *(list) --* A list of resources.
- *(dict) --* Information about a resource.
- **Id** *(string) --* The resource ID, used to refer to a resource in the Lambda function configuration. Max length is 128 characters with pattern '[a-zA-Z0-9:_-]+'. This must be unique within a Greengrass group.
- **Name** *(string) --* The descriptive resource name, which is displayed on the AWS IoT Greengrass console. Max length is 128 characters with pattern '[a-zA-Z0-9:_-]+'. This must be unique within a Greengrass group.
- **ResourceDataContainer** *(dict) --* A container of data for all resource types.
- **LocalDeviceResourceData** *(dict) --* Attributes that define the local device resource.
- **GroupOwnerSetting** *(dict) --* Group/owner related settings for local resources.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the device resource. The source path for a device resource can refer only to a character device or block device under '/dev'.
- **LocalVolumeResourceData** *(dict) --* Attributes that define the local volume resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **GroupOwnerSetting** *(dict) --* Allows you to configure additional group privileges for the Lambda process. This field is optional.
- **AutoAddGroupOwner** *(boolean) --* If true, AWS IoT Greengrass automatically adds the specified Linux OS group owner of the resource to the Lambda process privileges. Thus the Lambda process will have the file access permissions of the added Linux group.
- **GroupOwner** *(string) --* The name of the Linux OS group whose privileges will be added to the Lambda process. This field is optional.
- **SourcePath** *(string) --* The local absolute path of the volume resource on the host. The source path for a volume resource type cannot start with '/sys'.
- **S3MachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon S3 machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **S3Uri** *(string) --* The URI of the source model in an S3 bucket. The model package must be in tar.gz or .zip format.
- **SageMakerMachineLearningModelResourceData** *(dict) --* Attributes that define an Amazon SageMaker machine learning resource.
- **DestinationPath** *(string) --* The absolute local path of the resource inside the Lambda environment.
- **SageMakerJobArn** *(string) --* The ARN of the Amazon SageMaker training job that represents the source model.
- **SecretsManagerSecretResourceData** *(dict) --* Attributes that define a secret resource, which references a secret from AWS Secrets Manager.
- **ARN** *(string) --* The ARN of the Secrets Manager secret to make available on the core. The value of the secret's latest version (represented by the 'AWSCURRENT' staging label) is included by default.
- **AdditionalStagingLabelsToDownload** *(list) --* Optional. The staging labels whose values you want to make available on the core, in addition to 'AWSCURRENT'.
- *(string) --*
- **Id** *(string) --* The ID of the resource definition version.
- **Version** *(string) --* The version of the resource definition version.
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:type ResourceDefinitionVersionId: string
:param ResourceDefinitionVersionId: **[REQUIRED]** The ID of the resource definition version.
:rtype: dict
:returns:
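**Example**
Exactly one of the five ``ResourceDataContainer`` fields is populated per resource. The helper below (an illustration, not part of the SDK) picks out which one, using a made-up resource entry of the documented shape.

```python
# The five possible keys of 'ResourceDataContainer'.
_CONTAINER_KEYS = (
    'LocalDeviceResourceData',
    'LocalVolumeResourceData',
    'S3MachineLearningModelResourceData',
    'SageMakerMachineLearningModelResourceData',
    'SecretsManagerSecretResourceData',
)

def resource_type(resource):
    # Return (type_name, data) for the populated container field, or (None, None).
    container = resource.get('ResourceDataContainer', {})
    for key in _CONTAINER_KEYS:
        if key in container:
            return key, container[key]
    return None, None

# Hypothetical resource entry matching the response syntax above.
resource = {
    'Id': 'model-1',
    'Name': 'classifier',
    'ResourceDataContainer': {
        'S3MachineLearningModelResourceData': {
            'DestinationPath': '/ml/model',
            'S3Uri': 's3://example-bucket/model.tar.gz',
        }
    },
}
kind, data = resource_type(resource)
print(kind, data['DestinationPath'])
```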
"""
pass
def get_service_role_for_account(self) -> Dict:
"""
Retrieves the service role that is attached to your account.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetServiceRoleForAccount>`_
**Request Syntax**
::
response = client.get_service_role_for_account()
**Response Syntax**
::
{
'AssociatedAt': 'string',
'RoleArn': 'string'
}
**Response Structure**
- *(dict) --* success
- **AssociatedAt** *(string) --* The time when the service role was associated with the account.
- **RoleArn** *(string) --* The ARN of the role which is associated with the account.
:rtype: dict
:returns:
"""
pass
def get_subscription_definition(self, SubscriptionDefinitionId: str) -> Dict:
"""
Retrieves information about a subscription definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetSubscriptionDefinition>`_
**Request Syntax**
::
response = client.get_subscription_definition(
SubscriptionDefinitionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:rtype: dict
:returns:
"""
pass
def get_subscription_definition_version(self, SubscriptionDefinitionId: str, SubscriptionDefinitionVersionId: str, NextToken: str = None) -> Dict:
"""
Retrieves information about a subscription definition version.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/GetSubscriptionDefinitionVersion>`_
**Request Syntax**
::
response = client.get_subscription_definition_version(
NextToken='string',
SubscriptionDefinitionId='string',
SubscriptionDefinitionVersionId='string'
)
**Response Syntax**
::
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Definition': {
'Subscriptions': [
{
'Id': 'string',
'Source': 'string',
'Subject': 'string',
'Target': 'string'
},
]
},
'Id': 'string',
'NextToken': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --*
- **Arn** *(string) --* The ARN of the subscription definition version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the subscription definition version was created.
- **Definition** *(dict) --* Information about the subscription definition version.
- **Subscriptions** *(list) --* A list of subscriptions.
- *(dict) --* Information about a subscription.
- **Id** *(string) --* A descriptive or arbitrary ID for the subscription. This value must be unique within the subscription definition version. Max length is 128 characters with pattern '[a-zA-Z0-9:_-]+'.
- **Source** *(string) --* The source of the subscription. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
- **Subject** *(string) --* The MQTT topic used to route the message.
- **Target** *(string) --* Where the message is sent to. Can be a thing ARN, a Lambda function ARN, a connector ARN, 'cloud' (which represents the AWS IoT cloud), or 'GGShadowService'.
- **Id** *(string) --* The ID of the subscription definition version.
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
- **Version** *(string) --* The version of the subscription definition version.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:type SubscriptionDefinitionVersionId: string
:param SubscriptionDefinitionVersionId: **[REQUIRED]** The ID of the subscription definition version.
:rtype: dict
:returns:
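**Example**
A sketch of filtering the returned subscriptions by target (the ``Definition`` payload below is made up; a real call would be ``client.get_subscription_definition_version(...)['Definition']``).

```python
# Hypothetical 'Definition' payload matching the response syntax above.
definition = {
    'Subscriptions': [
        {'Id': 's1',
         'Source': 'arn:aws:lambda:us-west-2:123456789012:function:telemetry',
         'Subject': 'sensors/temp', 'Target': 'cloud'},
        {'Id': 's2', 'Source': 'cloud',
         'Subject': 'commands/#', 'Target': 'GGShadowService'},
    ],
}

def topics_routed_to(definition, target):
    # MQTT subjects whose messages are routed to the given target.
    return [
        s['Subject']
        for s in definition.get('Subscriptions', [])
        if s.get('Target') == target
    ]

print(topics_routed_to(definition, 'cloud'))  # ['sensors/temp']
```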
"""
pass
def get_waiter(self, waiter_name: str = None) -> Waiter:
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters
section of the service docs for a list of available waiters.
:returns: The specified waiter object.
:rtype: botocore.waiter.Waiter
"""
pass
def list_bulk_deployment_detailed_reports(self, BulkDeploymentId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Gets a paginated list of the deployments that have been started in a bulk deployment operation, and their current deployment status.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListBulkDeploymentDetailedReports>`_
**Request Syntax**
::
response = client.list_bulk_deployment_detailed_reports(
BulkDeploymentId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Deployments': [
{
'CreatedAt': 'string',
'DeploymentArn': 'string',
'DeploymentId': 'string',
'DeploymentStatus': 'string',
'DeploymentType': 'NewDeployment'|'Redeployment'|'ResetDeployment'|'ForceResetDeployment',
'ErrorDetails': [
{
'DetailedErrorCode': 'string',
'DetailedErrorMessage': 'string'
},
],
'ErrorMessage': 'string',
'GroupArn': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the list of deployments for the given group.
- **Deployments** *(list) --* A list of the individual group deployments in the bulk deployment operation.
- *(dict) --* Information about an individual group deployment in a bulk deployment operation.
- **CreatedAt** *(string) --* The time, in ISO format, when the deployment was created.
- **DeploymentArn** *(string) --* The ARN of the group deployment.
- **DeploymentId** *(string) --* The ID of the group deployment.
- **DeploymentStatus** *(string) --* The current status of the group deployment: 'InProgress', 'Building', 'Success', or 'Failure'.
- **DeploymentType** *(string) --* The type of the deployment.
- **ErrorDetails** *(list) --* Details about the error.
- *(dict) --* Details about the error.
- **DetailedErrorCode** *(string) --* A detailed error code.
- **DetailedErrorMessage** *(string) --* A detailed error message.
- **ErrorMessage** *(string) --* The error message for a failed deployment.
- **GroupArn** *(string) --* The ARN of the Greengrass group.
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type BulkDeploymentId: string
:param BulkDeploymentId: **[REQUIRED]** The ID of the bulk deployment.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
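**Example**
A sketch of extracting error details for failed deployments from one page of results (the page below is made up; a real call would be ``client.list_bulk_deployment_detailed_reports(BulkDeploymentId='...')``, and the helper is an illustration, not part of the SDK).

```python
# Hypothetical page of the documented shape.
page = {
    'Deployments': [
        {'DeploymentId': 'd-1', 'DeploymentStatus': 'Success'},
        {'DeploymentId': 'd-2', 'DeploymentStatus': 'Failure',
         'ErrorMessage': 'access denied',
         'ErrorDetails': [{'DetailedErrorCode': '403',
                           'DetailedErrorMessage': 'missing permission'}]},
    ],
}

def failed_deployments(page):
    # Map deployment ID -> detailed error messages for failed deployments.
    return {
        d['DeploymentId']: [e['DetailedErrorMessage']
                            for e in d.get('ErrorDetails', [])]
        for d in page.get('Deployments', [])
        if d.get('DeploymentStatus') == 'Failure'
    }

print(failed_deployments(page))  # {'d-2': ['missing permission']}
```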
"""
pass
def list_bulk_deployments(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Returns a list of bulk deployments.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListBulkDeployments>`_
**Request Syntax**
::
response = client.list_bulk_deployments(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'BulkDeployments': [
{
'BulkDeploymentArn': 'string',
'BulkDeploymentId': 'string',
'CreatedAt': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the list of bulk deployments.
- **BulkDeployments** *(list) --* A list of bulk deployments.
- *(dict) --* Information about a bulk deployment. You cannot start a new bulk deployment while another one is still running or in a non-terminal state.
- **BulkDeploymentArn** *(string) --* The ARN of the bulk deployment.
- **BulkDeploymentId** *(string) --* The ID of the bulk deployment.
- **CreatedAt** *(string) --* The time, in ISO format, when the deployment was created.
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_connector_definition_versions(self, ConnectorDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a connector definition, which are containers for connectors. Connectors run on the Greengrass core and contain built-in integration with local infrastructure, device protocols, AWS, and other cloud services.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListConnectorDefinitionVersions>`_
**Request Syntax**
::
response = client.list_connector_definition_versions(
ConnectorDefinitionId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
- **Versions** *(list) --* A list of versions.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_connector_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of connector definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListConnectorDefinitions>`_
**Request Syntax**
::
response = client.list_connector_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Definitions** *(list) --* A list of definitions.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_core_definition_versions(self, CoreDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a core definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListCoreDefinitionVersions>`_
**Request Syntax**
::
response = client.list_core_definition_versions(
CoreDefinitionId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
- **Versions** *(list) --* A list of versions.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_core_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of core definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListCoreDefinitions>`_
**Request Syntax**
::
response = client.list_core_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Definitions** *(list) --* A list of definitions.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_deployments(self, GroupId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Returns a history of deployments for the group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListDeployments>`_
**Request Syntax**
::
response = client.list_deployments(
GroupId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Deployments': [
{
'CreatedAt': 'string',
'DeploymentArn': 'string',
'DeploymentId': 'string',
'DeploymentType': 'NewDeployment'|'Redeployment'|'ResetDeployment'|'ForceResetDeployment',
'GroupArn': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the list of deployments for the given group.
- **Deployments** *(list) --* A list of deployments for the requested groups.
- *(dict) --* Information about a deployment.
- **CreatedAt** *(string) --* The time, in milliseconds since the epoch, when the deployment was created.
- **DeploymentArn** *(string) --* The ARN of the deployment.
- **DeploymentId** *(string) --* The ID of the deployment.
- **DeploymentType** *(string) --* The type of the deployment.
- **GroupArn** *(string) --* The ARN of the group for this deployment.
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_device_definition_versions(self, DeviceDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a device definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListDeviceDefinitionVersions>`_
**Request Syntax**
::
response = client.list_device_definition_versions(
DeviceDefinitionId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
- **Versions** *(list) --* A list of versions.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_device_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of device definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListDeviceDefinitions>`_
**Request Syntax**
::
response = client.list_device_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Definitions** *(list) --* A list of definitions.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_function_definition_versions(self, FunctionDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a Lambda function definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListFunctionDefinitionVersions>`_
**Request Syntax**
::
response = client.list_function_definition_versions(
FunctionDefinitionId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --* success
- **NextToken** *(string) --* The token for the next set of results, or 'null' if there are no additional results.
- **Versions** *(list) --* A list of versions.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or 'null' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_function_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of Lambda function definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListFunctionDefinitions>`_
**Request Syntax**
::
response = client.list_function_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --* Success. The response contains the IDs of all the Greengrass Lambda function definitions in this account.
- **Definitions** *(list) --* Information about a definition.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_group_certificate_authorities(self, GroupId: str) -> Dict:
"""
Retrieves the current CAs for a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListGroupCertificateAuthorities>`_
**Request Syntax**
::
response = client.list_group_certificate_authorities(
GroupId='string'
)
**Response Syntax**
::
{
'GroupCertificateAuthorities': [
{
'GroupCertificateAuthorityArn': 'string',
'GroupCertificateAuthorityId': 'string'
},
]
}
**Response Structure**
- *(dict) --* Success. The response body contains the PKI Configuration.
- **GroupCertificateAuthorities** *(list) --* A list of certificate authorities associated with the group.
- *(dict) --* Information about a certificate authority for a group.
- **GroupCertificateAuthorityArn** *(string) --* The ARN of the certificate authority for the group.
- **GroupCertificateAuthorityId** *(string) --* The ID of the certificate authority for the group.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def list_group_versions(self, GroupId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListGroupVersions>`_
**Request Syntax**
::
response = client.list_group_versions(
GroupId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --* Success. The response contains the list of versions and metadata for the given group.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Versions** *(list) --* Information about a version.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_groups(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of groups.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListGroups>`_
**Request Syntax**
::
response = client.list_groups(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Groups': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Groups** *(list) --* Information about a group.
- *(dict) --* Information about a group.
- **Arn** *(string) --* The ARN of the group.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the group was created.
- **Id** *(string) --* The ID of the group.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the group was last updated.
- **LatestVersion** *(string) --* The latest version of the group.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the group.
- **Name** *(string) --* The name of the group.
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_logger_definition_versions(self, LoggerDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a logger definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListLoggerDefinitionVersions>`_
**Request Syntax**
::
response = client.list_logger_definition_versions(
LoggerDefinitionId='string',
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Versions** *(list) --* Information about a version.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_logger_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of logger definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListLoggerDefinitions>`_
**Request Syntax**
::
response = client.list_logger_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Definitions** *(list) --* Information about a definition.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_resource_definition_versions(self, ResourceDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a resource definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListResourceDefinitionVersions>`_
**Request Syntax**
::
response = client.list_resource_definition_versions(
MaxResults='string',
NextToken='string',
ResourceDefinitionId='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --* success
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Versions** *(list) --* Information about a version.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:rtype: dict
:returns:
"""
pass
def list_resource_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of resource definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListResourceDefinitions>`_
**Request Syntax**
::
response = client.list_resource_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --* The IDs of all the Greengrass resource definitions in this account.
- **Definitions** *(list) --* Information about a definition.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_subscription_definition_versions(self, SubscriptionDefinitionId: str, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Lists the versions of a subscription definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListSubscriptionDefinitionVersions>`_
**Request Syntax**
::
response = client.list_subscription_definition_versions(
MaxResults='string',
NextToken='string',
SubscriptionDefinitionId='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Versions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'Version': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
- **Versions** *(list) --* Information about a version.
- *(dict) --* Information about a version.
- **Arn** *(string) --* The ARN of the version.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the version was created.
- **Id** *(string) --* The ID of the version.
- **Version** *(string) --* The unique ID of the version.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:rtype: dict
:returns:
"""
pass
def list_subscription_definitions(self, MaxResults: str = None, NextToken: str = None) -> Dict:
"""
Retrieves a list of subscription definitions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListSubscriptionDefinitions>`_
**Request Syntax**
::
response = client.list_subscription_definitions(
MaxResults='string',
NextToken='string'
)
**Response Syntax**
::
{
'Definitions': [
{
'Arn': 'string',
'CreationTimestamp': 'string',
'Id': 'string',
'LastUpdatedTimestamp': 'string',
'LatestVersion': 'string',
'LatestVersionArn': 'string',
'Name': 'string',
'Tags': {
'string': 'string'
}
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Definitions** *(list) --* Information about a definition.
- *(dict) --* Information about a definition.
- **Arn** *(string) --* The ARN of the definition.
- **CreationTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was created.
- **Id** *(string) --* The ID of the definition.
- **LastUpdatedTimestamp** *(string) --* The time, in milliseconds since the epoch, when the definition was last updated.
- **LatestVersion** *(string) --* The latest version of the definition.
- **LatestVersionArn** *(string) --* The ARN of the latest version of the definition.
- **Name** *(string) --* The name of the definition.
- **Tags** *(dict) --* The tags for the definition.
- *(string) --*
- *(string) --*
- **NextToken** *(string) --* The token for the next set of results, or ''null'' if there are no additional results.
:type MaxResults: string
:param MaxResults: The maximum number of results to be returned per request.
:type NextToken: string
:param NextToken: The token for the next set of results, or \'\'null\'\' if there are no additional results.
:rtype: dict
:returns:
"""
pass
def list_tags_for_resource(self, ResourceArn: str) -> Dict:
"""
Retrieves the tags for a resource.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ListTagsForResource>`_
**Request Syntax**
::
response = client.list_tags_for_resource(
ResourceArn='string'
)
**Response Syntax**
::
{
'tags': {
'string': 'string'
}
}
**Response Structure**
- *(dict) --*
- **tags** *(dict) --* A map of the key-value pairs for the resource tag.
- *(string) --*
- *(string) --*
:type ResourceArn: string
:param ResourceArn: **[REQUIRED]** The Amazon Resource Name (ARN) of the resource.
:rtype: dict
:returns:
"""
pass
def reset_deployments(self, GroupId: str, AmznClientToken: str = None, Force: bool = None) -> Dict:
"""
Resets a group's deployments.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/ResetDeployments>`_
**Request Syntax**
::
response = client.reset_deployments(
AmznClientToken='string',
Force=True|False,
GroupId='string'
)
**Response Syntax**
::
{
'DeploymentArn': 'string',
'DeploymentId': 'string'
}
**Response Structure**
- *(dict) --* Success. The group's deployments were reset.
- **DeploymentArn** *(string) --* The ARN of the deployment.
- **DeploymentId** *(string) --* The ID of the deployment.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type Force: boolean
:param Force: If true, performs a best-effort only core reset.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def start_bulk_deployment(self, AmznClientToken: str = None, ExecutionRoleArn: str = None, InputFileUri: str = None, tags: Dict = None) -> Dict:
"""
Deploys multiple groups in one operation. This action starts the bulk deployment of a specified set of group versions. Each group version deployment will be triggered with an adaptive rate that has a fixed upper limit. We recommend that you include an ''X-Amzn-Client-Token'' token in every ''StartBulkDeployment'' request. These requests are idempotent with respect to the token and the request parameters.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/StartBulkDeployment>`_
**Request Syntax**
::
response = client.start_bulk_deployment(
AmznClientToken='string',
ExecutionRoleArn='string',
InputFileUri='string',
tags={
'string': 'string'
}
)
**Response Syntax**
::
{
'BulkDeploymentArn': 'string',
'BulkDeploymentId': 'string'
}
**Response Structure**
- *(dict) --* success
- **BulkDeploymentArn** *(string) --* The ARN of the bulk deployment.
- **BulkDeploymentId** *(string) --* The ID of the bulk deployment.
:type AmznClientToken: string
:param AmznClientToken: A client token used to correlate requests and responses.
:type ExecutionRoleArn: string
:param ExecutionRoleArn: The ARN of the execution role to associate with the bulk deployment operation. This IAM role must allow the \'\'greengrass:CreateDeployment\'\' action for all group versions that are listed in the input file. This IAM role must have access to the S3 bucket containing the input file.
:type InputFileUri: string
:param InputFileUri: The URI of the input file contained in the S3 bucket. The execution role must have \'\'getObject\'\' permissions on this bucket to access the input file. The input file is a JSON-serialized, line-delimited file with UTF-8 encoding that provides a list of group and version IDs and the deployment type. This file must be less than 100 MB. Currently, AWS IoT Greengrass supports only \'\'NewDeployment\'\' deployment types.
:type tags: dict
:param tags: Tag(s) to add to the new resource.
- *(string) --*
- *(string) --*
:rtype: dict
:returns:
"""
pass
def stop_bulk_deployment(self, BulkDeploymentId: str) -> Dict:
"""
Stops the execution of a bulk deployment. This action returns a status of ''Stopping'' until the deployment is stopped. You cannot start a new bulk deployment while a previous deployment is in the ''Stopping'' state. This action doesn't roll back completed deployments or cancel pending deployments.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/StopBulkDeployment>`_
**Request Syntax**
::
response = client.stop_bulk_deployment(
BulkDeploymentId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* Success. The bulk deployment is being stopped.
:type BulkDeploymentId: string
:param BulkDeploymentId: **[REQUIRED]** The ID of the bulk deployment.
:rtype: dict
:returns:
"""
pass
def tag_resource(self, ResourceArn: str, tags: Dict):
"""
Add tags to a resource.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/TagResource>`_
**Request Syntax**
::
response = client.tag_resource(
ResourceArn='string',
tags={
'string': 'string'
}
)
:type ResourceArn: string
:param ResourceArn: **[REQUIRED]** The Amazon Resource Name (ARN) of the resource.
:type tags: dict
:param tags: **[REQUIRED]** A map of the key-value pairs for the resource tag.
- *(string) --*
- *(string) --*
:returns: None
"""
pass
def untag_resource(self, ResourceArn: str, TagKeys: List):
"""
Remove tags with specified keys from a resource.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UntagResource>`_
**Request Syntax**
::
response = client.untag_resource(
ResourceArn='string',
TagKeys=[
'string',
]
)
:type ResourceArn: string
:param ResourceArn: **[REQUIRED]** The Amazon Resource Name (ARN) of the resource.
:type TagKeys: list
:param TagKeys: **[REQUIRED]** A list of the keys to remove from the resource tags.
- *(string) --*
:returns: None
"""
pass
def update_connectivity_info(self, ThingName: str, ConnectivityInfo: List = None) -> Dict:
"""
Updates the connectivity information for the core. Any devices that belong to the group which has this core will receive this information in order to find the location of the core and connect to it.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateConnectivityInfo>`_
**Request Syntax**
::
response = client.update_connectivity_info(
ConnectivityInfo=[
{
'HostAddress': 'string',
'Id': 'string',
'Metadata': 'string',
'PortNumber': 123
},
],
ThingName='string'
)
**Response Syntax**
::
{
'Message': 'string',
'Version': 'string'
}
**Response Structure**
- *(dict) --* success
- **Message** *(string) --* A message about the connectivity info update request.
- **Version** *(string) --* The new version of the connectivity info.
:type ConnectivityInfo: list
:param ConnectivityInfo: A list of connectivity info.
- *(dict) --* Information about a Greengrass core\'s connectivity.
- **HostAddress** *(string) --* The endpoint for the Greengrass core. Can be an IP address or DNS.
- **Id** *(string) --* The ID of the connectivity information.
- **Metadata** *(string) --* Metadata for this endpoint.
- **PortNumber** *(integer) --* The port of the Greengrass core. Usually 8883.
:type ThingName: string
:param ThingName: **[REQUIRED]** The thing name.
:rtype: dict
:returns:
"""
pass
def update_connector_definition(self, ConnectorDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a connector definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateConnectorDefinition>`_
**Request Syntax**
::
response = client.update_connector_definition(
ConnectorDefinitionId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type ConnectorDefinitionId: string
:param ConnectorDefinitionId: **[REQUIRED]** The ID of the connector definition.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_core_definition(self, CoreDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a core definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateCoreDefinition>`_
**Request Syntax**
::
response = client.update_core_definition(
CoreDefinitionId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type CoreDefinitionId: string
:param CoreDefinitionId: **[REQUIRED]** The ID of the core definition.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_device_definition(self, DeviceDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a device definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateDeviceDefinition>`_
**Request Syntax**
::
response = client.update_device_definition(
DeviceDefinitionId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type DeviceDefinitionId: string
:param DeviceDefinitionId: **[REQUIRED]** The ID of the device definition.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_function_definition(self, FunctionDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a Lambda function definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateFunctionDefinition>`_
**Request Syntax**
::
response = client.update_function_definition(
FunctionDefinitionId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type FunctionDefinitionId: string
:param FunctionDefinitionId: **[REQUIRED]** The ID of the Lambda function definition.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_group(self, GroupId: str, Name: str = None) -> Dict:
"""
Updates a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateGroup>`_
**Request Syntax**
::
response = client.update_group(
GroupId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_group_certificate_configuration(self, GroupId: str, CertificateExpiryInMilliseconds: str = None) -> Dict:
"""
Updates the certificate expiry time for a group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateGroupCertificateConfiguration>`_
**Request Syntax**
::
response = client.update_group_certificate_configuration(
CertificateExpiryInMilliseconds='string',
GroupId='string'
)
**Response Syntax**
::
{
'CertificateAuthorityExpiryInMilliseconds': 'string',
'CertificateExpiryInMilliseconds': 'string',
'GroupId': 'string'
}
**Response Structure**
- *(dict) --* Success. The response body contains the PKI Configuration.
- **CertificateAuthorityExpiryInMilliseconds** *(string) --* The amount of time remaining before the certificate authority expires, in milliseconds.
- **CertificateExpiryInMilliseconds** *(string) --* The amount of time remaining before the certificate expires, in milliseconds.
- **GroupId** *(string) --* The ID of the group certificate configuration.
:type CertificateExpiryInMilliseconds: string
:param CertificateExpiryInMilliseconds: The amount of time remaining before the certificate expires, in milliseconds.
:type GroupId: string
:param GroupId: **[REQUIRED]** The ID of the Greengrass group.
:rtype: dict
:returns:
"""
pass
def update_logger_definition(self, LoggerDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a logger definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateLoggerDefinition>`_
**Request Syntax**
::
response = client.update_logger_definition(
LoggerDefinitionId='string',
Name='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type LoggerDefinitionId: string
:param LoggerDefinitionId: **[REQUIRED]** The ID of the logger definition.
:type Name: string
:param Name: The name of the definition.
:rtype: dict
:returns:
"""
pass
def update_resource_definition(self, ResourceDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a resource definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateResourceDefinition>`_
**Request Syntax**
::
response = client.update_resource_definition(
Name='string',
ResourceDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type Name: string
:param Name: The name of the definition.
:type ResourceDefinitionId: string
:param ResourceDefinitionId: **[REQUIRED]** The ID of the resource definition.
:rtype: dict
:returns:
"""
pass
def update_subscription_definition(self, SubscriptionDefinitionId: str, Name: str = None) -> Dict:
"""
Updates a subscription definition.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/greengrass-2017-06-07/UpdateSubscriptionDefinition>`_
**Request Syntax**
::
response = client.update_subscription_definition(
Name='string',
SubscriptionDefinitionId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --* success
:type Name: string
:param Name: The name of the definition.
:type SubscriptionDefinitionId: string
:param SubscriptionDefinitionId: **[REQUIRED]** The ID of the subscription definition.
:rtype: dict
:returns:
"""
pass
# ---------------------------------------------------------------------------
# File: tests/v2_validation/cattlevalidationtest/core/test_services_volumemount.py
# Repo: rancher/validation-tests (Apache-2.0)
# ---------------------------------------------------------------------------
WEB_IMAGE_UUID = "docker:sangeetha/testlbsd:latest"
SSH_IMAGE_UUID = "docker:sangeetha/testclient:latest"
logger = logging.getLogger(__name__)
def env_with_2_svc_and_volume_mount_with_config(client, service_scale,
launch_config_consumed_service,
launch_config_service):
# Create Environment
random_name = random_str()
env_name = random_name.replace("-", "")
env = client.create_stack(name=env_name)
env = client.wait_success(env)
assert env.state == "active"
# Create service
random_name = random_str()
consumed_service_name = random_name.replace("-", "")
launch_config_service["dataVolumesFromLaunchConfigs"] = \
[consumed_service_name]
launch_config_consumed_service["name"] = consumed_service_name
random_name = random_str()
service_name = random_name.replace("-", "")
service = client.create_service(
name=service_name, stackId=env.id,
launchConfig=launch_config_service, scale=service_scale,
secondaryLaunchConfigs=[launch_config_consumed_service])
service = client.wait_success(service)
assert service.state == "inactive"
consumed_service_name = \
get_sidekick_service_name(env, service, consumed_service_name)
service_name = get_service_name(env, service)
return env, service, service_name, consumed_service_name

def create_env_with_2_svc_and_volume_mount(client, service_scale):
    launch_config_consumed_service = {
        "imageUuid": WEB_IMAGE_UUID}
    launch_config_service = {
        "imageUuid": SSH_IMAGE_UUID}
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount_with_config(
            client, service_scale,
            launch_config_consumed_service, launch_config_service)
    return env, service, service_name, consumed_service_name

def create_env_with_multiple_svcs_and_volume_mounts(
        client, service_scale):
    launch_config_consumed_service1 = {
        "imageUuid": "docker:redis"}
    launch_config_consumed_service2 = {
        "imageUuid": WEB_IMAGE_UUID}
    launch_config_service = {
        "imageUuid": SSH_IMAGE_UUID}

    random_name = random_str()
    consumed_service_name1 = random_name.replace("-", "")
    random_name = random_str()
    consumed_service_name2 = random_name.replace("-", "")

    launch_config_service["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name1, consumed_service_name2]
    launch_config_consumed_service1["name"] = consumed_service_name1
    launch_config_consumed_service2["name"] = consumed_service_name2

    # Create Environment
    random_name = random_str()
    env_name = random_name.replace("-", "")
    env = client.create_stack(name=env_name)
    env = client.wait_success(env)
    assert env.state == "active"

    # Create service
    random_name = random_str()
    service_name = random_name.replace("-", "")
    service = client.create_service(
        name=service_name, stackId=env.id,
        launchConfig=launch_config_service, scale=service_scale,
        secondaryLaunchConfigs=[launch_config_consumed_service1,
                                launch_config_consumed_service2])
    service = client.wait_success(service)
    assert service.state == "inactive"

    consumed_service_name1 = \
        get_sidekick_service_name(env, service, consumed_service_name1)
    consumed_service_name2 = \
        get_sidekick_service_name(env, service, consumed_service_name2)
    service_name = get_service_name(env, service)
    return env, service, service_name, \
        [consumed_service_name1, consumed_service_name2]

def create_env_with_multiple_levels_svcs_and_volume_mounts(
        client, service_scale):
    launch_config_consumed_service1 = {
        "imageUuid": "docker:redis"}
    launch_config_consumed_service2 = {
        "imageUuid": WEB_IMAGE_UUID}
    launch_config_service = {
        "imageUuid": SSH_IMAGE_UUID}

    random_name = random_str()
    consumed_service_name1 = random_name.replace("-", "")
    random_name = random_str()
    consumed_service_name2 = random_name.replace("-", "")

    launch_config_service["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name1]
    launch_config_consumed_service1["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name2]
    launch_config_consumed_service1["name"] = consumed_service_name1
    launch_config_consumed_service2["name"] = consumed_service_name2

    # Create Environment
    random_name = random_str()
    env_name = random_name.replace("-", "")
    env = client.create_stack(name=env_name)
    env = client.wait_success(env)
    assert env.state == "active"

    # Create service
    random_name = random_str()
    service_name = random_name.replace("-", "")
    service = client.create_service(
        name=service_name, stackId=env.id,
        launchConfig=launch_config_service, scale=service_scale,
        secondaryLaunchConfigs=[launch_config_consumed_service1,
                                launch_config_consumed_service2])
    service = client.wait_success(service)
    assert service.state == "inactive"

    consumed_service_name1 = \
        get_sidekick_service_name(env, service, consumed_service_name1)
    consumed_service_name2 = \
        get_sidekick_service_name(env, service, consumed_service_name2)
    service_name = get_service_name(env, service)
    return \
        env, service, service_name, consumed_service_name1, \
        consumed_service_name2

def create_env_with_multiple_levels_svcs_and_volume_mounts_circular(
        client, service_scale):
    launch_config_consumed_service1 = {
        "imageUuid": "docker:redis"}
    launch_config_consumed_service2 = {
        "imageUuid": WEB_IMAGE_UUID}
    launch_config_service = {
        "imageUuid": SSH_IMAGE_UUID}

    random_name = random_str()
    consumed_service_name1 = random_name.replace("-", "")
    random_name = random_str()
    consumed_service_name2 = random_name.replace("-", "")

    # Circular volume references:
    # service -> sidekick1 -> sidekick2 -> sidekick1
    launch_config_service["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name1]
    launch_config_consumed_service1["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name2]
    launch_config_consumed_service2["dataVolumesFromLaunchConfigs"] = \
        [consumed_service_name1]
    launch_config_consumed_service1["name"] = consumed_service_name1
    launch_config_consumed_service2["name"] = consumed_service_name2

    # Create Environment
    random_name = random_str()
    env_name = random_name.replace("-", "")
    env = client.create_stack(name=env_name)
    env = client.wait_success(env)
    assert env.state == "active"

    # Create service
    random_name = random_str()
    service_name = random_name.replace("-", "")
    service = client.create_service(
        name=service_name, stackId=env.id,
        launchConfig=launch_config_service, scale=service_scale,
        secondaryLaunchConfigs=[launch_config_consumed_service1,
                                launch_config_consumed_service2])
    service = client.wait_success(service)
    assert service.state == "inactive"

    consumed_service_name1 = \
        get_sidekick_service_name(env, service, consumed_service_name1)
    consumed_service_name2 = \
        get_sidekick_service_name(env, service, consumed_service_name2)
    service_name = get_service_name(env, service)
    return \
        env, service, service_name, consumed_service_name1, \
        consumed_service_name2

def env_with_2_svc_and_volume_mount(client, service_scale):
    env, service, service_name, consumed_service_name = \
        create_env_with_2_svc_and_volume_mount(
            client, service_scale)
    env = env.activateservices()
    env = client.wait_success(env, 120)
    assert env.state == "active"
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    return env, service, service_name, consumed_service_name

def test_volume_mount_activate_env(client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        create_env_with_2_svc_and_volume_mount(client, service_scale)
    env = env.activateservices()
    env = client.wait_success(env, 120)
    assert env.state == "active"
    service = client.wait_success(service, 120)
    assert service.state == "active"
    delete_all(client, [env])


def test_volume_mount_activate_service(client,
                                       socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        create_env_with_2_svc_and_volume_mount(client, service_scale)
    service = service.activate()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_multiple_volume_mount_activate_service(client,
                                                socat_containers):
    service_scale = 2
    env, service, service_name, consumed_services = \
        create_env_with_multiple_svcs_and_volume_mounts(
            client, service_scale)
    env = env.activateservices()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(
        client, service, service_name, consumed_services)
    delete_all(client, [env])


def test_multiple_level_volume_mount_activate_service(client,
                                                      socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service1, consumed_service2 = \
        create_env_with_multiple_levels_svcs_and_volume_mounts(
            client, service_scale)
    env = env.activateservices()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service1])
    validate_volume_mount(client, service, consumed_service1,
                          [consumed_service2])
    delete_all(client, [env])

def test_multiple_level_volume_mount_delete_services_1(client,
                                                       socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service1, consumed_service2 = \
        create_env_with_multiple_levels_svcs_and_volume_mounts(
            client, service_scale)
    env = env.activateservices()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service1])
    validate_volume_mount(client, service, consumed_service1,
                          [consumed_service2])

    # Delete container from consumed_service2
    container_name = consumed_service2 + FIELD_SEPARATOR + "1"
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]
    print(container_name)
    consumed1_container = get_side_kick_container(
        client, container, service, consumed_service1)
    primary_container = get_side_kick_container(
        client, container, service, service_name)

    # Delete instance
    container = client.wait_success(client.delete(container))
    assert container.state == 'removed'

    # Wait for both the consuming containers to be removed
    print(consumed1_container.name + " - " + consumed1_container.state)
    wait_for_condition(
        client, consumed1_container,
        lambda x: x.state == "removed",
        lambda x: 'State is: ' + x.state)
    consumed1_container = client.reload(consumed1_container)
    assert consumed1_container.state == "removed"
    print(consumed1_container.name + " - " + consumed1_container.state)

    print(primary_container.name + " - " + primary_container.state)
    wait_for_condition(
        client, primary_container,
        lambda x: x.state == "removed",
        lambda x: 'State is: ' + x.state)
    primary_container = client.reload(primary_container)
    assert primary_container.state == "removed"
    print(primary_container.name + " - " + primary_container.state)

    wait_state(client, service, "active")
    validate_volume_mount(client, service, service_name,
                          [consumed_service1])
    validate_volume_mount(client, service, consumed_service1,
                          [consumed_service2])
    delete_all(client, [env])

def test_multiple_level_volume_mount_delete_services_2(client,
                                                       socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service1, consumed_service2 = \
        create_env_with_multiple_levels_svcs_and_volume_mounts(
            client, service_scale)
    env = env.activateservices()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service1])
    validate_volume_mount(client, service, consumed_service1,
                          [consumed_service2])

    # Delete container from consumed_service1
    container_name = consumed_service1 + FIELD_SEPARATOR + "1"
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]
    print(container_name)
    consumed2_container = get_side_kick_container(
        client, container, service, consumed_service2)
    print(consumed2_container.name)
    primary_container = get_side_kick_container(
        client, container, service, service_name)
    print(primary_container.name)

    # Delete instance
    container = client.wait_success(client.delete(container))
    assert container.state == 'removed'

    wait_state(client, service, "active")
    # Wait for the primary (consuming) container to be removed
    wait_for_condition(client, primary_container,
                       lambda x: x.state == "removed",
                       lambda x: 'State is: ' + x.state)
    validate_volume_mount(client, service, service_name,
                          [consumed_service1])
    validate_volume_mount(client, service, consumed_service1,
                          [consumed_service2])

    # Check that the consuming container of the deleted instance is
    # recreated, while the consumed container of the deleted instance
    # continues to be in running state
    consumed2_container = client.reload(consumed2_container)
    print(consumed2_container.state)
    assert consumed2_container.state == "running"
    delete_all(client, [env])

def test_multiple_level_volume_mount_activate_service_circular(client):
    service_scale = 2
    try:
        # The creator returns five values; circular volume references are
        # expected to be rejected before this assignment completes.
        env, service, service_name, consumed_service1, consumed_service2 = \
            create_env_with_multiple_levels_svcs_and_volume_mounts_circular(
                client, service_scale)
    except Exception as e1:
        assert e1.error.code == "InvalidReference"
        assert e1.error.status == 422

def test_volume_mount_service_scale_up(client, socat_containers):
    service_scale = 2
    final_service_scale = 3
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    service = client.update(service, scale=final_service_scale,
                            name=service.name)
    service = client.wait_success(service, 120)
    assert service.state == "active"
    assert service.scale == final_service_scale
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_service_scale_down(client,
                                         socat_containers):
    service_scale = 4
    final_service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    service = client.update(service, scale=final_service_scale,
                            name=service.name)
    service = client.wait_success(service, 120)
    assert service.state == "active"
    assert service.scale == final_service_scale
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_consumed_services_stop_start_instance(
        client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = consumed_service_name + FIELD_SEPARATOR + "2"
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]

    # Stop instance
    stop_container_from_host(client, container)
    wait_state(client, service, "active")
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_consumed_services_restart_instance(
        client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = consumed_service_name + FIELD_SEPARATOR + "2"
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]

    # Restart instance
    container = client.wait_success(container.restart(), 120)
    assert container.state == 'running'
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])

def test_volume_mount_consumed_services_delete_instance(
        client, socat_containers):
    service_scale = 3
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = consumed_service_name + FIELD_SEPARATOR + "1"
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]
    print(container_name)
    primary_container = get_side_kick_container(
        client, container, service, service_name)
    print(primary_container.name)

    # Delete instance
    container = client.wait_success(client.delete(container))
    assert container.state == 'removed'

    wait_state(client, service, "active")
    # Wait for the primary (consuming) container to be removed
    wait_for_condition(client, primary_container,
                       lambda x: x.state == "removed",
                       lambda x: 'State is: ' + x.state)
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_deactivate_activate_environment(client,
                                                      socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)

    env = env.deactivateservices()
    service = client.wait_success(service, 120)
    assert service.state == "inactive"
    wait_until_instances_get_stopped_for_service_with_sec_launch_configs(
        client, service)

    env = env.activateservices()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    time.sleep(restart_sleep_interval)

    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])

def test_volume_mount_services_stop_start_instance(
        client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = get_container_name(env, service, "2")
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]

    # Stop instance
    stop_container_from_host(client, container)
    wait_state(client, service, "active")
    time.sleep(restart_sleep_interval)
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_services_restart_instance(client,
                                                socat_containers):
    service_scale = 3
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = get_container_name(env, service, "2")
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]

    # Restart instance
    container = client.wait_success(container.restart(), 120)
    assert container.state == 'running'
    time.sleep(restart_sleep_interval)
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_services_delete_instance(
        client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)
    container_name = get_container_name(env, service, "1")
    containers = client.list_container(name=container_name).data
    assert len(containers) == 1
    container = containers[0]
    print(container_name)
    consumed_container = get_side_kick_container(
        client, container, service, consumed_service_name)
    print(consumed_container.name)

    # Delete instance
    container = client.wait_success(client.delete(container))
    assert container.state == 'removed'

    wait_state(client, service, "active")
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])

    # Check that the consumed container is not recreated
    consumed_container = client.reload(consumed_container)
    print(consumed_container.state)
    assert consumed_container.state == "running"
    delete_all(client, [env])

def test_volume_mount_services_deactivate_activate(
        client, socat_containers):
    service_scale = 2
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount(client, service_scale)

    service = service.deactivate()
    service = client.wait_success(service, 120)
    assert service.state == "inactive"
    wait_until_instances_get_stopped_for_service_with_sec_launch_configs(
        client, service)

    service = service.activate()
    service = client.wait_success(service, 120)
    assert service.state == "active"
    time.sleep(restart_sleep_interval)

    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])


def test_volume_mount_with_start_once(client, socat_containers):
    launch_config_consumed_service = {
        "imageUuid": WEB_IMAGE_UUID,
        "labels": {"io.rancher.container.start_once": True}}
    launch_config_service = {
        "imageUuid": SSH_IMAGE_UUID}
    env, service, service_name, consumed_service_name = \
        env_with_2_svc_and_volume_mount_with_config(
            client, 10,
            launch_config_consumed_service, launch_config_service)
    service = service.activate()
    service = client.wait_success(service, 180)
    assert service.state == "active"
    validate_volume_mount(client, service, service_name,
                          [consumed_service_name])
    delete_all(client, [env])

def get_service_container_name_list(client, service, name):
    container_extids = []
    containers = get_service_containers_with_name(client, service, name)
    for container in containers:
        container_extids.append(container.externalId)
    return container_extids

def validate_volume_mount(
        client, primary_service, service, consumed_services):
    print("Validating service - " + service)
    containers = get_service_containers_with_name(client,
                                                  primary_service,
                                                  service)
    assert len(containers) == primary_service.scale

    consolidated_container_list = []
    mounted_container_names = []
    volumes_from_list = []
    for consumed_service_name in consumed_services:
        print("Validating Consumed Services: " + consumed_service_name)
        mounted_containers = get_service_container_name_list(
            client, primary_service, consumed_service_name)
        assert len(mounted_containers) == primary_service.scale
        for mounted_container in mounted_containers:
            mounted_container_names.append(mounted_container)
        consolidated_container_list.append(mounted_containers)

    print("All container lists: " + str(consolidated_container_list))
    print("All containers: " + str(mounted_container_names))

    # For every container in the service, make sure that there is exactly
    # one mounted container volume from each of the consumed services
    for con in containers:
        host = client.by_id('host', con.hosts[0].id)
        docker_client = get_docker_client(host)
        inspect = docker_client.inspect_container(con.externalId)
        volumeFrom = inspect["HostConfig"]["VolumesFrom"]
        print(con.name + "->" + str(volumeFrom))
        assert volumeFrom is not None
        assert len(volumeFrom) == len(consumed_services)

        container_list = consolidated_container_list[:]
        container_names = mounted_container_names[:]
        # Check that there is exactly one entry from each of the
        # consumed services
        for volume in volumeFrom:
            volumes_from_list.append(volume)
            found = False
            for volume_list in container_list:
                if volume in volume_list:
                    container_list.remove(volume_list)
                    found = True
            if not found:
                error_string = \
                    str(volume) + " is not in " + str(container_list)
                assert False, error_string
            # Make sure that the container is on the same host by
            # inspecting it
            inspect = docker_client.inspect_container(volume)
            assert inspect is not None

    # Check that the volumes occur only once in the consolidated list of
    # containers
    for volume in volumes_from_list:
        found = False
        for container in container_names:
            if volume == container:
                container_names.remove(container)
                found = True
        if not found:
            error_string = \
                str(volume) + " is not in " + str(container_names)
            assert False, error_string
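
# The pairing rule enforced above ("exactly one VolumesFrom entry drawn
# from each consumed service, with no duplicates") can be exercised
# without a Rancher deployment. The sketch below is a simplified,
# pure-Python restatement of that rule; check_volumes_from is a name
# invented for this illustration, not a helper from this suite.

```python
def check_volumes_from(volumes_from, per_service_containers):
    """Return True if volumes_from holds exactly one container id
    from each consumed service's container list."""
    remaining = [list(lst) for lst in per_service_containers]
    for volume in volumes_from:
        matched = [lst for lst in remaining if volume in lst]
        if not matched:
            return False  # id belongs to no consumed service
        remaining.remove(matched[0])  # each service may match only once
    return not remaining  # every consumed service was matched


# One container id drawn from each of two sidekick services: valid.
print(check_volumes_from(["c1", "d2"], [["c1", "c2"], ["d1", "d2"]]))  # True
# Two ids from the same service: invalid.
print(check_volumes_from(["c1", "c2"], [["c1", "c2"], ["d1", "d2"]]))  # False
```

# This mirrors the copy-and-remove bookkeeping that validate_volume_mount
# performs with container_list, without needing a docker client.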


# tests/flow/test_ts_delete.py
# (wcastello/RedisTimeSeries, BSD-3-Clause / MIT)
] | null | null | null | from RLTest import Env
import pytest
import redis
from test_helper_classes import _get_ts_info
from includes import *

def test_ts_del_uncompressed():
    # total samples = 101
    sample_len = 101
    with Env().getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key', 'uncompressed')

        for i in range(sample_len):
            assert i == r.execute_command("ts.add", 'test_key', i, '1')

        res = r.execute_command('ts.range', 'test_key', 0, 100)
        i = 0
        for sample in res:
            assert sample == [i, '1'.encode('ascii')]
            i += 1

        r.execute_command('ts.del', 'test_key', 0, 100)
        res = r.execute_command('ts.range', 'test_key', 0, 100)
        assert len(res) == 0


def test_ts_del_uncompressed_in_range():
    sample_len = 101
    with Env().getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key', 'uncompressed')

        for i in range(sample_len):
            assert i == r.execute_command("ts.add", 'test_key', i, '1')

        res = r.execute_command('ts.range', 'test_key', 0, 100)
        i = 0
        for sample in res:
            assert sample == [i, '1'.encode('ascii')]
            i += 1

        # delete 11 samples
        assert 11 == r.execute_command('ts.del', 'test_key', 50, 60)
        res = r.execute_command('ts.range', 'test_key', 0, 100)
        assert len(res) == 90


def test_ts_del_compressed():
    sample_len = 101
    with Env().getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key')

        for i in range(sample_len):
            assert i == r.execute_command("ts.add", 'test_key', i, '1')

        res = r.execute_command('ts.range', 'test_key', 0, 100)
        i = 0
        for sample in res:
            assert sample == [i, '1'.encode('ascii')]
            i += 1

        assert sample_len == r.execute_command('ts.del', 'test_key', 0, 100)
        res = r.execute_command('ts.range', 'test_key', 0, 100)
        assert len(res) == 0

def test_ts_del_multi_chunk():
    for CHUNK_TYPE in ["compressed", "uncompressed"]:
        sample_len = 1
        e = Env()
        with e.getClusterConnectionIfNeeded() as r:
            r.execute_command("ts.create", 'test_key', CHUNK_TYPE)
            while _get_ts_info(r, 'test_key').chunk_count < 2:
                assert sample_len == r.execute_command("ts.add", 'test_key',
                                                       sample_len, '1')
                sample_len = sample_len + 1
            sample_len = sample_len - 1

            res = r.execute_command('ts.range', 'test_key', 0, sample_len - 1)
            i = 1
            for sample in res:
                e.assertEqual(sample, [i, '1'.encode('ascii')])
                i += 1

            assert sample_len - 1 == r.execute_command('ts.del', 'test_key',
                                                       0, sample_len - 1)
            res = r.execute_command('ts.range', 'test_key', 0, sample_len)
            e.assertEqual(_get_ts_info(r, 'test_key').chunk_count, 1)
            e.assertEqual(len(res), 1)
        e.flush()

def test_ts_del_compressed_out_range():
    sample_len = 101
    with Env().getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key')

        for i in range(sample_len):
            assert i + 100 == r.execute_command("ts.add", 'test_key',
                                                i + 100, '1')

        res = r.execute_command('ts.range', 'test_key',
                                0 + 100, sample_len + 100 - 1)
        i = 0
        for sample in res:
            assert sample == [i + 100, '1'.encode('ascii')]
            i += 1

        assert sample_len == r.execute_command('ts.del', 'test_key', 0, 500)
        res = r.execute_command('ts.range', 'test_key',
                                0 + 100, sample_len + 100 - 1)
        assert len(res) == 0

def test_bad_del(self):
    with Env().getClusterConnectionIfNeeded() as r:
        # Deleting from a key that does not exist yet is an error
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("ts.del", "test_key", 100, 200)

        r.execute_command("ts.add", 'test_key', 120, '1')
        r.execute_command("ts.add", 'test_key', 140, '5')

        # Missing "to" timestamp is an error
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("ts.del", "test_key", 100)

        dump = r.execute_command("dump", "test_key")
        assert r.execute_command("restore", "test_key2", "0", dump)

        # Non-numeric "to" timestamp is an error
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("ts.del", "test_key", 100, '200a')

        assert r.execute_command("ts.del", "test_key", 200, 100) == 0
        assert r.execute_command("ts.del", "test_key", 100, 300) == 2
        assert r.execute_command("ts.del", "test_key2", 100, 300) == 2

        # TS.DEL on a key that is not a time series is an error
        self.assertTrue(r.execute_command("SET", "BAD_X", "NOT_TS"))
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("TS.DEL", "BAD_X", 100, 200)

def test_del_retention_with_rules(self):
    sample_len = 1010
    with Env().getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key_2', 'RETENTION', 1,
                          'compressed')
        r.execute_command("ts.create", 'test_key_3', 'compressed')
        r.execute_command('ts.createrule', 'test_key_2', 'test_key_3',
                          'AGGREGATION', 'avg', 10)
        for i in range(sample_len):
            assert i == r.execute_command("ts.add", 'test_key_2', i, 1)
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("ts.del", "test_key_2", 1, 10)

        r.execute_command("ts.create", 'test_key_4{4}', 'RETENTION', 25,
                          'compressed')
        r.execute_command("ts.create", 'test_key_5{4}', 'compressed')
        r.execute_command('ts.createrule', 'test_key_4{4}', 'test_key_5{4}',
                          'AGGREGATION', 'avg', 10)
        for i in range(30):
            assert i == r.execute_command("ts.add", 'test_key_4{4}', i, 1)
        with pytest.raises(redis.ResponseError) as excinfo:
            r.execute_command("ts.del", "test_key_4{4}", 9, 10)

def test_del_with_rules(self):
    sample_len = 1010
    e = Env()
    with e.getClusterConnectionIfNeeded() as r:
        r.execute_command("ts.create", 'test_key_2', 'RETENTION', 5000,
                          'compressed')
        r.execute_command("ts.create", 'test_key_3', 'compressed')
        r.execute_command('ts.createrule', 'test_key_2', 'test_key_3',
                          'AGGREGATION', 'sum', 10)
        for i in range(70):
            assert i == r.execute_command("ts.add", 'test_key_2', i, 1)
        for i in range(80, sample_len):
            assert i == r.execute_command("ts.add", 'test_key_2', i, 1)

        res = r.execute_command('ts.range', 'test_key_3', 0, 9)
        e.assertEqual(len(res), 1)
        assert res[0] == [0, b'10']
        assert r.execute_command("ts.del", "test_key_2", 0, 9) == 10
        res = r.execute_command('ts.range', 'test_key_3', 0, 9)
        e.assertEqual(len(res), 0)

        res = r.execute_command('ts.range', 'test_key_3', 10, 19)
        e.assertEqual(len(res), 1)
        assert res[0] == [10, b'10']
        assert r.execute_command("ts.del", "test_key_2", 12, 14) == 3
        res = r.execute_command('ts.range', 'test_key_3', 10, 19)
        e.assertEqual(len(res), 1)
        assert res[0] == [10, b'7']

        res = r.execute_command('ts.range', 'test_key_3', 20, 39)
        e.assertEqual(len(res), 2)
        assert res == [[20, b'10'], [30, b'10']]
        assert r.execute_command("ts.del", "test_key_2", 28, 31) == 4
        res = r.execute_command('ts.range', 'test_key_3', 20, 39)
        e.assertEqual(len(res), 2)
        assert res == [[20, b'8'], [30, b'8']]

        res = r.execute_command('ts.range', 'test_key_3', 50, 89)
        e.assertEqual(len(res), 3)
        assert res == [[50, b'10'], [60, b'10'], [80, b'10']]
        # Tests an empty end bucket which is not the latest bucket:
        assert r.execute_command("ts.del", "test_key_2", 58, 79) == 12
        assert r.execute_command("ts.del", "test_key_2", 80, 81) == 2
        res = r.execute_command('ts.range', 'test_key_3', 50, 89)
        e.assertEqual(len(res), 2)
        assert res == [[50, b'8'], [80, b'8']]

        res = r.execute_command('ts.range', 'test_key_3', 990, 1009)
        e.assertEqual(len(res), 1)
        assert res[0] == [990, b'10']
        assert r.execute_command("ts.del", "test_key_2", 995, 1002) == 8
        res = r.execute_command('ts.range', 'test_key_3', 990, 1009)
        e.assertEqual(len(res), 1)
        assert res == [[990, b'5']]
        assert 1010 == r.execute_command("ts.add", 'test_key_2', 1010, 1)
        res = r.execute_command('ts.range', 'test_key_3', 990, 1009)
        e.assertEqual(len(res), 2)
        assert res == [[990, b'5'], [1000, b'7']]

        # #### delete a whole chunk of a rule's source series ####
        r.execute_command("ts.create", 'test_key_{4}', 'RETENTION', 5000,
                          'CHUNK_SIZE', '1024', 'compressed')
        r.execute_command("ts.create", 'test_key_{4}_agg', 'CHUNK_SIZE',
                          '1024', 'compressed')
        r.execute_command('ts.createrule', 'test_key_{4}', 'test_key_{4}_agg',
                          'AGGREGATION', 'sum', 10)
        for i in range(2070):
            assert i == r.execute_command("ts.add", 'test_key_{4}', i, 1)
        res = r.execute_command('ts.range', 'test_key_{4}_agg', 1010, 2059)
        e.assertEqual(len(res), 105)
        assert r.execute_command("ts.del", "test_key_{4}", 1019, 2050) == 1032
        res = r.execute_command('ts.range', 'test_key_{4}_agg', 1010, 2059)
        e.assertEqual(len(res), 2)
        assert res == [[1010, b'9'], [2050, b'9']]
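
# The TS.DEL semantics these tests rely on (both the "from" and "to"
# timestamps are part of the deleted range, and the reply is the number
# of samples removed) can be modeled without a server. This is an
# illustrative in-memory sketch, not RedisTimeSeries internals.

```python
def ts_del(samples, ts_from, ts_to):
    """Remove samples whose timestamp lies in [ts_from, ts_to]
    (inclusive, like TS.DEL) and return how many were removed."""
    kept = [(ts, val) for ts, val in samples
            if not (ts_from <= ts <= ts_to)]
    removed = len(samples) - len(kept)
    samples[:] = kept  # mutate in place, like deleting from the series
    return removed


series = [(ts, 1) for ts in range(101)]  # timestamps 0..100
print(ts_del(series, 50, 60))            # 11, as asserted above
print(len(series))                       # 90 samples remain
```

# A reversed range (from > to) matches nothing, which is why
# test_bad_del expects "ts.del test_key 200 100" to return 0.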


# Exceptions.py
# (gbc1858/SPE-image-handler, MIT)
class CurrentValueError(BaseException):
    def __init__(self, message):
        super().__init__(message)


class DenoiseBGError(BaseException):
    def __init__(self, message):
        super().__init__(message)


class DenoiseCroppedError(BaseException):
    def __init__(self, message):
        super().__init__(message)


class DenoiseFrameAfterContourError(BaseException):
    def __init__(self, message):
        super().__init__(message)


class RmsMethodError(BaseException):
    def __init__(self, message):
        super().__init__(message)
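
# A minimal, self-contained usage sketch. The denoise_background call
# site is invented for illustration (it is not part of this package),
# and one exception class is repeated here so the snippet runs on its own.

```python
class DenoiseBGError(BaseException):
    def __init__(self, message):
        super().__init__(message)


def denoise_background(frame):
    # Hypothetical validation: raise when no frame is available.
    if frame is None:
        raise DenoiseBGError("no frame supplied for background denoising")
    return frame


try:
    denoise_background(None)
except DenoiseBGError as err:
    print(err)  # prints: no frame supplied for background denoising
```

# Note that these classes derive from BaseException, so a broad
# "except Exception:" handler will not catch them; subclassing Exception
# is the more conventional choice unless that behavior is intended.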
3105014dfa362d6d69fc6e7e585cf2ea079414d2 | 1,932 | py | Python | web/ofd/filters.py | MarconiMediaGroup/TravelHub | e09940e83046540a6522aac50af9bd16b1df1865 | ["Apache-2.0"] | 1 | 2015-02-06T19:44:44.000Z | 2015-02-06T19:44:44.000Z
from django.contrib import admin
from ofd.models import *
class AirportHasIcaoFilter(admin.SimpleListFilter):
title = u'ICAO Code'
parameter_name = 'check_icao'
def lookups(self, request, model_admin):
return (
('Yes', 'Assigned'),
('No', 'Unassigned'),
)
    def queryset(self, request, queryset):
if self.value() == 'No':
return queryset.filter(icao__exact='')
if self.value() == 'Yes':
return queryset.exclude(icao__exact='')
class AirportHasIataFaaFilter(admin.SimpleListFilter):
title = u'IATA/FAA Code'
parameter_name = 'check_iata_faa'
def lookups(self, request, model_admin):
return (
('Yes', 'Assigned'),
('No', 'Unassigned'),
)
    def queryset(self, request, queryset):
if self.value() == 'No':
return queryset.filter(iata_faa__exact='')
if self.value() == 'Yes':
return queryset.exclude(iata_faa__exact='')
class AirlineHasIcaoFilter(admin.SimpleListFilter):
title = u'ICAO Code'
parameter_name = 'check_icao'
def lookups(self, request, model_admin):
return (
('Yes', 'Assigned'),
('No', 'Unassigned'),
)
    def queryset(self, request, queryset):
if self.value() == 'No':
return queryset.filter(icao__exact='')
if self.value() == 'Yes':
return queryset.exclude(icao__exact='')
class AirlineHasIataFaaFilter(admin.SimpleListFilter):
title = u'IATA Code'
parameter_name = 'check_iata_faa'
def lookups(self, request, model_admin):
return (
('Yes', 'Assigned'),
('No', 'Unassigned'),
)
    def queryset(self, request, queryset):
if self.value() == 'No':
return queryset.filter(iata__exact='')
if self.value() == 'Yes':
return queryset.exclude(iata__exact='')
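The four filters above all follow the same `SimpleListFilter` contract: `lookups()` declares the choice labels, and `queryset()` narrows results with `filter(field__exact='')` or `exclude(field__exact='')` depending on `self.value()`. A minimal Django-free stand-in (the `FakeQuerySet` class is hypothetical, for illustration only) shows the filter/exclude semantics being relied on:

```python
# Hypothetical stand-in for a Django queryset, illustrating the
# filter/exclude calls used by the admin filters above. Not Django code.
class FakeQuerySet:
    def __init__(self, rows):
        self.rows = rows

    def _match(self, kw):
        # Accept a single 'field__exact' keyword, as used in the filters above.
        (key, value), = kw.items()
        field = key.replace("__exact", "")
        return field, value

    def filter(self, **kw):
        field, value = self._match(kw)
        return FakeQuerySet([r for r in self.rows if r[field] == value])

    def exclude(self, **kw):
        field, value = self._match(kw)
        return FakeQuerySet([r for r in self.rows if r[field] != value])

airports = FakeQuerySet([{"icao": ""}, {"icao": "KLAX"}])
unassigned = airports.filter(icao__exact="").rows   # like value() == 'No'
assigned = airports.exclude(icao__exact="").rows    # like value() == 'Yes'
```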
312ffea366c7d1f3c3bf0dce549a6daad0b2d288 | 45,436 | py | Python | supervisor/tests/test_dispatchers.py | blueyed/supervisor | 0bfed5f2a7442a88103da7e9a92dcd018e8e15fc | ["ZPL-2.1"] | 2 | 2017-09-17T21:24:44.000Z | 2019-08-26T03:02:43.000Z
import unittest
import os
import sys
from supervisor.tests.base import DummyOptions
from supervisor.tests.base import DummyProcess
from supervisor.tests.base import DummyPConfig
from supervisor.tests.base import DummyLogger
from supervisor.tests.base import DummyEvent
class POutputDispatcherTests(unittest.TestCase):
def setUp(self):
from supervisor.events import clear
clear()
def tearDown(self):
from supervisor.events import clear
clear()
def _getTargetClass(self):
from supervisor.dispatchers import POutputDispatcher
return POutputDispatcher
def _makeOne(self, process, channel='stdout'):
from supervisor import events
events = {'stdout': events.ProcessCommunicationStdoutEvent,
'stderr': events.ProcessCommunicationStderrEvent}
# dispatcher derives its channel from event class
return self._getTargetClass()(process, events[channel], 0)
def test_writable(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.writable(), False)
def test_readable_open(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.closed = False
self.assertEqual(dispatcher.readable(), True)
def test_readable_closed(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.closed = True
self.assertEqual(dispatcher.readable(), False)
def test_handle_write_event(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertRaises(NotImplementedError, dispatcher.handle_write_event)
def test_handle_read_event(self):
options = DummyOptions()
options.readfd_result = 'abc'
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_capture_maxbytes=100)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.handle_read_event(), None)
self.assertEqual(dispatcher.output_buffer, 'abc')
def test_handle_error(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
try:
raise ValueError('foo')
except:
dispatcher.handle_error()
result = options.logger.data[0]
self.assertTrue(result.startswith(
'uncaptured python exception, closing channel'),result)
def test_toggle_capturemode_sends_event(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo',
stdout_capture_maxbytes=500)
process = DummyProcess(config)
process.pid = 4000
dispatcher = self._makeOne(process)
dispatcher.capturemode = True
dispatcher.capturelog.getvalue = lambda: 'hallooo'
L = []
def doit(event):
L.append(event)
from supervisor import events
events.subscribe(events.EventTypes.PROCESS_COMMUNICATION, doit)
dispatcher.toggle_capturemode()
self.assertEqual(len(L), 1)
event = L[0]
self.assertEqual(event.process, process)
self.assertEqual(event.pid, 4000)
self.assertEqual(event.data, 'hallooo')
def test_removelogs(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.removelogs()
self.assertEqual(dispatcher.mainlog.handlers[0].reopened, True)
self.assertEqual(dispatcher.mainlog.handlers[0].removed, True)
self.assertEqual(dispatcher.childlog.handlers[0].reopened, True)
self.assertEqual(dispatcher.childlog.handlers[0].removed, True)
def test_reopenlogs(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.reopenlogs()
self.assertEqual(dispatcher.childlog.handlers[0].reopened, True)
self.assertEqual(dispatcher.mainlog.handlers[0].reopened, True)
def test_record_output_log_non_capturemode(self):
# stdout/stderr goes to the process log and the main log,
# in non-capturemode, the data length doesn't matter
options = DummyOptions()
from supervisor import loggers
options.loglevel = loggers.LevelsByName.TRAC
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.output_buffer = 'a'
dispatcher.record_output()
self.assertEqual(dispatcher.childlog.data, ['a'])
self.assertEqual(options.logger.data[0],
"'process1' stdout output:\na")
self.assertEqual(dispatcher.output_buffer, '')
def test_record_output_emits_stdout_event_when_enabled(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_events_enabled=True)
process = DummyProcess(config)
dispatcher = self._makeOne(process, 'stdout')
dispatcher.output_buffer = 'hello from stdout'
L = []
def doit(event):
L.append(event)
from supervisor import events
events.subscribe(events.EventTypes.PROCESS_LOG_STDOUT, doit)
dispatcher.record_output()
self.assertEqual(len(L), 1)
event = L[0]
self.assertEqual(event.process, process)
self.assertEqual(event.data, 'hello from stdout')
def test_record_output_does_not_emit_stdout_event_when_disabled(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_events_enabled=False)
process = DummyProcess(config)
dispatcher = self._makeOne(process, 'stdout')
dispatcher.output_buffer = 'hello from stdout'
L = []
def doit(event):
L.append(event)
from supervisor import events
events.subscribe(events.EventTypes.PROCESS_LOG_STDOUT, doit)
dispatcher.record_output()
self.assertEqual(len(L), 0)
def test_record_output_emits_stderr_event_when_enabled(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stderr_events_enabled=True)
process = DummyProcess(config)
dispatcher = self._makeOne(process, 'stderr')
dispatcher.output_buffer = 'hello from stderr'
L = []
def doit(event):
L.append(event)
from supervisor import events
events.subscribe(events.EventTypes.PROCESS_LOG_STDERR, doit)
dispatcher.record_output()
self.assertEqual(len(L), 1)
event = L[0]
self.assertEqual(event.process, process)
self.assertEqual(event.data, 'hello from stderr')
def test_record_output_does_not_emit_stderr_event_when_disabled(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stderr_events_enabled=False)
process = DummyProcess(config)
dispatcher = self._makeOne(process, 'stderr')
dispatcher.output_buffer = 'hello from stderr'
L = []
def doit(event):
L.append(event)
from supervisor import events
events.subscribe(events.EventTypes.PROCESS_LOG_STDERR, doit)
dispatcher.record_output()
self.assertEqual(len(L), 0)
def test_record_output_capturemode_string_longer_than_token(self):
# stdout/stderr goes to the process log and the main log,
# in capturemode, the length of the data needs to be longer
# than the capture token to make it out.
options = DummyOptions()
from supervisor import loggers
options.loglevel = loggers.LevelsByName.TRAC
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo',
stdout_capture_maxbytes=100)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.output_buffer = 'stdout string longer than a token'
dispatcher.record_output()
self.assertEqual(dispatcher.childlog.data,
['stdout string longer than a token'])
self.assertEqual(options.logger.data[0],
"'process1' stdout output:\nstdout string longer than a token")
def test_record_output_capturemode_string_not_longer_than_token(self):
# stdout/stderr goes to the process log and the main log,
# in capturemode, the length of the data needs to be longer
# than the capture token to make it out.
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo',
stdout_capture_maxbytes=100)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.output_buffer = 'a'
dispatcher.record_output()
self.assertEqual(dispatcher.childlog.data, [])
self.assertEqual(dispatcher.output_buffer, 'a')
def test_stdout_capturemode_single_buffer(self):
# mike reported that comm events that took place within a single
# output buffer were broken 8/20/2007
from supervisor.events import ProcessCommunicationEvent
from supervisor.events import subscribe
events = []
def doit(event):
events.append(event)
subscribe(ProcessCommunicationEvent, doit)
BEGIN_TOKEN = ProcessCommunicationEvent.BEGIN_TOKEN
END_TOKEN = ProcessCommunicationEvent.END_TOKEN
data = BEGIN_TOKEN + 'hello' + END_TOKEN
options = DummyOptions()
from supervisor.loggers import getLogger
options.getLogger = getLogger # actually use real logger
logfile = '/tmp/log'
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile=logfile,
stdout_capture_maxbytes=1000)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
try:
dispatcher.output_buffer = data
dispatcher.record_output()
self.assertEqual(open(logfile, 'r').read(), '')
self.assertEqual(dispatcher.output_buffer, '')
self.assertEqual(len(events), 1)
event = events[0]
from supervisor.events import ProcessCommunicationStdoutEvent
self.assertEqual(event.__class__, ProcessCommunicationStdoutEvent)
self.assertEqual(event.process, process)
self.assertEqual(event.channel, 'stdout')
self.assertEqual(event.data, 'hello')
finally:
try:
os.remove(logfile)
except (OSError, IOError):
pass
def test_stdout_capturemode_multiple_buffers(self):
from supervisor.events import ProcessCommunicationEvent
from supervisor.events import subscribe
events = []
def doit(event):
events.append(event)
subscribe(ProcessCommunicationEvent, doit)
import string
# ascii_letters for python 3
letters = getattr(string, "letters", string.ascii_letters)
digits = string.digits * 4
BEGIN_TOKEN = ProcessCommunicationEvent.BEGIN_TOKEN
END_TOKEN = ProcessCommunicationEvent.END_TOKEN
data = (letters + BEGIN_TOKEN + digits + END_TOKEN + letters)
# boundaries that split tokens
broken = data.split(':')
first = broken[0] + ':'
second = broken[1] + ':'
third = broken[2]
options = DummyOptions()
from supervisor.loggers import getLogger
options.getLogger = getLogger # actually use real logger
logfile = '/tmp/log'
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile=logfile,
stdout_capture_maxbytes=10000)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
try:
dispatcher.output_buffer = first
dispatcher.record_output()
[ x.flush() for x in dispatcher.childlog.handlers]
self.assertEqual(open(logfile, 'r').read(), letters)
self.assertEqual(dispatcher.output_buffer, first[len(letters):])
self.assertEqual(len(events), 0)
dispatcher.output_buffer += second
dispatcher.record_output()
self.assertEqual(len(events), 0)
[ x.flush() for x in dispatcher.childlog.handlers]
self.assertEqual(open(logfile, 'r').read(), letters)
self.assertEqual(dispatcher.output_buffer, first[len(letters):])
self.assertEqual(len(events), 0)
dispatcher.output_buffer += third
dispatcher.record_output()
[ x.flush() for x in dispatcher.childlog.handlers]
            self.assertEqual(open(logfile, 'r').read(), letters * 2)
self.assertEqual(len(events), 1)
event = events[0]
from supervisor.events import ProcessCommunicationStdoutEvent
self.assertEqual(event.__class__, ProcessCommunicationStdoutEvent)
self.assertEqual(event.process, process)
self.assertEqual(event.channel, 'stdout')
self.assertEqual(event.data, digits)
finally:
try:
os.remove(logfile)
except (OSError, IOError):
pass
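The two capture-mode tests above exercise how a dispatcher frames process-communication events: bytes between a begin token and an end token become event data, bytes before the frame go to the child log, and an incomplete frame is held back in the buffer until more output arrives. A simplified sketch of that framing logic (using placeholder tokens, not supervisor's actual token values or implementation):

```python
def split_captured(buffer, begin, end):
    """Return (logged, event_data, remainder) for at most one framed message.

    logged     -- text before the begin token (goes to the child log)
    event_data -- text between the tokens, or None if no complete frame
    remainder  -- unconsumed text to keep in the buffer
    """
    b = buffer.find(begin)
    if b == -1:
        return buffer, None, ""
    e = buffer.find(end, b + len(begin))
    if e == -1:
        # Incomplete frame: log the prefix, hold back the partial frame.
        return buffer[:b], None, buffer[b:]
    data = buffer[b + len(begin):e]
    return buffer[:b], data, buffer[e + len(end):]

logged, data, rest = split_captured("abcBEGINhelloENDxyz", "BEGIN", "END")
```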
def test_strip_ansi(self):
options = DummyOptions()
options.strip_ansi = True
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
ansi = '\x1b[34mHello world... this is longer than a token!\x1b[0m'
noansi = 'Hello world... this is longer than a token!'
dispatcher.output_buffer = ansi
dispatcher.record_output()
self.assertEqual(len(dispatcher.childlog.data), 1)
self.assertEqual(dispatcher.childlog.data[0], noansi)
options.strip_ansi = False
dispatcher.output_buffer = ansi
dispatcher.record_output()
self.assertEqual(len(dispatcher.childlog.data), 2)
self.assertEqual(dispatcher.childlog.data[1], ansi)
def test_ctor_nologfiles(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.process, process)
self.assertEqual(dispatcher.channel, 'stdout')
self.assertEqual(dispatcher.fd, 0)
self.assertEqual(dispatcher.capturelog, None)
self.assertEqual(dispatcher.mainlog, None)
self.assertEqual(dispatcher.childlog, None)
def test_ctor_logfile_only(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.process, process)
self.assertEqual(dispatcher.channel, 'stdout')
self.assertEqual(dispatcher.fd, 0)
self.assertEqual(dispatcher.capturelog, None)
self.assertEqual(dispatcher.mainlog.__class__, DummyLogger)
self.assertEqual(dispatcher.childlog, dispatcher.mainlog)
def test_ctor_capturelog_only(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_capture_maxbytes=300)
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.process, process)
self.assertEqual(dispatcher.channel, 'stdout')
self.assertEqual(dispatcher.fd, 0)
self.assertEqual(dispatcher.capturelog.__class__,DummyLogger)
self.assertEqual(dispatcher.mainlog, None)
self.assertEqual(dispatcher.childlog, None)
def test_ctor_nologs(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.process, process)
self.assertEqual(dispatcher.channel, 'stdout')
self.assertEqual(dispatcher.fd, 0)
self.assertEqual(dispatcher.capturelog, None)
self.assertEqual(dispatcher.mainlog, None)
self.assertEqual(dispatcher.childlog, None)
def test_repr(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
drepr = repr(dispatcher)
self.assertTrue(drepr.startswith('<POutputDispatcher at'), drepr)
self.assertNotEqual(
drepr.find('<supervisor.tests.base.DummyProcess instance at'),
-1)
self.assertTrue(drepr.endswith('(stdout)>'), drepr)
def test_close(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.close()
self.assertEqual(dispatcher.closed, True)
dispatcher.close() # make sure we don't error if we try to close twice
self.assertEqual(dispatcher.closed, True)
class PInputDispatcherTests(unittest.TestCase):
def _getTargetClass(self):
from supervisor.dispatchers import PInputDispatcher
return PInputDispatcher
def _makeOne(self, process):
channel = 'stdin'
return self._getTargetClass()(process, channel, 0)
def test_writable_open_nodata(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = 'a'
dispatcher.closed = False
self.assertEqual(dispatcher.writable(), True)
def test_writable_open_withdata(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = ''
dispatcher.closed = False
self.assertEqual(dispatcher.writable(), False)
def test_writable_closed_nodata(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = 'a'
dispatcher.closed = True
self.assertEqual(dispatcher.writable(), False)
def test_writable_closed_withdata(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = ''
dispatcher.closed = True
self.assertEqual(dispatcher.writable(), False)
def test_readable(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.readable(), False)
def test_handle_write_event(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = 'halloooo'
self.assertEqual(dispatcher.handle_write_event(), None)
self.assertEqual(options.written[0], 'halloooo')
def test_handle_write_event_nodata(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.input_buffer, '')
        dispatcher.handle_write_event()
self.assertEqual(dispatcher.input_buffer, '')
self.assertEqual(options.written, {})
def test_handle_write_event_epipe_raised(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = 'halloooo'
import errno
options.write_error = errno.EPIPE
dispatcher.handle_write_event()
self.assertEqual(dispatcher.input_buffer, '')
self.assertTrue(options.logger.data[0].startswith(
'fd 0 closed, stopped monitoring'))
self.assertTrue(options.logger.data[0].endswith('(stdin)>'))
def test_handle_write_event_uncaught_raised(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.input_buffer = 'halloooo'
import errno
options.write_error = errno.EBADF
self.assertRaises(OSError, dispatcher.handle_write_event)
def test_handle_write_event_over_os_limit(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
options.write_accept = 1
dispatcher.input_buffer = 'a' * 50
dispatcher.handle_write_event()
self.assertEqual(len(dispatcher.input_buffer), 49)
self.assertEqual(options.written[0], 'a')
def test_handle_read_event(self):
process = DummyProcess(None)
dispatcher = self._makeOne(process)
self.assertRaises(NotImplementedError, dispatcher.handle_read_event)
def test_handle_error(self):
options = DummyOptions()
config = DummyPConfig(options, 'test', '/test')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
try:
raise ValueError('foo')
except:
dispatcher.handle_error()
result = options.logger.data[0]
self.assertTrue(result.startswith(
'uncaptured python exception, closing channel'),result)
def test_repr(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
drepr = repr(dispatcher)
self.assertTrue(drepr.startswith('<PInputDispatcher at'), drepr)
self.assertNotEqual(
drepr.find('<supervisor.tests.base.DummyProcess instance at'),
-1)
self.assertTrue(drepr.endswith('(stdin)>'), drepr)
def test_close(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.close()
self.assertEqual(dispatcher.closed, True)
dispatcher.close() # make sure we don't error if we try to close twice
self.assertEqual(dispatcher.closed, True)
class PEventListenerDispatcherTests(unittest.TestCase):
def setUp(self):
from supervisor.events import clear
clear()
def tearDown(self):
from supervisor.events import clear
clear()
def _getTargetClass(self):
from supervisor.dispatchers import PEventListenerDispatcher
return PEventListenerDispatcher
def _makeOne(self, process):
channel = 'stdout'
return self._getTargetClass()(process, channel, 0)
def test_writable(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.writable(), False)
def test_readable_open(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.closed = False
self.assertEqual(dispatcher.readable(), True)
def test_readable_closed(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
dispatcher.closed = True
self.assertEqual(dispatcher.readable(), False)
def test_handle_write_event(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertRaises(NotImplementedError, dispatcher.handle_write_event)
def test_handle_read_event_calls_handle_listener_state_change(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
process.listener_state = EventListenerStates.ACKNOWLEDGED
dispatcher = self._makeOne(process)
options.readfd_result = dispatcher.READY_FOR_EVENTS_TOKEN
self.assertEqual(dispatcher.handle_read_event(), None)
self.assertEqual(process.listener_state, EventListenerStates.READY)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(len(dispatcher.childlog.data), 1)
self.assertEqual(dispatcher.childlog.data[0],
dispatcher.READY_FOR_EVENTS_TOKEN)
def test_handle_read_event_nodata(self):
options = DummyOptions()
options.readfd_result = ''
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.handle_read_event(), None)
self.assertEqual(dispatcher.state_buffer, '')
from supervisor.dispatchers import EventListenerStates
self.assertEqual(dispatcher.process.listener_state,
EventListenerStates.ACKNOWLEDGED)
def test_handle_read_event_logging_nologs(self):
options = DummyOptions()
options.readfd_result = 'supercalifragilisticexpialidocious'
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
        # just make sure there are no errors if a child logger doesn't
        # exist
self.assertEqual(dispatcher.handle_read_event(), None)
self.assertEqual(dispatcher.childlog, None)
def test_handle_read_event_logging_childlog(self):
options = DummyOptions()
options.readfd_result = 'supercalifragilisticexpialidocious'
config = DummyPConfig(options, 'process1', '/bin/process1',
stdout_logfile='/tmp/foo')
process = DummyProcess(config)
dispatcher = self._makeOne(process)
self.assertEqual(dispatcher.handle_read_event(), None)
self.assertEqual(len(dispatcher.childlog.data), 1)
self.assertEqual(dispatcher.childlog.data[0],
'supercalifragilisticexpialidocious')
def test_handle_listener_state_change_from_unknown(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.UNKNOWN
dispatcher.state_buffer = 'whatever'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(options.logger.data, [])
self.assertEqual(process.listener_state, EventListenerStates.UNKNOWN)
def test_handle_listener_state_change_acknowledged_to_ready(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.ACKNOWLEDGED
dispatcher.state_buffer = 'READY\n'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(options.logger.data[0],
'process1: ACKNOWLEDGED -> READY')
self.assertEqual(process.listener_state, EventListenerStates.READY)
def test_handle_listener_state_change_acknowledged_gobbles(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.ACKNOWLEDGED
dispatcher.state_buffer = 'READY\ngarbage\n'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(options.logger.data[0],
'process1: ACKNOWLEDGED -> READY')
self.assertEqual(options.logger.data[1],
'process1: READY -> UNKNOWN')
self.assertEqual(process.listener_state, EventListenerStates.UNKNOWN)
def test_handle_listener_state_change_acknowledged_to_insufficient(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.ACKNOWLEDGED
dispatcher.state_buffer = 'RE'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, 'RE')
self.assertEqual(options.logger.data, [])
self.assertEqual(process.listener_state,
EventListenerStates.ACKNOWLEDGED)
def test_handle_listener_state_change_acknowledged_to_unknown(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.ACKNOWLEDGED
dispatcher.state_buffer = 'bogus data yo'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(options.logger.data[0],
'process1: ACKNOWLEDGED -> UNKNOWN')
self.assertEqual(process.listener_state, EventListenerStates.UNKNOWN)
def test_handle_listener_state_change_ready_to_unknown(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.READY
dispatcher.state_buffer = 'bogus data yo'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, '')
self.assertEqual(options.logger.data[0],
'process1: READY -> UNKNOWN')
self.assertEqual(process.listener_state, EventListenerStates.UNKNOWN)
def test_handle_listener_state_change_busy_to_insufficient(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.BUSY
dispatcher.state_buffer = 'bogus data yo'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, 'bogus data yo')
self.assertEqual(process.listener_state, EventListenerStates.BUSY)
def test_handle_listener_state_change_busy_to_acknowledged_procd(self):
options = DummyOptions()
config = DummyPConfig(options, 'process1', '/bin/process1')
process = DummyProcess(config)
from supervisor.dispatchers import EventListenerStates
dispatcher = self._makeOne(process)
process.listener_state = EventListenerStates.BUSY
class Dummy:
pass
process.group = Dummy()
process.group.config = Dummy()
from supervisor.dispatchers import default_handler
process.group.config.result_handler = default_handler
dispatcher.state_buffer = 'RESULT 2\nOKabc'
self.assertEqual(dispatcher.handle_listener_state_change(), None)
self.assertEqual(dispatcher.state_buffer, 'abc')
self.assertEqual(options.logger.data[0],
'process1: BUSY -> ACKNOWLEDGED (processed)')
self.assertEqual(process.listener_state,
EventListenerStates.ACKNOWLEDGED)
    def test_handle_listener_state_change_busy_to_acknowledged_rejected(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        process.listener_state = EventListenerStates.BUSY
        class Dummy:
            pass
        process.group = Dummy()
        process.group.config = Dummy()
        from supervisor.dispatchers import default_handler
        process.group.config.result_handler = default_handler
        dispatcher.state_buffer = 'RESULT 4\nFAILabc'
        self.assertEqual(dispatcher.handle_listener_state_change(), None)
        self.assertEqual(dispatcher.state_buffer, 'abc')
        self.assertEqual(options.logger.data[0],
                         'process1: BUSY -> ACKNOWLEDGED (rejected)')
        self.assertEqual(process.listener_state,
                         EventListenerStates.ACKNOWLEDGED)

    def test_handle_listener_state_change_busy_to_unknown(self):
        from supervisor.events import EventRejectedEvent
        from supervisor.events import subscribe
        events = []
        def doit(event):
            events.append(event)
        subscribe(EventRejectedEvent, doit)
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        process.listener_state = EventListenerStates.BUSY
        current_event = DummyEvent()
        process.event = current_event
        dispatcher.state_buffer = 'bogus data\n'
        self.assertEqual(dispatcher.handle_listener_state_change(), None)
        self.assertEqual(dispatcher.state_buffer, '')
        self.assertEqual(options.logger.data[0],
                         'process1: BUSY -> UNKNOWN (bad result line \'bogus data\')')
        self.assertEqual(process.listener_state,
                         EventListenerStates.UNKNOWN)
        self.assertEqual(events[0].process, process)
        self.assertEqual(events[0].event, current_event)
    def test_handle_listener_state_busy_gobbles(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        process.listener_state = EventListenerStates.BUSY
        class Dummy:
            pass
        process.group = Dummy()
        process.group.config = Dummy()
        from supervisor.dispatchers import default_handler
        process.group.config.result_handler = default_handler
        dispatcher.state_buffer = 'RESULT 2\nOKbogus data\n'
        self.assertEqual(dispatcher.handle_listener_state_change(), None)
        self.assertEqual(dispatcher.state_buffer, '')
        self.assertEqual(options.logger.data[0],
                         'process1: BUSY -> ACKNOWLEDGED (processed)')
        self.assertEqual(options.logger.data[1],
                         'process1: ACKNOWLEDGED -> UNKNOWN')
        self.assertEqual(process.listener_state,
                         EventListenerStates.UNKNOWN)

    def test_handle_result_accept(self):
        from supervisor.events import subscribe
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        L = []
        def doit(event):
            L.append(event)
        from supervisor import events
        subscribe(events.EventRejectedEvent, doit)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        def handle(event, result):
            pass
        class Dummy:
            pass
        process.group = Dummy()
        process.group.config = Dummy()
        process.group.config.result_handler = handle
        process.listener_state = EventListenerStates.BUSY
        dispatcher.handle_result('foo')
        self.assertEqual(len(L), 0)
        self.assertEqual(process.listener_state,
                         EventListenerStates.ACKNOWLEDGED)
        result = options.logger.data[0]
        self.assertTrue(result.endswith('BUSY -> ACKNOWLEDGED (processed)'))
    def test_handle_result_rejectevent(self):
        from supervisor.events import subscribe
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        L = []
        def doit(event):
            L.append(event)
        from supervisor import events
        subscribe(events.EventRejectedEvent, doit)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        def rejected(event, result):
            from supervisor.dispatchers import RejectEvent
            raise RejectEvent(result)
        class Dummy:
            pass
        process.group = Dummy()
        process.group.config = Dummy()
        process.group.config.result_handler = rejected
        process.listener_state = EventListenerStates.BUSY
        dispatcher.handle_result('foo')
        self.assertEqual(len(L), 1)
        self.assertEqual(L[0].__class__, events.EventRejectedEvent)
        self.assertEqual(process.listener_state,
                         EventListenerStates.ACKNOWLEDGED)
        result = options.logger.data[0]
        self.assertTrue(result.endswith('BUSY -> ACKNOWLEDGED (rejected)'))

    def test_handle_result_exception(self):
        from supervisor.events import subscribe
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        L = []
        def doit(event):
            L.append(event)
        from supervisor import events
        subscribe(events.EventRejectedEvent, doit)
        from supervisor.dispatchers import EventListenerStates
        dispatcher = self._makeOne(process)
        def exception(event, result):
            raise ValueError
        class Dummy:
            pass
        process.group = Dummy()
        process.group.config = Dummy()
        process.group.config.result_handler = exception
        process.group.result_handler = exception
        process.listener_state = EventListenerStates.BUSY
        dispatcher.handle_result('foo')
        self.assertEqual(len(L), 1)
        self.assertEqual(L[0].__class__, events.EventRejectedEvent)
        self.assertEqual(process.listener_state,
                         EventListenerStates.UNKNOWN)
        result = options.logger.data[0]
        self.assertTrue(result.endswith('BUSY -> UNKNOWN'))
    def test_handle_error(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'test', '/test')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        try:
            raise ValueError('foo')
        except:
            dispatcher.handle_error()
        result = options.logger.data[0]
        self.assertTrue(result.startswith(
            'uncaptured python exception, closing channel'), result)

    def test_removelogs(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1',
                              stdout_logfile='/tmp/foo')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        dispatcher.removelogs()
        self.assertEqual(dispatcher.childlog.handlers[0].reopened, True)
        self.assertEqual(dispatcher.childlog.handlers[0].removed, True)

    def test_reopenlogs(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1',
                              stdout_logfile='/tmp/foo')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        dispatcher.reopenlogs()
        self.assertEqual(dispatcher.childlog.handlers[0].reopened, True)

    def test_strip_ansi(self):
        options = DummyOptions()
        options.strip_ansi = True
        config = DummyPConfig(options, 'process1', '/bin/process1',
                              stdout_logfile='/tmp/foo')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        ansi = '\x1b[34mHello world... this is longer than a token!\x1b[0m'
        noansi = 'Hello world... this is longer than a token!'
        options.readfd_result = ansi
        dispatcher.handle_read_event()
        self.assertEqual(len(dispatcher.childlog.data), 1)
        self.assertEqual(dispatcher.childlog.data[0], noansi)
        options.strip_ansi = False
        options.readfd_result = ansi
        dispatcher.handle_read_event()
        self.assertEqual(len(dispatcher.childlog.data), 2)
        self.assertEqual(dispatcher.childlog.data[1], ansi)
    def test_ctor_nologfiles(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        self.assertEqual(dispatcher.process, process)
        self.assertEqual(dispatcher.channel, 'stdout')
        self.assertEqual(dispatcher.fd, 0)
        self.assertEqual(dispatcher.childlog, None)

    def test_ctor_logfile_only(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1',
                              stdout_logfile='/tmp/foo')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        self.assertEqual(dispatcher.process, process)
        self.assertEqual(dispatcher.channel, 'stdout')
        self.assertEqual(dispatcher.fd, 0)
        self.assertEqual(dispatcher.childlog.__class__, DummyLogger)

    def test_repr(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        drepr = repr(dispatcher)
        self.assertTrue(drepr.startswith('<PEventListenerDispatcher at'), drepr)
        self.assertNotEqual(
            drepr.find('<supervisor.tests.base.DummyProcess instance at'),
            -1)
        self.assertTrue(drepr.endswith('(stdout)>'), drepr)

    def test_close(self):
        options = DummyOptions()
        config = DummyPConfig(options, 'process1', '/bin/process1')
        process = DummyProcess(config)
        dispatcher = self._makeOne(process)
        dispatcher.close()
        self.assertEqual(dispatcher.closed, True)
        dispatcher.close()  # make sure we don't error if we try to close twice
        self.assertEqual(dispatcher.closed, True)


def test_suite():
    return unittest.findTestCases(sys.modules[__name__])


if __name__ == '__main__':
    unittest.main(defaultTest='test_suite')
| 42.74318 | 80 | 0.655339 | 4,385 | 45,436 | 6.642417 | 0.065222 | 0.094242 | 0.092698 | 0.06633 | 0.914581 | 0.882034 | 0.859821 | 0.845779 | 0.834586 | 0.813815 | 0 | 0.007782 | 0.250528 | 45,436 | 1,062 | 81 | 42.783428 | 0.847561 | 0.01939 | 0 | 0.81509 | 0 | 0 | 0.070598 | 0.00521 | 0 | 0 | 0 | 0 | 0.216791 | 1 | 0.099894 | false | 0.009564 | 0.065887 | 0.001063 | 0.182784 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
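The `state_buffer` tests above exercise supervisor's event-listener result protocol: a listener replies with `RESULT <len>\n` followed by `<len>` bytes of body, and anything past the body stays in the buffer (so `'RESULT 2\nOKabc'` yields body `'OK'` and leftover `'abc'`). A minimal sketch of that framing; `parse_result_line` is a hypothetical helper, not supervisor's API:

```python
def parse_result_line(state_buffer):
    """Split a listener reply of the form 'RESULT <len>\\n<body>'.

    Returns (body, remainder), or None if the body is not fully
    buffered yet; raises ValueError on a malformed result line.
    """
    line, sep, rest = state_buffer.partition('\n')
    if not sep or not line.startswith('RESULT '):
        raise ValueError('bad result line %r' % line)
    length = int(line.split(' ', 1)[1])
    if len(rest) < length:
        return None  # wait for more data
    return rest[:length], rest[length:]

print(parse_result_line('RESULT 2\nOKabc'))    # ('OK', 'abc')
print(parse_result_line('RESULT 4\nFAILabc'))  # ('FAIL', 'abc')
```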
3145faf8a7494290d0463eb6d90ee53dfb6c0d9b | 187 | py | Python | esl/economics/commodity.py | fagan2888/ESL | 24ffa903e8c5b9e725eed9861623d4b6a4a205a2 | [
"Apache-2.0"
] | 1 | 2020-04-17T18:18:08.000Z | 2020-04-17T18:18:08.000Z | esl/economics/commodity.py | fagan2888/ESL | 24ffa903e8c5b9e725eed9861623d4b6a4a205a2 | [
"Apache-2.0"
] | null | null | null | esl/economics/commodity.py | fagan2888/ESL | 24ffa903e8c5b9e725eed9861623d4b6a4a205a2 | [
"Apache-2.0"
] | null | null | null | from esl.economics.asset import Asset
from esl.economics.fungibility import Fungible
from esl.economics.tangibility import Tangible
class Commodity(Asset, Fungible, Tangible):
    pass
| 23.375 | 46 | 0.818182 | 24 | 187 | 6.375 | 0.5 | 0.137255 | 0.313725 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122995 | 187 | 7 | 47 | 26.714286 | 0.932927 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.6 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
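`Commodity` above composes three bases via multiple inheritance. A self-contained sketch of the same pattern with stand-in base classes (the real `Asset`, `Fungible`, and `Tangible` live in the `esl` package and carry actual behaviour):

```python
class Asset:
    kind = 'asset'

class Fungible:
    fungible = True

class Tangible:
    tangible = True

class Commodity(Asset, Fungible, Tangible):
    """Inherits attributes from all three bases; MRO is left-to-right."""
    pass

c = Commodity()
print(c.kind, c.fungible, c.tangible)  # asset True True
```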
31618a23ca0858a8ce7902ba818300b498857e3c | 1,270 | py | Python | lists_multiple_manipulation_for_loops_pizza.py | julencosme/python-crash-course | 6b37d7346e235273c266110932207cd67ce4eb0e | [
"MIT"
] | null | null | null | lists_multiple_manipulation_for_loops_pizza.py | julencosme/python-crash-course | 6b37d7346e235273c266110932207cd67ce4eb0e | [
"MIT"
] | null | null | null | lists_multiple_manipulation_for_loops_pizza.py | julencosme/python-crash-course | 6b37d7346e235273c266110932207cd67ce4eb0e | [
"MIT"
] | null | null | null | favorite_pizzas = ['mushroom', 'tomato', 'artichoke', 'double-pepperoni']
friend_pizzas = favorite_pizzas[:]
print(favorite_pizzas)
print(friend_pizzas)
# Adding a new pizza to the original list.
favorite_pizzas = ['mushroom', 'tomato', 'artichoke', 'double-pepperoni']
friend_pizzas = favorite_pizzas[:]
print(favorite_pizzas)
print(friend_pizzas)
favorite_pizzas.append("jalepeno")
print(favorite_pizzas)
print(friend_pizzas)
# Adding a different pizza to the list friend_pizzas.
favorite_pizzas = ['mushroom', 'tomato', 'artichoke', 'double-pepperoni']
friend_pizzas = favorite_pizzas[:]
print(favorite_pizzas)
print(friend_pizzas)
favorite_pizzas.append("jalepeno")
print(favorite_pizzas)
print(friend_pizzas)
friend_pizzas.append("pineapple")
print(friend_pizzas)
print(favorite_pizzas)
print("My favorite pizzas are: ")
for pizza in favorite_pizzas[:]:
    print(pizza.title())

print("My friend's favorite pizzas are: ")
for pizza in friend_pizzas[:]:
    print(pizza.title())

# Two for loops to print each list of foods.
for pizza in friend_pizzas[:2]:
    print(pizza.title())
for pizza in favorite_pizzas[:3]:
    print(pizza.title())
for pizza in favorite_pizzas[:]:
    print(pizza.title())
for pizza in friend_pizzas[:]:
    print(pizza.title())
| 26.458333 | 73 | 0.752756 | 169 | 1,270 | 5.473373 | 0.201183 | 0.287568 | 0.225946 | 0.168649 | 0.838919 | 0.782703 | 0.745946 | 0.745946 | 0.508108 | 0.508108 | 0 | 0.001781 | 0.115748 | 1,270 | 47 | 74 | 27.021277 | 0.821906 | 0.108661 | 0 | 0.857143 | 0 | 0 | 0.176418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 9 |
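The script above relies on `favorite_pizzas[:]` producing an independent copy, so appends to one list never show up in the other. A small sketch of the difference between aliasing and slice-copying (variable names are illustrative):

```python
original = ['mushroom', 'tomato']

alias = original    # same list object: changes are shared
copy = original[:]  # new list object: changes stay separate

original.append('jalepeno')

print(alias)  # ['mushroom', 'tomato', 'jalepeno']
print(copy)   # ['mushroom', 'tomato']
```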
317272e3556fcbd2d5d180da70eccd7e9cb6914c | 119 | py | Python | src/prefect/tasks/jira/__init__.py | concreted/prefect | dd732f5990ee2b0f3d816adb285168fd63b239e4 | [
"Apache-2.0"
] | 8,633 | 2019-03-23T17:51:03.000Z | 2022-03-31T22:17:42.000Z | src/prefect/tasks/jira/__init__.py | concreted/prefect | dd732f5990ee2b0f3d816adb285168fd63b239e4 | [
"Apache-2.0"
] | 3,903 | 2019-03-23T19:11:21.000Z | 2022-03-31T23:21:23.000Z | src/prefect/tasks/jira/__init__.py | ngriffiths13/prefect | 7f5613abcb182494b7dc12159277c3bc5f3c9898 | [
"Apache-2.0"
] | 937 | 2019-03-23T18:49:44.000Z | 2022-03-31T21:45:13.000Z | from prefect.tasks.jira.jira_task import JiraTask
from prefect.tasks.jira.jira_service_desk import JiraServiceDeskTask
| 39.666667 | 68 | 0.882353 | 17 | 119 | 6 | 0.588235 | 0.215686 | 0.313725 | 0.392157 | 0.470588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067227 | 119 | 2 | 69 | 59.5 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
3182535c574f14cefc37274a0370aa2615de522f | 140 | py | Python | plotlyflask_app/routes.py | Quartz14/Stress_Detection | da5d3003bd662e6fcaf2f6933ff13aaaa7bc270a | [
"MIT"
] | null | null | null | plotlyflask_app/routes.py | Quartz14/Stress_Detection | da5d3003bd662e6fcaf2f6933ff13aaaa7bc270a | [
"MIT"
] | null | null | null | plotlyflask_app/routes.py | Quartz14/Stress_Detection | da5d3003bd662e6fcaf2f6933ff13aaaa7bc270a | [
"MIT"
] | 1 | 2021-06-15T10:56:13.000Z | 2021-06-15T10:56:13.000Z | from flask import render_template
from flask import current_app as app
@app.route('/')
def home():
    return render_template('home.html')
| 20 | 39 | 0.75 | 21 | 140 | 4.857143 | 0.619048 | 0.176471 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 140 | 6 | 40 | 23.333333 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
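In the route file above, `@app.route('/')` registers `home` in the app's URL map via a decorator factory. A stdlib-only sketch of how such a registry decorator works (a toy stand-in, not Flask's implementation):

```python
class TinyRouter:
    def __init__(self):
        self.routes = {}

    def route(self, path):
        # decorator factory: remembers the view function under its URL rule
        def register(view):
            self.routes[path] = view
            return view
        return register

app = TinyRouter()

@app.route('/')
def home():
    return 'home.html rendered'

print(app.routes['/']())  # home.html rendered
```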
31b64b9447533943176a66bb75c77b5f9ca03e77 | 2,547 | py | Python | game_of_life/initialization.py | Phamphy/game-of-life | 0f6abc9612878dc73b35b95d3f56d448778a4544 | [
"Apache-2.0"
] | null | null | null | game_of_life/initialization.py | Phamphy/game-of-life | 0f6abc9612878dc73b35b95d3f56d448778a4544 | [
"Apache-2.0"
] | null | null | null | game_of_life/initialization.py | Phamphy/game-of-life | 0f6abc9612878dc73b35b95d3f56d448778a4544 | [
"Apache-2.0"
] | null | null | null | from game_of_life2.generate_universe import *
seeds = {
    "boat": [[1, 1, 0], [1, 0, 1], [0, 1, 0]],
    "r_pentomino": [[0, 1, 1], [1, 1, 0], [0, 1, 0]],
    "beacon": [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]],
    "acorn": [[0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0], [1, 1, 0, 0, 1, 1, 1]],
    "block_switch_engine": [
        [0, 0, 0, 0, 0, 0, 1, 0],
        [0, 0, 0, 0, 1, 0, 1, 1],
        [0, 0, 0, 0, 1, 0, 1, 0],
        [0, 0, 0, 0, 1, 0, 0, 0],
        [0, 0, 1, 0, 0, 0, 0, 0],
        [1, 0, 1, 0, 0, 0, 0, 0],
    ],
    "infinite": [
        [1, 1, 1, 0, 1],
        [1, 0, 0, 0, 0],
        [0, 0, 0, 1, 1],
        [0, 1, 1, 0, 1],
        [1, 0, 1, 0, 1],
    ],
    "block": [[1,1],[1,1]],
    "beehive": [[0,1,1,0],[1,0,0,1],[0,1,1,0]],
    "loaf": [[0,1,1,0],[1,0,0,1],[0,1,0,1],[0,0,1,0]],
    "tub": [[0,1,0],[1,0,1],[0,1,0]],
    "blinker": [[1,1,1]],
    "toad": [[0,1,1,1],[1,1,1,0]],
    "pulsar": [
        [0,0,1,1,1,0,0,0,1,1,1,0,0],
        [0,0,0,0,0,0,0,0,0,0,0,0,0],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [0,0,1,1,1,0,0,0,1,1,1,0,0],
        [0,0,0,0,0,0,0,0,0,0,0,0,0],
        [0,0,1,1,1,0,0,0,1,1,1,0,0],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [1,0,0,0,0,1,0,1,0,0,0,0,1],
        [0,0,0,0,0,0,0,0,0,0,0,0,0],
        [0,0,1,1,1,0,0,0,1,1,1,0,0]
    ],
    "pentadecathlon": [
        [0,1,0],
        [0,1,0],
        [1,0,1],
        [0,1,0],
        [0,1,0],
        [0,1,0],
        [0,1,0],
        [1,0,1],
        [0,1,0],
        [0,1,0],
    ],
    "glider": [[1,0,0],[0,1,1],[1,1,0]],
    "lightweight_spaceship": [
        [0,1,1,0,0],
        [1,1,1,1,0],
        [1,1,0,1,1],
        [0,0,1,1,0]
    ]
}
# dictionary containing a few seeds


def initiate(seed_name, size):
    """Return a universe with the chosen seed placed at a random position."""
    if seed_name not in seeds.keys():
        raise ValueError("Le seed n'est pas dans le dictionnaire !")
    # check that the seed is in the dictionary
    seed = np.array(seeds[seed_name])
    if seed.shape[0] > size[0] or seed.shape[1] > size[1]:
        raise ValueError("Le seed est plus grand que l'univers !")
    # check that the seed is not larger than the universe
    else:
        universe = generate_universe(size)
        # generate an empty universe
        return add_seed_to_universe(seed, universe)
        # return the universe with the chosen seed placed randomly
| 29.616279 | 83 | 0.432273 | 550 | 2,547 | 1.976364 | 0.14 | 0.283349 | 0.289788 | 0.272309 | 0.429623 | 0.361546 | 0.356946 | 0.333027 | 0.315547 | 0.264029 | 0 | 0.230084 | 0.295249 | 2,547 | 85 | 84 | 29.964706 | 0.375487 | 0.116215 | 0 | 0.366197 | 1 | 0 | 0.092857 | 0.009375 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014085 | false | 0 | 0.014085 | 0 | 0.042254 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
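`initiate` above delegates the actual placement to `add_seed_to_universe`, pulled in through the star import. A plain-Python sketch of what such a random placement can look like (this is an assumption about its behaviour, not the actual implementation):

```python
import random

def place_seed(seed, universe):
    """Copy `seed` (a list of rows) into `universe` at a random offset.

    Sketch only: the real add_seed_to_universe comes from
    game_of_life2.generate_universe and may differ.
    """
    rows, cols = len(seed), len(seed[0])
    max_r = len(universe) - rows
    max_c = len(universe[0]) - cols
    r0 = random.randint(0, max_r)
    c0 = random.randint(0, max_c)
    for r in range(rows):
        for c in range(cols):
            universe[r0 + r][c0 + c] = seed[r][c]
    return universe
```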
31d280729bdb494713eb0639a54e26973d4d4528 | 3,206 | py | Python | page/migrations/0019_auto_20180313_1156.py | Zex0n/django-simple-cms | 097098dcea218697a53f9c04005c86a7680ee4e1 | [
"MIT"
] | 1 | 2021-04-03T09:29:13.000Z | 2021-04-03T09:29:13.000Z | page/migrations/0019_auto_20180313_1156.py | Zex0n/django-simple-cms | 097098dcea218697a53f9c04005c86a7680ee4e1 | [
"MIT"
] | null | null | null | page/migrations/0019_auto_20180313_1156.py | Zex0n/django-simple-cms | 097098dcea218697a53f9c04005c86a7680ee4e1 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2018-03-13 08:56
from __future__ import unicode_literals

from django.db import migrations, models
import sorl.thumbnail.fields


class Migration(migrations.Migration):

    dependencies = [
        ('page', '0018_banners'),
    ]

    operations = [
        migrations.AddField(
            model_name='setting',
            name='link_1_file',
            field=sorl.thumbnail.fields.ImageField(blank=True, upload_to='links', verbose_name='Картинка первой ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_1_link',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='URL первой ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_1_text',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Текст первой ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_1_title',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Название первой ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_2_file',
            field=sorl.thumbnail.fields.ImageField(blank=True, upload_to='links', verbose_name='Картинка второй ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_2_link',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='URL второй ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_2_text',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Текст второй ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_2_title',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Название второй ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_3_file',
            field=sorl.thumbnail.fields.ImageField(blank=True, upload_to='links', verbose_name='Картинка третьей ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_3_link',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='URL третьей ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_3_text',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Текст третьей ссылки'),
        ),
        migrations.AddField(
            model_name='setting',
            name='link_3_title',
            field=models.CharField(blank=True, default='', max_length=10, verbose_name='Название третьей ссылки'),
        ),
        migrations.AlterField(
            model_name='banners',
            name='banner_image',
            field=sorl.thumbnail.fields.ImageField(blank=True, upload_to='banners', verbose_name='Картинка для баннера 270px X 120 px'),
        ),
    ]
| 39.097561 | 136 | 0.603244 | 345 | 3,206 | 5.402899 | 0.208696 | 0.062768 | 0.148069 | 0.17382 | 0.830472 | 0.830472 | 0.830472 | 0.830472 | 0.807403 | 0.72103 | 0 | 0.024411 | 0.271678 | 3,206 | 81 | 137 | 39.580247 | 0.773876 | 0.02121 | 0 | 0.5 | 1 | 0 | 0.177033 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.040541 | 0 | 0.081081 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
31dd79788e6b3408f2b516d23bd304ed2a3a8e7a | 2,780 | py | Python | tests/gpu/gpu_test_deconv2d.py | timwuu/deep-learning-from-scratch-3 | 6f18dee8c1d764e16275ed68f90966bc85f0ae66 | [
"MIT"
] | 539 | 2019-11-01T04:09:42.000Z | 2022-03-26T06:25:44.000Z | tests/gpu/gpu_test_deconv2d.py | timwuu/deep-learning-from-scratch-3 | 6f18dee8c1d764e16275ed68f90966bc85f0ae66 | [
"MIT"
] | 32 | 2019-11-21T07:50:16.000Z | 2022-01-26T14:01:55.000Z | tests/gpu/gpu_test_deconv2d.py | timwuu/deep-learning-from-scratch-3 | 6f18dee8c1d764e16275ed68f90966bc85f0ae66 | [
"MIT"
] | 157 | 2019-11-17T22:20:03.000Z | 2022-03-23T02:50:51.000Z | import unittest
import cupy as np  # !! CUPY !!
import dezero.layers as L
import dezero.functions as F
from dezero.utils import gradient_check, array_allclose
import chainer.functions as CF


class TestDeconv2d(unittest.TestCase):

    def test_forward1(self):
        n, c_i, c_o = 10, 1, 3
        h_i, w_i = 5, 10
        h_k, w_k = 10, 10
        h_p, w_p = 5, 5
        s_y, s_x = 5, 5
        x = np.random.uniform(0, 1, (n, c_i, h_i, w_i)).astype(np.float32)
        W = np.random.uniform(0, 1, (c_i, c_o, h_k, w_k)).astype(np.float32)
        b = np.random.uniform(0, 1, c_o).astype(np.float32)
        expected = CF.deconvolution_2d(x, W, b, stride=(s_y, s_x),
                                       pad=(h_p, w_p))
        y = F.deconv2d(x, W, b, stride=(s_y, s_x), pad=(h_p, w_p))
        self.assertTrue(array_allclose(expected.data, y.data))

    def test_forward2(self):
        n, c_i, c_o = 10, 1, 3
        h_i, w_i = 5, 10
        h_k, w_k = 10, 10
        h_p, w_p = 5, 5
        s_y, s_x = 5, 5
        x = np.random.uniform(0, 1, (n, c_i, h_i, w_i)).astype(np.float32)
        W = np.random.uniform(0, 1, (c_i, c_o, h_k, w_k)).astype(np.float32)
        b = None
        expected = CF.deconvolution_2d(x, W, b, stride=(s_y, s_x),
                                       pad=(h_p, w_p))
        y = F.deconv2d(x, W, b, stride=(s_y, s_x), pad=(h_p, w_p))
        self.assertTrue(array_allclose(expected.data, y.data))

    def test_backward1(self):
        n, c_i, c_o = 10, 1, 3
        h_i, w_i = 5, 10
        h_k, w_k = 10, 10
        h_p, w_p = 5, 5
        s_y, s_x = 5, 5
        x = np.random.uniform(0, 1, (n, c_i, h_i, w_i))
        W = np.random.uniform(0, 1, (c_i, c_o, h_k, w_k))
        b = None  # np.random.uniform(0, 1, c_o).astype(np.float32)
        f = lambda x: F.deconv2d(x, W, b, stride=(s_y, s_x), pad=(h_p, w_p))
        self.assertTrue(gradient_check(f, x))

    def test_backward2(self):
        n, c_i, c_o = 10, 1, 3
        h_i, w_i = 5, 10
        h_k, w_k = 10, 10
        h_p, w_p = 5, 5
        s_y, s_x = 5, 5
        x = np.random.uniform(0, 1, (n, c_i, h_i, w_i))
        W = np.random.uniform(0, 1, (c_i, c_o, h_k, w_k))
        b = np.random.uniform(0, 1, c_o)
        f = lambda W: F.deconv2d(x, W, b, stride=(s_y, s_x), pad=(h_p, w_p))
        self.assertTrue(gradient_check(f, W))

    def test_backward3(self):
        n, c_i, c_o = 10, 1, 3
        h_i, w_i = 5, 10
        h_k, w_k = 10, 10
        h_p, w_p = 5, 5
        s_y, s_x = 5, 5
        x = np.random.uniform(0, 1, (n, c_i, h_i, w_i))
        W = np.random.uniform(0, 1, (c_i, c_o, h_k, w_k))
        b = np.random.uniform(0, 1, c_o)
        f = lambda b: F.deconv2d(x, W, b, stride=(s_y, s_x), pad=(h_p, w_p))
self.assertTrue(gradient_check(f, b)) | 37.567568 | 76 | 0.525899 | 556 | 2,780 | 2.393885 | 0.107914 | 0.022539 | 0.157776 | 0.168295 | 0.798648 | 0.798648 | 0.798648 | 0.798648 | 0.798648 | 0.798648 | 0 | 0.067404 | 0.316906 | 2,780 | 74 | 77 | 37.567568 | 0.633491 | 0.020863 | 0 | 0.712121 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075758 | 1 | 0.075758 | false | 0 | 0.090909 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
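The dimensions used in the tests above are consistent with the usual transposed-convolution output-size formula, out = (in − 1) · stride − 2 · pad + kernel. A quick pure-Python check (`deconv_out_size` is illustrative, not part of dezero):

```python
def deconv_out_size(in_size, kernel, stride, pad):
    # standard transposed-convolution output size (no output_padding)
    return (in_size - 1) * stride - 2 * pad + kernel

# dimensions from the tests: h_i, w_i = 5, 10; kernel 10; stride 5; pad 5
h_out = deconv_out_size(5, 10, 5, 5)   # (5-1)*5 - 10 + 10 = 20
w_out = deconv_out_size(10, 10, 5, 5)  # (10-1)*5 - 10 + 10 = 45
print(h_out, w_out)  # 20 45
```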
9ed97950a3756d5101cf63d9d3623b20159c319b | 2,254 | py | Python | src/communication/gnn.py | hex-plex/GNN-MARL | ebe964a4eb749fd8d2780af18aead85e342d2988 | [
"Apache-2.0"
] | 1 | 2022-03-22T14:59:05.000Z | 2022-03-22T14:59:05.000Z | src/communication/gnn.py | hex-plex/GNN-MARL | ebe964a4eb749fd8d2780af18aead85e342d2988 | [
"Apache-2.0"
] | null | null | null | src/communication/gnn.py | hex-plex/GNN-MARL | ebe964a4eb749fd8d2780af18aead85e342d2988 | [
"Apache-2.0"
] | null | null | null | import torch
from torch_geometric.nn import GCNConv
import torch.nn.functional as F
from torch_geometric.utils import dense_to_sparse


class GCNComm(torch.nn.Module):
    def __init__(self, input_shape, args, training=True):
        super(GCNComm, self).__init__()
        self.args = args
        self.training = training
        self.convs = []
        self.convs.append(GCNConv(input_shape, self.args.msg_hidden_dim))
        for i in range(1, self.args.num_layers - 1):
            self.convs.append(GCNConv(self.args.msg_hidden_dim, self.args.msg_hidden_dim))
        self.convs.append(GCNConv(self.args.msg_hidden_dim, self.args.msg_out_size))

    def cuda_transfer(self):
        for i in range(self.args.num_layers):
            self.convs[i].cuda()

    def forward(self, x, adj_matrix):
        x_out = []
        for x_in, am_in in zip(torch.unbind(x, dim=0), torch.unbind(adj_matrix, dim=0)):
            for i in range(self.args.num_layers):
                x_in = self.convs[i](x_in, dense_to_sparse(am_in)[0])
                if (i + 1) < self.args.num_layers:
                    x_in = F.elu(x_in)
                    x_in = F.dropout(x_in, p=0.2, training=self.training)
            x_out.append(x_in)
        return torch.stack(x_out, dim=0)


class GATComm(torch.nn.Module):
    def __init__(self, input_shape, args, training=True):
        super(GATComm, self).__init__()
        self.args = args
        self.convs = []
        self.convs.append(GCNConv(input_shape, self.args.msg_hidden_dim))
        for i in range(1, self.args.num_layers - 1):
            self.convs.append(GCNConv(self.args.msg_hidden_dim, self.args.msg_hidden_dim))
        self.convs.append(GCNConv(self.args.msg_hidden_dim, self.args.msg_out_size))

    def forward(self, x, adj_matrix):
        x_out = []
        for x_in, am_in in zip(torch.unbind(x, dim=0), torch.unbind(adj_matrix, dim=0)):
            for i in range(self.args.num_layers):
                x_in = self.convs[i](x_in, dense_to_sparse(am_in)[0])
                if (i + 1) < self.args.num_layers:
                    x_in = F.elu(x_in)
                    x_in = F.dropout(x_in, p=0.2, training=self.training)
            x_out.append(x_in)
return torch.stack(x_out, dim=0) | 43.346154 | 90 | 0.618012 | 351 | 2,254 | 3.723647 | 0.162393 | 0.116297 | 0.084162 | 0.104055 | 0.850038 | 0.850038 | 0.816373 | 0.816373 | 0.79495 | 0.79495 | 0 | 0.010759 | 0.257764 | 2,254 | 52 | 91 | 43.346154 | 0.770472 | 0 | 0 | 0.76087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108696 | false | 0 | 0.086957 | 0 | 0.282609 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
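In the forward passes above, `dense_to_sparse` converts each dense adjacency matrix into the edge-index form that `GCNConv` expects. A dependency-free sketch of that conversion, returning edges as `(sources, targets)` lists rather than a torch tensor (the helper name is hypothetical):

```python
def dense_to_edge_index(adj):
    """Collect the (row, col) positions of nonzero entries of a dense
    adjacency matrix, as separate source and target lists."""
    sources, targets = [], []
    for i, row in enumerate(adj):
        for j, value in enumerate(row):
            if value != 0:
                sources.append(i)
                targets.append(j)
    return sources, targets

print(dense_to_edge_index([[0, 1], [1, 1]]))  # ([0, 1, 1], [1, 0, 1])
```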
9efb601e64532c31abb430925e71296f1dd1e6ed | 12,399 | py | Python | gan_simple.py | statsu1990/gan_simple_2d_problem | 5b62e79fcab5a66d49536f43863169a001e3089b | [
"MIT"
] | 2 | 2019-09-09T08:02:36.000Z | 2020-07-30T13:20:55.000Z | gan_simple.py | statsu1990/gan_simple_2d_problem | 5b62e79fcab5a66d49536f43863169a001e3089b | [
"MIT"
] | 1 | 2019-07-14T10:27:43.000Z | 2019-07-14T10:27:43.000Z | gan_simple.py | statsu1990/gan_simple_2d_problem | 5b62e79fcab5a66d49536f43863169a001e3089b | [
"MIT"
] | null | null | null | import gan
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

# "GANs it's too late to ask about" (introductory article, in Japanese)
# https://qiita.com/triwave33/items/1890ccc71fab6cbca87e


class GanTest2D:
    def __init__(self, data_num, latent_dim, train_epoch):
        self.DATA_NUM = data_num
        self.LATENT_DIM = latent_dim
        self.TRAIN_EPOCH = train_epoch
        self.real_datas = None
        return

    # run
    def run(self):
        # make real data
        self.make_real_data(self.DATA_NUM)
        self.__plot_scat1(self.real_datas[:,0], self.real_datas[:,1], label='real data')
        # make gan model
        self.make_gan_model()
        #self.make_gan_model_separating_disc_gene()
        # graph of real and judged as true by discriminator data
        self.__check_disc(self.gan, 100)
        return

    # data
    def make_real_data(self, data_num):
        self.real_datas = self.__sample_data_in_circle(data_num, radius=0.5)
        #self.real_datas = self.__sample_data_in_half_circle(data_num, radius=0.5)
        return
    def __sample_data_in_circle(self, data_num, radius):
        #
        center = np.array([0.5, 0.5])
        #center = np.array([0.0, 0.0])
        # sampling num
        sampling_margin = 2
        sampling_num = int((1.0 * 1.0) / (radius * radius * 3.14) * data_num * sampling_margin)
        # sampling
        end_sampling_flag = False
        x = np.empty((0,2), float)
        # sampling loop
        while not end_sampling_flag:
            # x in [-1,1)
            x_sampled = np.random.rand(sampling_num, 2) * 2.0 - 1.0
            x_sampled = x_sampled[np.sqrt(np.sum(np.square(x_sampled - center), axis=1)) <= radius]
            #
            x = np.append(x, x_sampled, axis=0)
            # check flag
            end_sampling_flag = x.shape[0] >= data_num
        # extract
        x = x[0:data_num]
        return x

    def __sample_data_in_half_circle(self, data_num, radius):
        #
        center = np.array([0.5, 0.5])
        #center = np.array([0.0, 0.0])
        # sampling num
        sampling_margin = 2
        sampling_num = int((1.0 * 1.0) / (radius * radius * 3.14) * data_num * sampling_margin)
        # sampling
        end_sampling_flag = False
        x = np.empty((0,2), float)
        # sampling loop
        while not end_sampling_flag:
            # x in [-1,1)
            x_sampled = np.random.rand(sampling_num, 2) * 2.0 - 1.0
            x_sampled = x_sampled[np.sqrt(np.sum(np.square(x_sampled - center), axis=1)) <= radius]
            x_sampled = x_sampled[x_sampled[:,1] < center[1]]
            #
            x = np.append(x, x_sampled, axis=0)
            # check flag
            end_sampling_flag = x.shape[0] >= data_num
        # extract
        x = x[0:data_num]
        return x
    # gan
    def make_gan_model(self):
        # make model
        self.gan = gan.GAN(latent_dim=self.LATENT_DIM, data_dim=self.real_datas.shape[1])
        #self.gan.make_model(gene_hidden_neurons=[32, 16, 16], disc_hidden_neurons=[32, 16, 16])
        self.gan.make_model(gene_hidden_neurons=[32, 16, 16], disc_hidden_neurons=[124, 64, 16])
        # train gan model
        fig = plt.figure()
        ims = []
        ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
        # training epoch loop
        for iep in range(self.TRAIN_EPOCH):
            self.gan.train_step(self.real_datas, batch_size=32, now_epoch=iep)
            #self.gan.train_step_test1(self.real_datas, batch_size=32, now_epoch=iep)
            # images for animation
            ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
        # graph of real and generated data
        ani = animation.ArtistAnimation(fig, ims, interval=100)
        ani.save('generated_point.gif', writer='pillow')
        plt.show()
        return

    def make_gan_model_separating_disc_gene(self):
        # make model
        self.gan = gan.GAN(latent_dim=self.LATENT_DIM, data_dim=self.real_datas.shape[1])
        #self.gan.make_model(gene_hidden_neurons=[32, 16, 16], disc_hidden_neurons=[32, 16, 16])
        self.gan.make_model(gene_hidden_neurons=[32, 16, 16], disc_hidden_neurons=[248, 124, 16])
        # train disc model
        for iep in range(self.TRAIN_EPOCH):
            self.gan.train_step_only_disc_with_random_noise(self.real_datas, batch_size=32, now_epoch=iep)
        # train gene model
        fig = plt.figure()
        ims = []
        ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
        # training epoch loop
        for iep in range(self.TRAIN_EPOCH):
            self.gan.train_step_only_gene(self.real_datas, batch_size=32, now_epoch=iep)
            # images for animation
            ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
        # graph of real and generated data
        ani = animation.ArtistAnimation(fig, ims, interval=100)
        ani.save('generated_point.gif', writer='pillow')
        plt.show()
        return
def __plot_gene_data(self, gan, data_num, title=None, show=True):
'''
plot generated data
'''
latents = np.random.normal(0, 1, (300, self.LATENT_DIM))
gene_datas = gan.gene_model.predict(latents)
image = self.__plot_scat1(gene_datas[:,0], gene_datas[:,1], color='c', title=title, show=show)
return image
def __plot_disc_predict(self, gan, data_num, binary=False, save=False, savefilename=''):
'''
plot discrimination model prediction
'''
# grid [-1,1] and [-1,1]
x1d = np.linspace(start=-1, stop=1, num=data_num)
x1, x2 = np.meshgrid(x1d, x1d)
x1 = np.ravel(x1)
x2 = np.ravel(x2)
#
x = np.concatenate([x1[:,np.newaxis], x2[:,np.newaxis]], axis=1)
# discriminate model prediction
if binary:
pre = (gan.disc_model.predict(x) > 0.5) * 1
else:
pre = gan.disc_model.predict(x)
#
self.__plot_scat1(x[:,0], x[:,1], color=np.ravel(pre), label='disc predict', save=save, savefilename=savefilename)
return
def __check_disc(self, gan, data_num):
real_disc = gan.disc_model.predict(self.real_datas)
real_disc_acc = np.average(real_disc > 0.5)
real_disc_ave = np.average(real_disc)
real_disc_std = np.std(real_disc)
print('real disc acc {0}, ave {1}, std {2}'.format(real_disc_acc, real_disc_ave, real_disc_std))
self.__plot_disc_predict(gan, data_num, binary=False, save=False)
self.__plot_disc_predict(gan, data_num, binary=True, save=True, savefilename='discriminate_true_range.png')
return
# plot
def __plot_scat1(self, x, y, color=None, title=None, label=None, xmin=-1, xmax=1, ymin=-1, ymax=1, show=True, save=False, savefilename=''):
image = plt.scatter(x, y, c=color, s=5, label=label, cmap='Blues')
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.title(title)
if (color is not None) and not isinstance(color, str):
plt.colorbar()
if save:
plt.savefig(savefilename)
if show:
plt.show()
return image
class WGanGpTest2D:
def __init__(self, data_num, latent_dim, train_epoch):
self.DATA_NUM = data_num
self.LATENT_DIM = latent_dim
self.TRAIN_EPOCH = train_epoch
self.BATCH_SIZE = 32
self.TRAIN_RATIO = 1
self.GRADIENT_PENALTY_WEIGHT = 0.1
self.real_datas = None
return
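The GRADIENT_PENALTY_WEIGHT above is the lambda that scales WGAN-GP's penalty term lambda * (||grad_x_hat D(x_hat)|| - 1)^2, evaluated at random interpolates x_hat between real and generated samples. For a linear toy critic D(x) = w . x the input gradient is just w everywhere, so the penalty can be checked by hand. This is an illustrative sketch only, not the Keras implementation inside `gan.WGAN_GP`; all names are hypothetical:

```python
import numpy as np

def gradient_penalty_linear(w, real, fake, weight=0.1, seed=0):
    """Gradient penalty for a linear toy critic D(x) = w . x.

    For a linear critic, dD/dx = w at every point, so the penalty is
    weight * (||w|| - 1)^2 regardless of the interpolation point.
    """
    rng = np.random.default_rng(seed)
    # random interpolation coefficients, one per sample
    eps = rng.uniform(0.0, 1.0, size=(real.shape[0], 1))
    interpolates = eps * real + (1.0 - eps) * fake  # x_hat
    # gradient of the linear critic is w at every interpolate
    grad = np.tile(w, (interpolates.shape[0], 1))
    grad_norm = np.linalg.norm(grad, axis=1)
    return weight * np.mean((grad_norm - 1.0) ** 2)
```

With w = [3, 4] (norm 5) and weight 0.1, the penalty is 0.1 * (5 - 1)^2 = 1.6 at every sample, which matches the closed form.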
# run
def run(self):
# make real data
self.make_real_data(self.DATA_NUM)
self.__plot_scat1(self.real_datas[:,0], self.real_datas[:,1], label='real data')
# make gan model
self.make_gan_model()
return
# data
def make_real_data(self, data_num):
self.real_datas = self.__sample_data_in_circle(data_num, radius=0.5)
#self.real_datas = self.__sample_data_in_half_circle(data_num, radius=0.5)
return
def __sample_data_in_circle(self, data_num, radius):
#
center = np.array([0.5, 0.5])
#center = np.array([0.0, 0.0])
# candidates per draw: square area (4) over circle area (pi * r^2), with margin
sampling_margin = 2
sampling_num = int(4.0 / (radius * radius * np.pi) * data_num * sampling_margin)
# sampling
end_sampling_flag = False
x = np.empty((0,2), float)
# sampling loop
while not end_sampling_flag:
# x in [-1,1)
x_sampled = np.random.rand(sampling_num, 2) * 2.0 - 1.0
x_sampled = x_sampled[np.sqrt(np.sum(np.square(x_sampled - center), axis=1)) <= radius]
#
x = np.append(x, x_sampled, axis=0)
# check flag
end_sampling_flag = x.shape[0] >= data_num
# extract
x = x[0:data_num]
return x
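The rejection-sampling idea above (draw uniform candidates in [-1, 1)^2, keep those within `radius` of `center`, repeat until `data_num` points are collected) can be sketched standalone. Names and the seeding are illustrative, not taken from this module:

```python
import numpy as np

def sample_in_circle(data_num, radius=0.5, center=(0.5, 0.5), seed=0):
    """Rejection-sample `data_num` points uniformly inside a circle."""
    rng = np.random.default_rng(seed)
    center = np.asarray(center, dtype=float)
    # oversample by the inverse acceptance rate: square area / circle area
    batch = int(4.0 / (np.pi * radius ** 2) * data_num) + 1
    points = np.empty((0, 2), float)
    while points.shape[0] < data_num:
        cand = rng.uniform(-1.0, 1.0, size=(batch, 2))
        keep = np.linalg.norm(cand - center, axis=1) <= radius
        points = np.append(points, cand[keep], axis=0)
    return points[:data_num]
```

Every accepted point satisfies the distance test by construction, so the loop only controls how many draws are needed, not the correctness of the result.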
def __sample_data_in_half_circle(self, data_num, radius):
#
center = np.array([0.5, 0.5])
#center = np.array([0.0, 0.0])
# candidates per draw: square area (4) over half-circle area (pi * r^2 / 2), with margin
sampling_margin = 2
sampling_num = int(8.0 / (radius * radius * np.pi) * data_num * sampling_margin)
# sampling
end_sampling_flag = False
x = np.empty((0,2), float)
# sampling loop
while not end_sampling_flag:
# x in [-1,1)
x_sampled = np.random.rand(sampling_num, 2) * 2.0 - 1.0
x_sampled = x_sampled[np.sqrt(np.sum(np.square(x_sampled - center), axis=1)) <= radius]
x_sampled = x_sampled[x_sampled[:,1] < center[1]]
#
x = np.append(x, x_sampled, axis=0)
# check flag
end_sampling_flag = x.shape[0] >= data_num
# extract
x = x[0:data_num]
return x
# gan
def make_gan_model(self):
# make model
self.gan = gan.WGAN_GP(latent_dim=self.LATENT_DIM, data_dim=self.real_datas.shape[1])
self.gan.make_model(gene_hidden_neurons=[32, 16, 16], disc_hidden_neurons=[124, 64, 16], batch_size=self.BATCH_SIZE, gradient_penalty_weight=self.GRADIENT_PENALTY_WEIGHT)
# train gan model
fig = plt.figure()
ims = []
ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
# training epoch loop
for iep in range(self.TRAIN_EPOCH):
self.gan.train_step(self.real_datas, batch_size=self.BATCH_SIZE)
# images for animation
ims.append([self.__plot_gene_data(self.gan, data_num=3000, show=False)])
# graph of real and generated data
ani = animation.ArtistAnimation(fig, ims, interval=100)
ani.save('generated_point.gif', writer='pillow')
plt.show()
return
def __plot_gene_data(self, gan, data_num, title=None, show=True):
'''
plot generated data
'''
latents = np.random.normal(0, 1, (data_num, self.LATENT_DIM))
gene_datas = gan.gene_model.predict(latents)
image = self.__plot_scat1(gene_datas[:,0], gene_datas[:,1], color='c', title=title, show=show)
return image
# plot
def __plot_scat1(self, x, y, color=None, title=None, label=None, xmin=-1, xmax=1, ymin=-1, ymax=1, show=True, save=False, savefilename=''):
image = plt.scatter(x, y, c=color, s=5, label=label, cmap='Blues')
plt.xlim(xmin, xmax)
plt.ylim(ymin, ymax)
plt.title(title)
if (color is not None) and not isinstance(color, str):
plt.colorbar()
if save:
plt.savefig(savefilename)
if show:
plt.show()
return image
if __name__ == '__main__':
#gan_test_2d = GanTest2D(500, 16, 200)
#gan_test_2d.run()
wgangp_test_2d = WGanGpTest2D(500, 32, 200)
wgangp_test_2d.run()
| 34.731092 | 179 | 0.571094 | 1,675 | 12,399 | 3.983284 | 0.108657 | 0.043016 | 0.03702 | 0.020983 | 0.832884 | 0.814299 | 0.807104 | 0.804706 | 0.794215 | 0.78702 | 0 | 0.040905 | 0.31188 | 12,399 | 356 | 180 | 34.828652 | 0.741092 | 0.124123 | 0 | 0.78534 | 0 | 0 | 0.019216 | 0.002607 | 0 | 0 | 0 | 0 | 0 | 1 | 0.099476 | false | 0 | 0.020942 | 0 | 0.230366 | 0.005236 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7306091f9453521ba0858b985949accc0101757a | 6,269 | py | Python | loldib/getratings/models/NA/na_lulu/na_lulu_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_lulu/na_lulu_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_lulu/na_lulu_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Lulu_Jng_Aatrox(Ratings):
pass
class NA_Lulu_Jng_Ahri(Ratings):
pass
class NA_Lulu_Jng_Akali(Ratings):
pass
class NA_Lulu_Jng_Alistar(Ratings):
pass
class NA_Lulu_Jng_Amumu(Ratings):
pass
class NA_Lulu_Jng_Anivia(Ratings):
pass
class NA_Lulu_Jng_Annie(Ratings):
pass
class NA_Lulu_Jng_Ashe(Ratings):
pass
class NA_Lulu_Jng_AurelionSol(Ratings):
pass
class NA_Lulu_Jng_Azir(Ratings):
pass
class NA_Lulu_Jng_Bard(Ratings):
pass
class NA_Lulu_Jng_Blitzcrank(Ratings):
pass
class NA_Lulu_Jng_Brand(Ratings):
pass
class NA_Lulu_Jng_Braum(Ratings):
pass
class NA_Lulu_Jng_Caitlyn(Ratings):
pass
class NA_Lulu_Jng_Camille(Ratings):
pass
class NA_Lulu_Jng_Cassiopeia(Ratings):
pass
class NA_Lulu_Jng_Chogath(Ratings):
pass
class NA_Lulu_Jng_Corki(Ratings):
pass
class NA_Lulu_Jng_Darius(Ratings):
pass
class NA_Lulu_Jng_Diana(Ratings):
pass
class NA_Lulu_Jng_Draven(Ratings):
pass
class NA_Lulu_Jng_DrMundo(Ratings):
pass
class NA_Lulu_Jng_Ekko(Ratings):
pass
class NA_Lulu_Jng_Elise(Ratings):
pass
class NA_Lulu_Jng_Evelynn(Ratings):
pass
class NA_Lulu_Jng_Ezreal(Ratings):
pass
class NA_Lulu_Jng_Fiddlesticks(Ratings):
pass
class NA_Lulu_Jng_Fiora(Ratings):
pass
class NA_Lulu_Jng_Fizz(Ratings):
pass
class NA_Lulu_Jng_Galio(Ratings):
pass
class NA_Lulu_Jng_Gangplank(Ratings):
pass
class NA_Lulu_Jng_Garen(Ratings):
pass
class NA_Lulu_Jng_Gnar(Ratings):
pass
class NA_Lulu_Jng_Gragas(Ratings):
pass
class NA_Lulu_Jng_Graves(Ratings):
pass
class NA_Lulu_Jng_Hecarim(Ratings):
pass
class NA_Lulu_Jng_Heimerdinger(Ratings):
pass
class NA_Lulu_Jng_Illaoi(Ratings):
pass
class NA_Lulu_Jng_Irelia(Ratings):
pass
class NA_Lulu_Jng_Ivern(Ratings):
pass
class NA_Lulu_Jng_Janna(Ratings):
pass
class NA_Lulu_Jng_JarvanIV(Ratings):
pass
class NA_Lulu_Jng_Jax(Ratings):
pass
class NA_Lulu_Jng_Jayce(Ratings):
pass
class NA_Lulu_Jng_Jhin(Ratings):
pass
class NA_Lulu_Jng_Jinx(Ratings):
pass
class NA_Lulu_Jng_Kalista(Ratings):
pass
class NA_Lulu_Jng_Karma(Ratings):
pass
class NA_Lulu_Jng_Karthus(Ratings):
pass
class NA_Lulu_Jng_Kassadin(Ratings):
pass
class NA_Lulu_Jng_Katarina(Ratings):
pass
class NA_Lulu_Jng_Kayle(Ratings):
pass
class NA_Lulu_Jng_Kayn(Ratings):
pass
class NA_Lulu_Jng_Kennen(Ratings):
pass
class NA_Lulu_Jng_Khazix(Ratings):
pass
class NA_Lulu_Jng_Kindred(Ratings):
pass
class NA_Lulu_Jng_Kled(Ratings):
pass
class NA_Lulu_Jng_KogMaw(Ratings):
pass
class NA_Lulu_Jng_Leblanc(Ratings):
pass
class NA_Lulu_Jng_LeeSin(Ratings):
pass
class NA_Lulu_Jng_Leona(Ratings):
pass
class NA_Lulu_Jng_Lissandra(Ratings):
pass
class NA_Lulu_Jng_Lucian(Ratings):
pass
class NA_Lulu_Jng_Lulu(Ratings):
pass
class NA_Lulu_Jng_Lux(Ratings):
pass
class NA_Lulu_Jng_Malphite(Ratings):
pass
class NA_Lulu_Jng_Malzahar(Ratings):
pass
class NA_Lulu_Jng_Maokai(Ratings):
pass
class NA_Lulu_Jng_MasterYi(Ratings):
pass
class NA_Lulu_Jng_MissFortune(Ratings):
pass
class NA_Lulu_Jng_MonkeyKing(Ratings):
pass
class NA_Lulu_Jng_Mordekaiser(Ratings):
pass
class NA_Lulu_Jng_Morgana(Ratings):
pass
class NA_Lulu_Jng_Nami(Ratings):
pass
class NA_Lulu_Jng_Nasus(Ratings):
pass
class NA_Lulu_Jng_Nautilus(Ratings):
pass
class NA_Lulu_Jng_Nidalee(Ratings):
pass
class NA_Lulu_Jng_Nocturne(Ratings):
pass
class NA_Lulu_Jng_Nunu(Ratings):
pass
class NA_Lulu_Jng_Olaf(Ratings):
pass
class NA_Lulu_Jng_Orianna(Ratings):
pass
class NA_Lulu_Jng_Ornn(Ratings):
pass
class NA_Lulu_Jng_Pantheon(Ratings):
pass
class NA_Lulu_Jng_Poppy(Ratings):
pass
class NA_Lulu_Jng_Quinn(Ratings):
pass
class NA_Lulu_Jng_Rakan(Ratings):
pass
class NA_Lulu_Jng_Rammus(Ratings):
pass
class NA_Lulu_Jng_RekSai(Ratings):
pass
class NA_Lulu_Jng_Renekton(Ratings):
pass
class NA_Lulu_Jng_Rengar(Ratings):
pass
class NA_Lulu_Jng_Riven(Ratings):
pass
class NA_Lulu_Jng_Rumble(Ratings):
pass
class NA_Lulu_Jng_Ryze(Ratings):
pass
class NA_Lulu_Jng_Sejuani(Ratings):
pass
class NA_Lulu_Jng_Shaco(Ratings):
pass
class NA_Lulu_Jng_Shen(Ratings):
pass
class NA_Lulu_Jng_Shyvana(Ratings):
pass
class NA_Lulu_Jng_Singed(Ratings):
pass
class NA_Lulu_Jng_Sion(Ratings):
pass
class NA_Lulu_Jng_Sivir(Ratings):
pass
class NA_Lulu_Jng_Skarner(Ratings):
pass
class NA_Lulu_Jng_Sona(Ratings):
pass
class NA_Lulu_Jng_Soraka(Ratings):
pass
class NA_Lulu_Jng_Swain(Ratings):
pass
class NA_Lulu_Jng_Syndra(Ratings):
pass
class NA_Lulu_Jng_TahmKench(Ratings):
pass
class NA_Lulu_Jng_Taliyah(Ratings):
pass
class NA_Lulu_Jng_Talon(Ratings):
pass
class NA_Lulu_Jng_Taric(Ratings):
pass
class NA_Lulu_Jng_Teemo(Ratings):
pass
class NA_Lulu_Jng_Thresh(Ratings):
pass
class NA_Lulu_Jng_Tristana(Ratings):
pass
class NA_Lulu_Jng_Trundle(Ratings):
pass
class NA_Lulu_Jng_Tryndamere(Ratings):
pass
class NA_Lulu_Jng_TwistedFate(Ratings):
pass
class NA_Lulu_Jng_Twitch(Ratings):
pass
class NA_Lulu_Jng_Udyr(Ratings):
pass
class NA_Lulu_Jng_Urgot(Ratings):
pass
class NA_Lulu_Jng_Varus(Ratings):
pass
class NA_Lulu_Jng_Vayne(Ratings):
pass
class NA_Lulu_Jng_Veigar(Ratings):
pass
class NA_Lulu_Jng_Velkoz(Ratings):
pass
class NA_Lulu_Jng_Vi(Ratings):
pass
class NA_Lulu_Jng_Viktor(Ratings):
pass
class NA_Lulu_Jng_Vladimir(Ratings):
pass
class NA_Lulu_Jng_Volibear(Ratings):
pass
class NA_Lulu_Jng_Warwick(Ratings):
pass
class NA_Lulu_Jng_Xayah(Ratings):
pass
class NA_Lulu_Jng_Xerath(Ratings):
pass
class NA_Lulu_Jng_XinZhao(Ratings):
pass
class NA_Lulu_Jng_Yasuo(Ratings):
pass
class NA_Lulu_Jng_Yorick(Ratings):
pass
class NA_Lulu_Jng_Zac(Ratings):
pass
class NA_Lulu_Jng_Zed(Ratings):
pass
class NA_Lulu_Jng_Ziggs(Ratings):
pass
class NA_Lulu_Jng_Zilean(Ratings):
pass
class NA_Lulu_Jng_Zyra(Ratings):
pass
| 15.033573 | 46 | 0.75642 | 972 | 6,269 | 4.452675 | 0.151235 | 0.223198 | 0.350739 | 0.446396 | 0.791359 | 0.791359 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177221 | 6,269 | 416 | 47 | 15.069712 | 0.839085 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
b41153fae450fd7867fbf4a611c319190661a3b7 | 92 | py | Python | gym_adserver/envs/__init__.py | roelbertens/gym-adserver | fbdb3827add70068759348d4bb390e6fffa7dbe2 | [
"MIT"
] | 17 | 2020-10-13T08:01:23.000Z | 2022-03-01T14:45:11.000Z | gym_adserver/envs/__init__.py | roelbertens/gym-adserver | fbdb3827add70068759348d4bb390e6fffa7dbe2 | [
"MIT"
] | 4 | 2020-09-12T11:24:54.000Z | 2020-09-16T20:21:31.000Z | gym_adserver/envs/__init__.py | roelbertens/gym-adserver | fbdb3827add70068759348d4bb390e6fffa7dbe2 | [
"MIT"
] | 14 | 2020-10-13T08:02:16.000Z | 2022-02-20T13:53:43.000Z | from gym_adserver.envs.adserver import Ad
from gym_adserver.envs.adserver import AdServerEnv | 46 | 50 | 0.880435 | 14 | 92 | 5.642857 | 0.5 | 0.177215 | 0.379747 | 0.481013 | 0.835443 | 0.835443 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076087 | 92 | 2 | 50 | 46 | 0.929412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
b439413d239f55c0bf8b80faecb2c454d631ab9a | 24,174 | py | Python | tests/test_interface.py | mnm678/securesystemslib | caf029ea61339dc5fa9beedf230ca1d026129169 | [
"MIT"
] | null | null | null | tests/test_interface.py | mnm678/securesystemslib | caf029ea61339dc5fa9beedf230ca1d026129169 | [
"MIT"
] | null | null | null | tests/test_interface.py | mnm678/securesystemslib | caf029ea61339dc5fa9beedf230ca1d026129169 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""
<Program Name>
test_interface.py
<Author>
Vladimir Diaz <vladimir.v.diaz@gmail.com>
<Started>
January 5, 2017.
<Copyright>
See LICENSE for licensing information.
<Purpose>
Unit test for 'interface.py'.
"""
# Help with Python 3 compatibility, where the print statement is a function, an
# implicit relative import is invalid, and the '/' operator performs true
# division. Example: print 'hello world' raises a 'SyntaxError' exception.
from __future__ import print_function
from __future__ import absolute_import
from __future__ import division
from __future__ import unicode_literals
import os
import time
import datetime
import tempfile
import json
import shutil
import stat
import sys
import unittest
# Use external backport 'mock' on versions under 3.3
if sys.version_info >= (3, 3):
import unittest.mock as mock
else:
import mock
import securesystemslib.formats
import securesystemslib.hash
import securesystemslib.interface as interface
import six
class TestInterfaceFunctions(unittest.TestCase):
@classmethod
def setUpClass(cls):
# setUpClass() is called before tests in an individual class are executed.
# Create a temporary directory to store the repository, metadata, and target
# files. 'temporary_directory' must be deleted in TearDownClass() so that
# temporary files are always removed, even when exceptions occur.
cls.temporary_directory = tempfile.mkdtemp(dir=os.getcwd())
@classmethod
def tearDownClass(cls):
# tearDownModule() is called after all the tests have run.
# http://docs.python.org/2/library/unittest.html#class-and-module-fixtures
# Remove the temporary repository directory, which should contain all the
# metadata, targets, and key files generated for the test cases.
shutil.rmtree(cls.temporary_directory)
def setUp(self):
pass
def tearDown(self):
pass
def test_generate_and_write_rsa_keypair(self):
# Test normal case.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
test_keypath = os.path.join(temporary_directory, 'rsa_key')
test_keypath_unencrypted = os.path.join(temporary_directory,
'rsa_key_unencrypted')
returned_path = interface.generate_and_write_rsa_keypair(test_keypath,
password='pw')
self.assertTrue(os.path.exists(test_keypath))
self.assertTrue(os.path.exists(test_keypath + '.pub'))
self.assertEqual(returned_path, test_keypath)
# If an empty string is given for 'password', the private key file
# is written to disk unencrypted.
interface.generate_and_write_rsa_keypair(test_keypath_unencrypted,
password='')
self.assertTrue(os.path.exists(test_keypath_unencrypted))
self.assertTrue(os.path.exists(test_keypath_unencrypted + '.pub'))
# Ensure the generated key files are importable.
scheme = 'rsassa-pss-sha256'
imported_pubkey = \
interface.import_rsa_publickey_from_file(test_keypath + '.pub')
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_pubkey))
imported_privkey = interface.import_rsa_privatekey_from_file(test_keypath,
'pw')
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_privkey))
# Try to import the unencrypted key file, by not passing a password
imported_privkey = interface.import_rsa_privatekey_from_file(test_keypath_unencrypted)
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_privkey))
# Try to import the unencrypted key file, by entering an empty password
with mock.patch('securesystemslib.interface.get_password',
return_value=''):
imported_privkey = \
interface.import_rsa_privatekey_from_file(test_keypath_unencrypted,
prompt=True)
self.assertTrue(
securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_privkey))
# Fail importing unencrypted key passing a password
with self.assertRaises(securesystemslib.exceptions.CryptoError):
interface.import_rsa_privatekey_from_file(test_keypath_unencrypted, 'pw')
# Fail importing encrypted key passing no password
with self.assertRaises(securesystemslib.exceptions.CryptoError):
interface.import_rsa_privatekey_from_file(test_keypath)
# Custom 'bits' argument.
os.remove(test_keypath)
os.remove(test_keypath + '.pub')
interface.generate_and_write_rsa_keypair(test_keypath, bits=2048,
password='pw')
self.assertTrue(os.path.exists(test_keypath))
self.assertTrue(os.path.exists(test_keypath + '.pub'))
# Test for a default filepath. If 'filepath' is not given, the key's
# KEYID is used as the filename. The key is saved to the current working
# directory.
default_keypath = interface.generate_and_write_rsa_keypair(password='pw')
self.assertTrue(os.path.exists(default_keypath))
self.assertTrue(os.path.exists(default_keypath + '.pub'))
written_key = interface.import_rsa_publickey_from_file(default_keypath + '.pub')
self.assertEqual(written_key['keyid'], os.path.basename(default_keypath))
os.remove(default_keypath)
os.remove(default_keypath + '.pub')
# Test improperly formatted arguments.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_rsa_keypair, 3, bits=2048, password='pw')
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_rsa_keypair, test_keypath, bits='bad',
password='pw')
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_rsa_keypair, test_keypath, bits=2048,
password=3)
# Test invalid 'bits' argument.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_rsa_keypair, test_keypath, bits=1024,
password='pw')
def test_import_rsa_privatekey_from_file(self):
# Test normal case.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
# Load one of the pre-generated key files from
# 'securesystemslib/tests/repository_data'. 'password' unlocks the
# pre-generated key files.
key_filepath = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'data', 'keystore', 'rsa_key')
self.assertTrue(os.path.exists(key_filepath))
imported_rsa_key = interface.import_rsa_privatekey_from_file(
key_filepath, 'password')
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_rsa_key))
# Test load encrypted key prompt for password
with mock.patch('securesystemslib.interface.get_password',
return_value='password'):
imported_rsa_key = interface.import_rsa_privatekey_from_file(
key_filepath, prompt=True)
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(
imported_rsa_key))
# Test improperly formatted 'filepath' argument.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_rsa_privatekey_from_file, 3, 'pw')
# Test improperly formatted 'password' argument.
with self.assertRaises(securesystemslib.exceptions.FormatError):
interface.import_rsa_privatekey_from_file(key_filepath, 123)
# Test unallowed empty 'password'
with self.assertRaises(ValueError):
interface.import_rsa_privatekey_from_file(key_filepath, '')
# Test unallowed passing 'prompt' and 'password'
with self.assertRaises(ValueError):
interface.import_rsa_privatekey_from_file(key_filepath,
password='pw', prompt=True)
# Test invalid argument.
# Non-existent key file.
nonexistent_keypath = os.path.join(temporary_directory,
'nonexistent_keypath')
self.assertRaises(IOError, interface.import_rsa_privatekey_from_file,
nonexistent_keypath, 'pw')
# Invalid key file argument.
invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
with open(invalid_keyfile, 'wb') as file_object:
file_object.write(b'bad keyfile')
self.assertRaises(securesystemslib.exceptions.CryptoError,
interface.import_rsa_privatekey_from_file, invalid_keyfile, 'pw')
def test_import_rsa_publickey_from_file(self):
# Test normal case.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
# Load one of the pre-generated key files from 'securesystemslib/tests/data'.
key_filepath = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'data', 'keystore', 'rsa_key.pub')
self.assertTrue(os.path.exists(key_filepath))
imported_rsa_key = interface.import_rsa_publickey_from_file(key_filepath)
self.assertTrue(securesystemslib.formats.RSAKEY_SCHEMA.matches(imported_rsa_key))
# Test improperly formatted argument.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_rsa_privatekey_from_file, 3)
# Test invalid argument.
# Non-existent key file.
nonexistent_keypath = os.path.join(temporary_directory,
'nonexistent_keypath')
self.assertRaises(IOError, interface.import_rsa_publickey_from_file,
nonexistent_keypath)
# Invalid key file argument.
invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
with open(invalid_keyfile, 'wb') as file_object:
file_object.write(b'bad keyfile')
self.assertRaises(securesystemslib.exceptions.Error,
interface.import_rsa_publickey_from_file, invalid_keyfile)
def test_generate_and_write_ed25519_keypair(self):
# Test normal case.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
test_keypath = os.path.join(temporary_directory, 'ed25519_key')
test_keypath_unencrypted = os.path.join(temporary_directory,
'ed25519_key_unencrypted')
returned_path = interface.generate_and_write_ed25519_keypair(
test_keypath, password='pw')
self.assertTrue(os.path.exists(test_keypath))
self.assertTrue(os.path.exists(test_keypath + '.pub'))
self.assertEqual(returned_path, test_keypath)
# If an empty string is given for 'password', the private key file
# is written to disk unencrypted.
interface.generate_and_write_ed25519_keypair(test_keypath_unencrypted,
password='')
self.assertTrue(os.path.exists(test_keypath_unencrypted))
self.assertTrue(os.path.exists(test_keypath_unencrypted + '.pub'))
# Ensure the generated key files are importable.
imported_pubkey = \
interface.import_ed25519_publickey_from_file(test_keypath + '.pub')
self.assertTrue(securesystemslib.formats.ED25519KEY_SCHEMA\
.matches(imported_pubkey))
imported_privkey = \
interface.import_ed25519_privatekey_from_file(test_keypath, 'pw')
self.assertTrue(securesystemslib.formats.ED25519KEY_SCHEMA\
.matches(imported_privkey))
# Fail importing encrypted key passing password and prompt
with self.assertRaises(ValueError):
interface.import_ed25519_privatekey_from_file(test_keypath,
password='pw',
prompt=True)
# Fail importing encrypted key passing an empty string for passwd
with self.assertRaises(ValueError):
interface.import_ed25519_privatekey_from_file(test_keypath,
password='')
# Try to import the unencrypted key file, by not passing a password
imported_privkey = \
interface.import_ed25519_privatekey_from_file(test_keypath_unencrypted)
self.assertTrue(securesystemslib.formats.ED25519KEY_SCHEMA.\
matches(imported_privkey))
# Try to import the unencrypted key file, by entering an empty password
with mock.patch('securesystemslib.interface.get_password',
return_value=''):
imported_privkey = \
interface.import_ed25519_privatekey_from_file(test_keypath_unencrypted,
prompt=True)
self.assertTrue(
securesystemslib.formats.ED25519KEY_SCHEMA.matches(imported_privkey))
# Fail importing unencrypted key passing a password
with self.assertRaises(securesystemslib.exceptions.CryptoError):
interface.import_ed25519_privatekey_from_file(test_keypath_unencrypted,
'pw')
# Fail importing encrypted key passing no password
with self.assertRaises(securesystemslib.exceptions.CryptoError):
interface.import_ed25519_privatekey_from_file(test_keypath)
# Test for a default filepath. If 'filepath' is not given, the key's
# KEYID is used as the filename. The key is saved to the current working
# directory.
default_keypath = interface.generate_and_write_ed25519_keypair(password='pw')
self.assertTrue(os.path.exists(default_keypath))
self.assertTrue(os.path.exists(default_keypath + '.pub'))
written_key = interface.import_ed25519_publickey_from_file(default_keypath + '.pub')
self.assertEqual(written_key['keyid'], os.path.basename(default_keypath))
os.remove(default_keypath)
os.remove(default_keypath + '.pub')
# Test improperly formatted arguments.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_ed25519_keypair, 3, password='pw')
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.generate_and_write_rsa_keypair, test_keypath, password=3)
def test_import_ed25519_publickey_from_file(self):
# Test normal case.
# Generate ed25519 keys that can be imported.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
ed25519_keypath = os.path.join(temporary_directory, 'ed25519_key')
interface.generate_and_write_ed25519_keypair(ed25519_keypath, password='pw')
imported_ed25519_key = \
interface.import_ed25519_publickey_from_file(ed25519_keypath + '.pub')
self.assertTrue(securesystemslib.formats.ED25519KEY_SCHEMA.matches(imported_ed25519_key))
# Test improperly formatted argument.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_ed25519_publickey_from_file, 3)
# Test invalid argument.
# Non-existent key file.
nonexistent_keypath = os.path.join(temporary_directory,
'nonexistent_keypath')
self.assertRaises(IOError, interface.import_ed25519_publickey_from_file,
nonexistent_keypath)
# Invalid key file argument.
invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
with open(invalid_keyfile, 'wb') as file_object:
file_object.write(b'bad keyfile')
self.assertRaises(securesystemslib.exceptions.Error,
interface.import_ed25519_publickey_from_file, invalid_keyfile)
# Invalid public key imported (contains unexpected keytype.)
keytype = imported_ed25519_key['keytype']
keyval = imported_ed25519_key['keyval']
scheme = imported_ed25519_key['scheme']
ed25519key_metadata_format = \
securesystemslib.keys.format_keyval_to_metadata(keytype, scheme,
keyval, private=False)
ed25519key_metadata_format['keytype'] = 'invalid_keytype'
with open(ed25519_keypath + '.pub', 'wb') as file_object:
file_object.write(json.dumps(ed25519key_metadata_format).encode('utf-8'))
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_ed25519_publickey_from_file,
ed25519_keypath + '.pub')
def test_import_ed25519_privatekey_from_file(self):
# Test normal case.
# Generate ed25519 keys that can be imported.
scheme = 'ed25519'
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
ed25519_keypath = os.path.join(temporary_directory, 'ed25519_key')
interface.generate_and_write_ed25519_keypair(ed25519_keypath, password='pw')
imported_ed25519_key = \
interface.import_ed25519_privatekey_from_file(ed25519_keypath, 'pw')
self.assertTrue(securesystemslib.formats.ED25519KEY_SCHEMA.matches(imported_ed25519_key))
# Test improperly formatted argument.
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_ed25519_privatekey_from_file, 3, 'pw')
# Test invalid argument.
# Non-existent key file.
nonexistent_keypath = os.path.join(temporary_directory,
'nonexistent_keypath')
self.assertRaises(IOError, interface.import_ed25519_privatekey_from_file,
nonexistent_keypath, 'pw')
# Invalid key file argument.
invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
with open(invalid_keyfile, 'wb') as file_object:
file_object.write(b'bad keyfile')
self.assertRaises(securesystemslib.exceptions.Error,
interface.import_ed25519_privatekey_from_file, invalid_keyfile, 'pw')
# Invalid private key imported (contains unexpected keytype.)
imported_ed25519_key['keytype'] = 'invalid_keytype'
# Use 'rsa_keys.py' to bypass the key format validation performed
# by 'keys.py'.
salt, iterations, derived_key = \
securesystemslib.rsa_keys._generate_derived_key('pw')
# Store the derived key info in a dictionary, the object expected
# by the non-public _encrypt() routine.
derived_key_information = {'salt': salt, 'iterations': iterations,
'derived_key': derived_key}
# Convert the key object to json string format and encrypt it with the
# derived key.
encrypted_key = \
securesystemslib.rsa_keys._encrypt(json.dumps(imported_ed25519_key),
derived_key_information)
with open(ed25519_keypath, 'wb') as file_object:
file_object.write(encrypted_key.encode('utf-8'))
self.assertRaises(securesystemslib.exceptions.FormatError,
interface.import_ed25519_privatekey_from_file, ed25519_keypath, 'pw')
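The `derived_key_information` dict built above (salt, iteration count, derived key) follows the standard password-based key derivation pattern. The sketch below shows the same shape of result using the standard library's PBKDF2; it is an analogy for illustration, not the actual `securesystemslib.rsa_keys._generate_derived_key` implementation, and the function name and parameters here are assumptions:

```python
import hashlib
import os

def generate_derived_key(password, salt=None, iterations=100000):
    """Derive a 32-byte key from a password with PBKDF2-HMAC-SHA256."""
    if salt is None:
        # fresh random salt when none is supplied
        salt = os.urandom(16)
    derived_key = hashlib.pbkdf2_hmac(
        'sha256', password.encode('utf-8'), salt, iterations)
    return {'salt': salt, 'iterations': iterations, 'derived_key': derived_key}
```

The same password and salt always reproduce the same key, which is what lets the test above decrypt the file later by re-deriving the key from 'pw'.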
def test_generate_and_write_ecdsa_keypair(self):
# Test normal case.
temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
test_keypath = os.path.join(temporary_directory, 'ecdsa_key')
returned_path = interface.generate_and_write_ecdsa_keypair(test_keypath, password='pw')
        self.assertTrue(os.path.exists(test_keypath))
        self.assertTrue(os.path.exists(test_keypath + '.pub'))
        self.assertEqual(returned_path, test_keypath)

        # Ensure the generated key files are importable.
        imported_pubkey = \
            interface.import_ecdsa_publickey_from_file(test_keypath + '.pub')
        self.assertTrue(
            securesystemslib.formats.ECDSAKEY_SCHEMA.matches(imported_pubkey))

        imported_privkey = \
            interface.import_ecdsa_privatekey_from_file(test_keypath, 'pw')
        self.assertTrue(
            securesystemslib.formats.ECDSAKEY_SCHEMA.matches(imported_privkey))

        # Test for a default filepath.  If 'filepath' is not given, the key's
        # KEYID is used as the filename.  The key is saved to the current
        # working directory.
        default_keypath = interface.generate_and_write_ecdsa_keypair(password='pw')
        self.assertTrue(os.path.exists(default_keypath))
        self.assertTrue(os.path.exists(default_keypath + '.pub'))

        written_key = interface.import_ecdsa_publickey_from_file(
            default_keypath + '.pub')
        self.assertEqual(written_key['keyid'], os.path.basename(default_keypath))

        os.remove(default_keypath)
        os.remove(default_keypath + '.pub')

        # Test improperly formatted arguments.
        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.generate_and_write_ecdsa_keypair,
                          3, password='pw')
        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.generate_and_write_ecdsa_keypair,
                          test_keypath, password=3)

    def test_import_ecdsa_publickey_from_file(self):
        # Test normal case.
        # Generate ecdsa keys that can be imported.
        temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
        ecdsa_keypath = os.path.join(temporary_directory, 'ecdsa_key')
        interface.generate_and_write_ecdsa_keypair(ecdsa_keypath, password='pw')

        imported_ecdsa_key = \
            interface.import_ecdsa_publickey_from_file(ecdsa_keypath + '.pub')
        self.assertTrue(
            securesystemslib.formats.ECDSAKEY_SCHEMA.matches(imported_ecdsa_key))

        # Test improperly formatted argument.
        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.import_ecdsa_publickey_from_file, 3)

        # Test invalid argument.
        # Non-existent key file.
        nonexistent_keypath = os.path.join(temporary_directory,
                                           'nonexistent_keypath')
        self.assertRaises(IOError, interface.import_ecdsa_publickey_from_file,
                          nonexistent_keypath)

        # Invalid key file argument.
        invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
        with open(invalid_keyfile, 'wb') as file_object:
            file_object.write(b'bad keyfile')

        self.assertRaises(securesystemslib.exceptions.Error,
                          interface.import_ecdsa_publickey_from_file,
                          invalid_keyfile)

        # Invalid public key imported (contains an unexpected keytype).
        keytype = imported_ecdsa_key['keytype']
        keyval = imported_ecdsa_key['keyval']
        scheme = imported_ecdsa_key['scheme']

        ecdsakey_metadata_format = \
            securesystemslib.keys.format_keyval_to_metadata(
                keytype, scheme, keyval, private=False)
        ecdsakey_metadata_format['keytype'] = 'invalid_keytype'

        with open(ecdsa_keypath + '.pub', 'wb') as file_object:
            file_object.write(json.dumps(ecdsakey_metadata_format).encode('utf-8'))

        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.import_ecdsa_publickey_from_file,
                          ecdsa_keypath + '.pub')

    def test_import_ecdsa_privatekey_from_file(self):
        # Test normal case.
        # Generate ecdsa keys that can be imported.
        temporary_directory = tempfile.mkdtemp(dir=self.temporary_directory)
        ecdsa_keypath = os.path.join(temporary_directory, 'ecdsa_key')
        interface.generate_and_write_ecdsa_keypair(ecdsa_keypath, password='pw')

        imported_ecdsa_key = \
            interface.import_ecdsa_privatekey_from_file(ecdsa_keypath, 'pw')
        self.assertTrue(
            securesystemslib.formats.ECDSAKEY_SCHEMA.matches(imported_ecdsa_key))

        # Test improperly formatted argument.
        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.import_ecdsa_privatekey_from_file, 3, 'pw')

        # Test invalid argument.
        # Non-existent key file.
        nonexistent_keypath = os.path.join(temporary_directory,
                                           'nonexistent_keypath')
        self.assertRaises(IOError, interface.import_ecdsa_privatekey_from_file,
                          nonexistent_keypath, 'pw')

        # Invalid key file argument.
        invalid_keyfile = os.path.join(temporary_directory, 'invalid_keyfile')
        with open(invalid_keyfile, 'wb') as file_object:
            file_object.write(b'bad keyfile')

        self.assertRaises(securesystemslib.exceptions.Error,
                          interface.import_ecdsa_privatekey_from_file,
                          invalid_keyfile, 'pw')

        # Invalid private key imported (contains an unexpected keytype).
        imported_ecdsa_key['keytype'] = 'invalid_keytype'

        # Use 'rsa_keys.py' to bypass the key format validation performed
        # by 'keys.py'.
        salt, iterations, derived_key = \
            securesystemslib.rsa_keys._generate_derived_key('pw')

        # Store the derived key info in a dictionary, the object expected
        # by the non-public _encrypt() routine.
        derived_key_information = {'salt': salt, 'iterations': iterations,
                                   'derived_key': derived_key}

        # Convert the key object to a JSON string and encrypt it with the
        # derived key.
        encrypted_key = \
            securesystemslib.rsa_keys._encrypt(json.dumps(imported_ecdsa_key),
                                               derived_key_information)

        with open(ecdsa_keypath, 'wb') as file_object:
            file_object.write(encrypted_key.encode('utf-8'))

        self.assertRaises(securesystemslib.exceptions.FormatError,
                          interface.import_ecdsa_privatekey_from_file,
                          ecdsa_keypath, 'pw')


# Run the test cases.
if __name__ == '__main__':
    unittest.main()
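The private-key case above encrypts the key file with a key derived from the password via securesystemslib's non-public `_generate_derived_key()` and `_encrypt()` helpers. A standard-library sketch of the same derive-and-bundle pattern (the SHA-256 choice and the 100000-iteration count here are illustrative assumptions, not securesystemslib's actual parameters):

```python
import hashlib
import os

# Derive an encryption key from a password with PBKDF2-HMAC-SHA256.
salt = os.urandom(16)
iterations = 100000  # illustrative count, not securesystemslib's setting
derived_key = hashlib.pbkdf2_hmac('sha256', b'pw', salt, iterations)

# Bundle the parameters needed to re-derive the key later, mirroring the
# 'derived_key_information' dictionary used in the test above.
derived_key_information = {'salt': salt, 'iterations': iterations,
                           'derived_key': derived_key}

# Re-deriving with the same salt and iteration count yields the same key.
rederived = hashlib.pbkdf2_hmac('sha256', b'pw',
                                derived_key_information['salt'],
                                derived_key_information['iterations'])
assert rederived == derived_key_information['derived_key']
```

Storing the salt and iteration count alongside the ciphertext is what lets the importer re-derive the decryption key from the password alone.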

# File: oxe-api/test/resource/setting/test_upload_favicon.py
# Repo: CybersecurityLuxembourg/openxeco (BSD-2-Clause)
import base64
import os
from unittest.mock import patch

from test.BaseCase import BaseCase


class TestUploadFavicon(BaseCase):

    @BaseCase.login
    @BaseCase.grant_access("/setting/upload_favicon")
    @patch('resource.setting.upload_favicon.IMAGE_FOLDER',
           os.path.join(os.path.dirname(os.path.realpath(__file__)),
                        "test_upload_favicon"))
    def test_ok(self, token):
        path = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                            "test_upload_favicon", "original_favicon.ico")
        target_path = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                   "test_upload_favicon", "favicon.ico")

        if os.path.exists(target_path):
            os.remove(target_path)

        with open(path, 'rb') as f:
            data = base64.b64encode(f.read()).decode("utf-8")
        payload = {"image": data}

        response = self.application.post('/setting/upload_favicon',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        self.assertEqual(200, response.status_code)
        self.assertTrue(os.path.exists(target_path))

        os.remove(target_path)

    @BaseCase.login
    @BaseCase.grant_access("/setting/upload_favicon")
    @patch('resource.setting.upload_favicon.IMAGE_FOLDER',
           os.path.join(os.path.dirname(os.path.realpath(__file__)),
                        "test_upload_favicon"))
    def test_ko_wrong_format(self, token):
        path = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                            "test_upload_favicon", "original_image.png")
        target_path = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                   "test_upload_favicon", "favicon.ico")

        if os.path.exists(target_path):
            os.remove(target_path)

        with open(path, 'rb') as f:
            data = base64.b64encode(f.read()).decode("utf-8")
        payload = {"image": data}

        response = self.application.post('/setting/upload_favicon',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        self.assertEqual("422 Wrong image format. Must be an ICO file",
                         response.status)
        self.assertFalse(os.path.exists(target_path))

    @BaseCase.login
    @BaseCase.grant_access("/setting/upload_favicon")
    @patch('resource.setting.upload_favicon.IMAGE_FOLDER',
           os.path.join(os.path.dirname(os.path.realpath(__file__)),
                        "test_upload_favicon"))
    def test_ko_file_exception(self, token):
        target_path = os.path.join(os.path.dirname(os.path.realpath(__file__)),
                                   "test_upload_favicon", "favicon.ico")

        payload = {"image": "FAKE FILE"}

        response = self.application.post('/setting/upload_favicon',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        self.assertEqual("422 Impossible to read the image", response.status)
        self.assertFalse(os.path.exists(target_path))
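The tests above ship the file to the API as a base64 string inside a JSON payload. A minimal sketch of that encode/decode round trip (the four placeholder bytes stand in for real ICO content):

```python
import base64

# Encode raw file bytes as an ASCII-safe string for a JSON payload,
# as the favicon tests do with the bytes read from the fixture file.
raw = b'\x00\x00\x01\x00'  # placeholder bytes, not a real ICO file
data = base64.b64encode(raw).decode('utf-8')
payload = {"image": data}

# The server side recovers the original bytes by decoding the field.
decoded = base64.b64decode(payload["image"])
assert decoded == raw
```

Base64 is used here because JSON has no binary type; the encoding inflates the payload by roughly a third but keeps it plain text.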

# File: osext/argparse_actions.py
# Repo: Tatsh/osext (MIT)
# coding: utf-8
from os.path import isdir, realpath
import argparse
import os


class ReadableDirectoryAction(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        prospective_dir = values

        if not isdir(prospective_dir):
            raise argparse.ArgumentTypeError('%s is not a valid directory' % (
                prospective_dir,
            ))
        if os.access(prospective_dir, os.R_OK):
            setattr(namespace, self.dest, realpath(prospective_dir))
            return

        raise argparse.ArgumentTypeError('%s is not a readable directory' % (
            prospective_dir,
        ))


class WritableDirectoryAction(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        prospective_dir = values

        if not isdir(prospective_dir):
            raise argparse.ArgumentTypeError('%s is not a valid directory' % (
                prospective_dir,
            ))
        if os.access(prospective_dir, os.W_OK):
            setattr(namespace, self.dest, realpath(prospective_dir))
            return

        raise argparse.ArgumentTypeError('%s is not a writable directory' % (
            prospective_dir,
        ))
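A usage sketch for these actions; the parser wiring and the `--input` flag are illustrative, not part of the module, and the action body is repeated inline so the example is self-contained:

```python
import argparse
import os
import tempfile
from os.path import isdir, realpath


class ReadableDirectoryAction(argparse.Action):
    # Same validation as the module's action: the value must be an
    # existing, readable directory, and the resolved (realpath) form
    # is what gets stored on the namespace.
    def __call__(self, parser, namespace, values, option_string=None):
        if not isdir(values):
            raise argparse.ArgumentTypeError(
                '%s is not a valid directory' % (values,))
        if os.access(values, os.R_OK):
            setattr(namespace, self.dest, realpath(values))
            return
        raise argparse.ArgumentTypeError(
            '%s is not a readable directory' % (values,))


parser = argparse.ArgumentParser()
parser.add_argument('--input', action=ReadableDirectoryAction)

with tempfile.TemporaryDirectory() as d:
    args = parser.parse_args(['--input', d])
    # The action resolves symlinks, so compare against realpath().
    assert args.input == os.path.realpath(d)
```

Validating inside the action means a bad path fails at parse time, before any of the program's real work starts.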

# File: adjutant/api/v1/tests/test_api_taskview.py
# Repo: CCI-MOC/adjutant (Apache-2.0)
# Copyright (C) 2015 Catalyst IT Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import mock

from django.test.utils import override_settings
from django.conf import settings
from django.core import mail

from rest_framework import status

from adjutant.api.models import Task, Token, Notification
from adjutant.api.v1.tasks import CreateProject
from adjutant.common.tests.fake_clients import (
    FakeManager, setup_identity_cache)
from adjutant.common.tests import fake_clients
from adjutant.common.tests.utils import (AdjutantAPITestCase,
                                         modify_dict_settings)
@mock.patch('adjutant.common.user_store.IdentityManager',
            FakeManager)
class TaskViewTests(AdjutantAPITestCase):
    """
    Tests to ensure the approval/token workflow does what is
    expected with the given TaskViews. These tests don't check
    final results for actions, simply that the tasks, actions,
    and tokens are created/updated.
    """

    def test_bad_data(self):
        """
        Simple test to confirm the serializers are correctly processing
        wrong data or missing fields.
        """
        project = fake_clients.FakeProject(name="test_project")

        setup_identity_cache(projects=[project])

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': project.id,
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'wrong_email_field': "test@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(
            response.json(),
            {'errors': {'email': ['This field is required.']}})

        data = {'email': "not_a_valid_email", 'roles': ["not_a_valid_role"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(
            response.json(),
            {'errors': {
                'email': ['Enter a valid email address.'],
                'roles': ['"not_a_valid_role" is not a valid choice.']}})

    def test_new_user(self):
        """
        Ensure the new user workflow goes as expected:
        create task, create token, submit token.
        """
        project = fake_clients.FakeProject(name="test_project")

        setup_identity_cache(projects=[project])

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': project.id,
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json(), {'notes': ['created token']})

        self.assertEqual(len(mail.outbox), 1)
        self.assertEqual(mail.outbox[0].subject, 'invite_user')

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'password': 'testpassword'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        self.assertEqual(len(mail.outbox), 2)

        self.assertEqual(
            fake_clients.identity_cache['new_users'][0].name,
            'test@example.com')
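The create-task, issue-token, submit-token sequence exercised above can be sketched with a hypothetical in-memory token store (the `issue_token`/`submit_token` names are illustrative, not Adjutant's API):

```python
import secrets

# Hypothetical store: an approved task issues a one-time token, and
# submitting the token with the final data completes the action.
tokens = {}

def issue_token(task_data):
    token = secrets.token_hex(16)
    tokens[token] = task_data
    return token

def submit_token(token, password):
    task = tokens.pop(token)  # one-time use: the token is consumed
    task['password'] = password
    return task

t = issue_token({'email': 'test@example.com', 'roles': ['_member_']})
result = submit_token(t, 'testpassword')
assert result['password'] == 'testpassword'
assert t not in tokens
```

Consuming the token on submission is what the duplicate and reset tests below rely on: a token can complete its task exactly once.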
    def test_new_user_no_project(self):
        """
        Can't create a user for a non-existent project.
        """
        setup_identity_cache()

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': 'test_project_id'}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(response.json(), {'errors': ['actions invalid']})

    def test_new_user_not_my_project(self):
        """
        Can't create a user for a project the requesting user isn't
        project admin or mod on.
        """
        setup_identity_cache()

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': 'test_project_id'}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_new_user_not_authenticated(self):
        """
        Can't create a user if unauthenticated.
        """
        setup_identity_cache()

        url = "/v1/actions/InviteUser"
        headers = {}
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': 'test_project_id'}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
        self.assertEqual(
            response.json(),
            {'errors': ["Credentials incorrect or none given."]}
        )
    def test_add_user_existing(self):
        """
        Adding an existing user to a project.
        """
        project = fake_clients.FakeProject(name="parent_project")

        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(projects=[project], users=[user])

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': project.id,
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json(), {'notes': ['created token']})

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'confirm': True}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    def test_add_user_existing_with_role(self):
        """
        Adding an existing user to a project when they already have the role.
        Should 'complete' anyway but do nothing.
        """
        project = fake_clients.FakeProject(name="test_project")

        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        assignment = fake_clients.FakeRoleAssignment(
            scope={'project': {'id': project.id}},
            role_name="_member_",
            user={'id': user.id}
        )

        setup_identity_cache(
            projects=[project], users=[user], role_assignments=[assignment])

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': project.id,
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json(),
            {'notes': ['Task completed successfully.']})
    def test_new_project(self):
        """
        Ensure the new project workflow goes as expected.
        """
        setup_identity_cache()

        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "admin,_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        new_task = Task.objects.all()[0]
        url = "/v1/tasks/" + new_task.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json(),
            {'notes': ['created token']}
        )

        new_project = fake_clients.identity_cache['new_projects'][0]
        self.assertEqual(new_project.name, 'test_project')

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'password': 'testpassword'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    def test_new_project_invalid_on_submit(self):
        """
        Ensures that when a project becomes invalid at the submit stage
        a 400 is received and no final emails are sent.
        """
        setup_identity_cache()

        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "admin,_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        new_task = Task.objects.all()[0]
        url = "/v1/tasks/" + new_task.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.data,
            {'notes': ['created token']}
        )
        self.assertEqual(len(mail.outbox), 3)

        fake_clients.identity_cache['projects'] = {}

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'password': 'testpassword'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(len(mail.outbox), 3)

    def test_new_project_existing(self):
        """
        Test to ensure validation marks actions as invalid
        if the project is already present.
        """
        project = fake_clients.FakeProject(name="test_project")

        setup_identity_cache(projects=[project])

        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "admin,_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        new_task = Task.objects.all()[0]
        url = "/v1/tasks/" + new_task.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(
            response.json(),
            {'errors': ['Cannot approve an invalid task. '
                        'Update data and rerun pre_approve.']})
    def test_new_project_existing_user(self):
        """
        Project created if not present; existing user attached.
        No token should be needed.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        # unauthenticated sign up as existing user
        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': user.email}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        # approve the sign-up as admin
        headers = {
            'project_name': "admin_project",
            'project_id': "admin_project_id",
            'roles': "admin,_member_",
            'username': "admin",
            'user_id': "admin_id",
            'authenticated': True
        }
        new_task = Task.objects.all()[0]
        url = "/v1/tasks/" + new_task.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json(),
            {'notes': ['Task completed successfully.']}
        )

    def test_new_project_existing_project_new_user(self):
        """
        Project already exists but a new user is attempting to create it.
        """
        setup_identity_cache()

        # create signup#1 - project1 with user 1
        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        # create signup#2 - project1 with user 2
        data = {'project_name': "test_project", 'email': "test2@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        headers = {
            'project_name': "admin_project",
            'project_id': "admin_project_id",
            'roles': "admin,_member_",
            'username': "admin",
            'user_id': "admin_id",
            'authenticated': True
        }
        # approve signup #1
        new_task1 = Task.objects.all()[0]
        url = "/v1/tasks/" + new_task1.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json(),
            {'notes': ['created token']}
        )

        # attempt to approve signup #2
        new_task2 = Task.objects.all()[1]
        url = "/v1/tasks/" + new_task2.uuid
        response = self.client.post(url, {'approved': True}, format='json',
                                    headers=headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(
            response.json(),
            {'errors': ['actions invalid']}
        )
    def test_reset_user(self):
        """
        Ensure the reset user workflow goes as expected:
        create task + create token, submit token.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        url = "/v1/actions/ResetPassword"
        data = {'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json()['notes'],
            ['If user with email exists, reset token will be issued.'])

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'password': 'new_test_password'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(user.password, 'new_test_password')

    def test_reset_user_duplicate(self):
        """
        Request a password reset twice in a row.
        The first token should become invalid, with the second replacing it.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        # Submit password reset
        url = "/v1/actions/ResetPassword"
        data = {'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json()['notes'],
            ['If user with email exists, reset token will be issued.'])

        # Submit password reset again
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json()['notes'],
            ['If user with email exists, reset token will be issued.'])

        # Verify the first token doesn't work
        first_token = Token.objects.all()[0]
        url = "/v1/tokens/" + first_token.token
        data = {'password': 'new_test_password1'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(user.password, '123')

        # Now reset with the second token
        second_token = Token.objects.all()[1]
        url = "/v1/tokens/" + second_token.token
        data = {'password': 'new_test_password2'}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(user.password, 'new_test_password2')

    def test_reset_user_no_existing(self):
        """
        Actions should be successful, so usernames are not exposed.
        """
        setup_identity_cache()

        url = "/v1/actions/ResetPassword"
        data = {'email': "test@exampleinvalid.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json()['notes'],
            ['If user with email exists, reset token will be issued.'])
    def test_notification_createproject(self):
        """
        CreateProject should create a notification.
        We should be able to grab it.
        """
        setup_identity_cache()

        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        new_task = Task.objects.all()[0]

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "admin,_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        url = "/v1/notifications"
        response = self.client.get(url, headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(
            response.json()['notifications'][0]['task'],
            new_task.uuid)

    def test_duplicate_tasks_new_project(self):
        """
        Ensure we can't submit duplicate tasks.
        """
        setup_identity_cache()

        url = "/v1/actions/CreateProject"
        data = {'project_name': "test_project", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_409_CONFLICT)
        data = {'project_name': "test_project_2", 'email': "test@example.com"}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    def test_duplicate_tasks_new_user(self):
        """
        Ensure we can't submit duplicate tasks.
        """
        project = fake_clients.FakeProject(name="test_project")

        setup_identity_cache(projects=[project])

        url = "/v1/actions/InviteUser"
        headers = {
            'project_name': "test_project",
            'project_id': project.id,
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        data = {'email': "test@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json(), {'notes': ['created token']})
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_409_CONFLICT)

        data = {'email': "test2@example.com", 'roles': ["_member_"],
                'project_id': project.id}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json(), {'notes': ['created token']})
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_409_CONFLICT)
    def test_return_task_id_if_admin(self):
        """
        Confirm that the task id is returned when admin.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "admin,_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        url = "/v1/actions/ResetPassword"
        data = {'email': "test@example.com"}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        # make sure the task is actually valid
        new_task = Task.objects.all()[0]
        self.assertTrue(all([a.valid for a in new_task.actions]))

        self.assertEqual(
            response.json()['task'],
            new_task.uuid)

    def test_return_task_id_if_admin_fail(self):
        """
        Confirm that the task id is not returned unless admin.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "_member_",
            'username': "test@example.com",
            'user_id': "test_user_id",
            'authenticated': True
        }
        url = "/v1/actions/ResetPassword"
        data = {'email': "test@example.com"}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        # make sure the task is actually valid
        new_task = Task.objects.all()[0]
        self.assertTrue(all([a.valid for a in new_task.actions]))

        self.assertFalse(response.json().get('task'))

    def test_update_email_task(self):
        """
        Ensure the update email workflow goes as expected:
        create task, create token, submit token.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        url = "/v1/actions/UpdateEmail"
        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': user.id,
            'authenticated': True
        }
        data = {'new_email': "new_test@example.com"}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.json(), {'notes': ['created token']})

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'confirm': True}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(user.name, 'new_test@example.com')
    @modify_dict_settings(TASK_SETTINGS=[
        {'key_list': ['update_email', 'additional_actions'],
         'operation': 'append',
         'value': ['SendAdditionalEmailAction']},
        {'key_list': ['update_email', 'action_settings',
                      'SendAdditionalEmailAction', 'initial'],
         'operation': 'update',
         'value': {
             'subject': 'email_update_additional',
             'template': 'email_update_started.txt',
             'email_roles': [],
             'email_current_user': True,
         }}
    ])
    def test_update_email_task_send_email_to_current_user(self):
        """
        Tests the email update workflow, and ensures that when set up
        to send a confirmation email to the old email address it does.
        """
        user = fake_clients.FakeUser(
            name="test@example.com", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        url = "/v1/actions/UpdateEmail"
        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "project_admin,_member_,project_mod",
            'username': "test@example.com",
            'user_id': user.id,
            'authenticated': True
        }
        data = {'new_email': "new_test@example.com"}
        response = self.client.post(url, data, format='json', headers=headers)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, {'notes': ['created token']})

        self.assertEqual(len(mail.outbox), 2)
        self.assertEqual(mail.outbox[0].to, ['test@example.com'])
        self.assertEqual(mail.outbox[0].subject, 'email_update_additional')

        self.assertEqual(mail.outbox[1].to, ['new_test@example.com'])
        self.assertEqual(mail.outbox[1].subject, 'email_update_token')

        new_token = Token.objects.all()[0]
        url = "/v1/tokens/" + new_token.token
        data = {'confirm': True}
        response = self.client.post(url, data, format='json')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(user.name, 'new_test@example.com')

        self.assertEqual(len(mail.outbox), 3)

    @modify_dict_settings(TASK_SETTINGS=[
        {'key_list': ['update_email', 'additional_actions'],
         'operation': 'append',
         'value': ['SendAdditionalEmailAction']},
        {'key_list': ['update_email', 'action_settings',
                      'SendAdditionalEmailAction', 'initial'],
         'operation': 'update',
         'value': {
             'subject': 'email_update_additional',
             'template': 'email_update_started.txt',
             'email_roles': [],
             'email_current_user': True}}
    ])
    @override_settings(USERNAME_IS_EMAIL=False)
    def test_update_email_task_send_email_current_name_not_email(self):
        """
        Tests the email update workflow when USERNAME_IS_EMAIL=False, and
        ensures that when set up to send a confirmation email to the old
        email address it does.
        """
        user = fake_clients.FakeUser(
            name="nkdfslnkls", password="123", email="test@example.com")

        setup_identity_cache(users=[user])

        url = "/v1/actions/UpdateEmail"
        headers = {
            'project_name': "test_project",
            'project_id': "test_project_id",
            'roles': "project_admin,_member_,project_mod",
            'username': "nkdfslnkls",
'user_id': user.id,
'authenticated': True,
'email': 'test@example.com',
}
data = {'new_email': "new_test@example.com"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data, {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 2)
self.assertEqual(mail.outbox[0].to, ['test@example.com'])
self.assertEqual(mail.outbox[0].subject, 'email_update_additional')
self.assertEqual(mail.outbox[1].to, ['new_test@example.com'])
self.assertEqual(mail.outbox[1].subject, 'email_update_token')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(mail.outbox), 3)
def test_update_email_task_invalid_email(self):
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
setup_identity_cache(users=[user])
url = "/v1/actions/UpdateEmail"
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': user.id,
'authenticated': True
}
data = {'new_email': "new_test@examplecom"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(
response.json(),
{'errors': {'new_email': [u'Enter a valid email address.']}})
@override_settings(USERNAME_IS_EMAIL=True)
def test_update_email_pre_existing_user_with_email(self):
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
user2 = fake_clients.FakeUser(
name="new_test@example.com", password="123",
email="new_test@example.com")
setup_identity_cache(users=[user, user2])
url = "/v1/actions/UpdateEmail"
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': "test_user_id",
'authenticated': True,
'project_domain_id': 'default',
}
data = {'new_email': "new_test@example.com"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.json(), {'errors': ['actions invalid']})
self.assertEqual(len(Token.objects.all()), 0)
self.assertEqual(len(mail.outbox), 0)
@override_settings(USERNAME_IS_EMAIL=False)
def test_update_email_user_with_email_username_not_email(self):
user = fake_clients.FakeUser(
name="test", password="123", email="test@example.com")
user2 = fake_clients.FakeUser(
name="new_test", password="123",
email="new_test@example.com")
setup_identity_cache(users=[user, user2])
url = "/v1/actions/UpdateEmail"
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': user.id,
'authenticated': True
}
data = {'new_email': "new_test@example.com"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 1)
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(user.email, 'new_test@example.com')
def test_update_email_task_not_authenticated(self):
"""
Ensure that an unauthenticated user cant access the endpoint.
"""
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
setup_identity_cache(users=[user])
url = "/v1/actions/UpdateEmail"
headers = {
}
data = {'new_email': "new_test@examplecom"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
@override_settings(USERNAME_IS_EMAIL=False)
def test_update_email_task_username_not_email(self):
user = fake_clients.FakeUser(
name="test_user", password="123", email="test@example.com")
setup_identity_cache(users=[user])
url = "/v1/actions/UpdateEmail"
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "project_admin,_member_,project_mod",
'username': "test_user",
'user_id': user.id,
'authenticated': True
}
data = {'new_email': "new_test@example.com"}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(user.name, "test_user")
self.assertEqual(user.email, 'new_test@example.com')
# Tests for USERNAME_IS_EMAIL=False
@override_settings(USERNAME_IS_EMAIL=False)
def test_invite_user_email_not_username(self):
"""
Invites a user where the email is different to the username.
"""
project = fake_clients.FakeProject(name="test_project")
setup_identity_cache(projects=[project])
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "user",
'user_id': "test_user_id",
'authenticated': True
}
data = {'username': 'new_user', 'email': "new@example.com",
'roles': ["_member_"], 'project_id': project.id}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 1)
self.assertEqual(mail.outbox[0].subject, 'invite_user')
self.assertEqual(mail.outbox[0].to[0], 'new@example.com')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'password': 'testpassword'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(mail.outbox), 2)
self.assertEqual(
fake_clients.identity_cache['new_users'][0].name,
'new_user')
@override_settings(USERNAME_IS_EMAIL=False)
def test_reset_user_username_not_email(self):
"""
Ensure the reset user workflow goes as expected.
Create task + create token, submit token.
"""
user = fake_clients.FakeUser(
name="test_user", password="123", email="test@example.com")
setup_identity_cache(users=[user])
url = "/v1/actions/ResetPassword"
# NOTE(amelia): Requiring both username and email here may be a slight
# issue for various UIs, as typically a forgotten password screen only
# asks for the email address; however there isn't a very good way to
# address this, as keystone doesn't store emails in their own field.
# Currently this is an issue for the forked adjutant horizon.
data = {'email': "test@example.com", 'username': 'test_user'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
response.json()['notes'],
['If user with email exists, reset token will be issued.'])
self.assertEqual(len(mail.outbox), 1)
self.assertEqual(mail.outbox[0].subject,
'Password Reset for OpenStack')
self.assertEqual(mail.outbox[0].to[0], 'test@example.com')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'password': 'new_test_password'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(user.password, 'new_test_password')
@override_settings(USERNAME_IS_EMAIL=False)
def test_new_project_username_not_email(self):
setup_identity_cache()
url = "/v1/actions/CreateProject"
data = {'project_name': "test_project", 'email': "test@example.com",
'username': 'test'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = {'email': "new_test@example.com", 'username': "new",
'project_name': 'new_project'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['task created']})
new_task = Task.objects.all()[0]
url = "/v1/tasks/" + new_task.uuid
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "admin",
'username': "test",
'user_id': "test_user_id",
'email': "test@example.com",
'authenticated': True
}
response = self.client.post(url, {'approved': True}, format='json',
headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True, 'password': '1234'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
@modify_dict_settings(
TASK_SETTINGS=[
{'key_list': ['invite_user', 'additional_actions'],
'operation': 'append',
'value': ['SendAdditionalEmailAction']},
{'key_list': ['invite_user', 'action_settings',
'SendAdditionalEmailAction', 'initial'],
'operation': 'update',
'value': {
'subject': 'email_update_additional',
'template': 'email_update_started.txt',
'email_roles': ['project_admin'],
'email_current_user': False,
}
}
])
def test_additional_emails_roles(self):
"""
Tests the sending of additional emails to a set of roles in a project
"""
# NOTE(amelia): sending this email here is probably not the intended
# case. It would be more useful in cases such as a quota update or a
# child project being created, of which all the project admins should
# be notified.
project = fake_clients.FakeProject(name="test_project")
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
user2 = fake_clients.FakeUser(
name="test2@example.com", password="123",
email="test2@example.com")
user3 = fake_clients.FakeUser(
name="test3@example.com", password="123",
email="test2@example.com")
assignments = [
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="_member_",
user={'id': user.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="project_admin",
user={'id': user.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="_member_",
user={'id': user2.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="project_admin",
user={'id': user2.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="_member_",
user={'id': user3.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="project_mod",
user={'id': user3.id}
),
]
setup_identity_cache(
projects=[project], users=[user, user2, user3],
role_assignments=assignments)
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': "test_user_id",
'authenticated': True
}
data = {'email': "new_test@example.com",
'roles': ['_member_'], 'project_id': project.id}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 2)
self.assertEqual(len(mail.outbox[0].to), 2)
self.assertEqual(set(mail.outbox[0].to),
set([user.email, user2.email]))
self.assertEqual(mail.outbox[0].subject, 'email_update_additional')
# Test that the token email gets sent to the other addresses
self.assertEqual(mail.outbox[1].to[0], 'new_test@example.com')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True, 'password': '1234'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
@modify_dict_settings(
TASK_SETTINGS=[
{'key_list': ['invite_user', 'additional_actions'],
'operation': 'append',
'value': ['SendAdditionalEmailAction']},
{'key_list': ['invite_user', 'action_settings',
'SendAdditionalEmailAction', 'initial'],
'operation': 'update',
'value': {
'subject': 'email_update_additional',
'template': 'email_update_started.txt',
'email_roles': ['project_admin'],
'email_current_user': False,
}
}
])
def test_additional_emails_role_no_email(self):
"""
Tests that setting email roles to something that has no people to
send to that the update action doesn't fall over
"""
project = fake_clients.FakeProject(name="test_project")
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
assignment = fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="_member_",
user={'id': user.id}
)
setup_identity_cache(
projects=[project], users=[user], role_assignments=[assignment])
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': "test_user_id",
'authenticated': True
}
data = {'email': "new_test@example.com",
'roles': ['_member_']}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data, {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 1)
# Test that the token email gets sent to the other addresses
self.assertEqual(mail.outbox[0].to[0], 'new_test@example.com')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'confirm': True, 'password': '1234'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
@modify_dict_settings(
TASK_SETTINGS=[
{'key_list': ['invite_user', 'additional_actions'],
'operation': 'override',
'value': ['SendAdditionalEmailAction']},
{'key_list': ['invite_user', 'action_settings',
'SendAdditionalEmailAction', 'initial'],
'operation': 'update',
'value': {
'subject': 'invite_user_additional',
'template': 'email_update_started.txt',
'email_additional_addresses': ['admin@example.com'],
'email_current_user': False,
}
}
])
def test_email_additional_addresses(self):
"""
Tests the sending of additional emails an admin email set in
the conf
"""
project = fake_clients.FakeProject(name="test_project")
user = fake_clients.FakeUser(
name="test@example.com", password="123", email="test@example.com")
assignments = [
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="_member_",
user={'id': user.id}
),
fake_clients.FakeRoleAssignment(
scope={'project': {'id': project.id}},
role_name="project_admin",
user={'id': user.id}
),
]
setup_identity_cache(
projects=[project], users=[user], role_assignments=assignments)
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': "test_user_id",
'authenticated': True
}
data = {'email': "new_test@example.com", 'roles': ['_member_']}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
self.assertEqual(len(mail.outbox), 2)
self.assertEqual(set(mail.outbox[0].to),
set(['admin@example.com']))
self.assertEqual(mail.outbox[0].subject, 'invite_user_additional')
# Test that the token email gets sent to the other addresses
self.assertEqual(mail.outbox[1].to[0], 'new_test@example.com')
new_token = Token.objects.all()[0]
url = "/v1/tokens/" + new_token.token
data = {'password': 'testpassword'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
@modify_dict_settings(
TASK_SETTINGS=[
{'key_list': ['invite_user', 'additional_actions'],
'operation': 'override',
'value': ['SendAdditionalEmailAction']},
{'key_list': ['invite_user', 'action_settings',
'SendAdditionalEmailAction', 'initial'],
'operation': 'update',
'value': {
'subject': 'invite_user_additional',
'template': 'email_update_started.txt',
'email_additional_addresses': ['admin@example.com'],
'email_current_user': False,
}
}
])
def test_email_additional_action_invalid(self):
"""
The additional email actions should not send an email if the
action is invalid.
"""
setup_identity_cache()
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': "test_project_id",
'roles': "project_admin,_member_,project_mod",
'username': "test@example.com",
'user_id': "test_user_id",
'authenticated': True
}
data = {'email': "test@example.com", 'roles': ["_member_"],
'project_id': 'test_project_id'}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.json(), {'errors': ['actions invalid']})
self.assertEqual(len(mail.outbox), 0)
@mock.patch('adjutant.common.tests.fake_clients.FakeManager.find_project')
def test_all_actions_setup(self, mocked_find):
"""
Ensures that all actions have been setup before pre_approve is
run on any actions, even if we have a pre_approve failure.
Deals with: bug/1745053
"""
setup_identity_cache()
mocked_find.side_effect = KeyError()
url = "/v1/actions/CreateProject"
data = {'project_name': "test_project", 'email': "test@example.com"}
response = self.client.post(url, data, format='json')
self.assertEqual(
response.status_code, status.HTTP_500_INTERNAL_SERVER_ERROR)
new_task = Task.objects.all()[0]
class_conf = settings.TASK_SETTINGS.get(
CreateProject.task_type, settings.DEFAULT_TASK_SETTINGS)
expected_action_names = (
class_conf.get('default_actions', [])
or CreateProject.default_actions[:])
expected_action_names += class_conf.get('additional_actions', [])
actions = new_task.actions
observed_action_names = [a.action_name for a in actions]
self.assertEqual(observed_action_names, expected_action_names)
@mock.patch('adjutant.common.tests.fake_clients.FakeManager.find_project')
def test_task_error_handler(self, mocked_find):
"""
Ensure the _handle_task_error function works as expected.
"""
setup_identity_cache()
mocked_find.side_effect = KeyError("Forced key error.")
url = "/v1/actions/CreateProject"
data = {'project_name': "test_project", 'email': "test@example.com"}
response = self.client.post(url, data, format='json')
self.assertEqual(
response.status_code, status.HTTP_500_INTERNAL_SERVER_ERROR)
self.assertEqual(
response.json(),
{'errors': ["Error: Something went wrong on the server. "
"It will be looked into shortly."]})
new_task = Task.objects.all()[0]
new_notification = Notification.objects.all()[0]
self.assertTrue(new_notification.error)
self.assertEqual(
new_notification.notes,
{'errors': [
"Error: KeyError('Forced key error.') while setting up "
"task. See task itself for details."]})
self.assertEqual(new_notification.task, new_task)
@override_settings(KEYSTONE={'can_edit_users': False})
def test_user_invite_cant_edit_users(self):
"""
When can_edit_users is false, and a new user is invited,
the task should be marked as invalid if the user doesn't
already exist.
"""
project = fake_clients.FakeProject(name="test_project")
setup_identity_cache(projects=[project])
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "user",
'user_id': "test_user_id",
'authenticated': True
}
data = {'username': 'new_user', 'email': "new@example.com",
'roles': ["_member_"], 'project_id': project.id}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.json(), {'errors': ['actions invalid']})
@override_settings(KEYSTONE={'can_edit_users': False})
def test_user_invite_cant_edit_users_existing_user(self):
"""
When can_edit_users is false, and a new user is invited,
the task should be marked as valid if the user exists.
"""
project = fake_clients.FakeProject(name="test_project")
user = fake_clients.FakeUser(name="test@example.com")
setup_identity_cache(projects=[project], users=[user])
url = "/v1/actions/InviteUser"
headers = {
'project_name': "test_project",
'project_id': project.id,
'roles': "project_admin,_member_,project_mod",
'username': "user",
'user_id': "test_user_id",
'authenticated': True
}
data = {'username': 'new_user', 'email': "test@example.com",
'roles': ["_member_"], 'project_id': project.id}
response = self.client.post(url, data, format='json', headers=headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['created token']})
@override_settings(KEYSTONE={'can_edit_users': False})
def test_project_create_cant_edit_users(self):
"""
When can_edit_users is false, and a new signup comes in,
the task should be marked as invalid if it needs to
create a new user.
Will return OK (as task doesn't auto_approve), but task will
actually be invalid.
"""
setup_identity_cache()
url = "/v1/actions/CreateProject"
data = {'project_name': "test_project", 'email': "test@example.com"}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['task created']})
task = Task.objects.all()[0]
action_models = task.actions
actions = [act.get_action() for act in action_models]
self.assertFalse(all([act.valid for act in actions]))
@override_settings(KEYSTONE={'can_edit_users': False})
def test_project_create_cant_edit_users_existing_user(self):
"""
When can_edit_users is false, and a new signup comes in,
the task should be marked as valid if the user already
exists.
Will return OK (as task doesn't auto_approve), but task will
actually be valid.
"""
user = fake_clients.FakeUser(name="test@example.com")
setup_identity_cache(users=[user])
url = "/v1/actions/CreateProject"
data = {'project_name': "test_project", 'email': "test@example.com"}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json(), {'notes': ['task created']})
task = Task.objects.all()[0]
action_models = task.actions
actions = [act.get_action() for act in action_models]
self.assertTrue(all([act.valid for act in actions]))


# tests/test_cancion.py (BlazAlvarado/MisCancionesApp, MIT license)
from faker import Faker
import random
import unittest

from src.logica.coleccion import Coleccion
from src.modelo.album import Album, Medio
from src.modelo.cancion import Cancion, AlbumCancion
from src.modelo.interprete import Interprete
from src.modelo.declarative_base import Session


class CancionTestCase(unittest.TestCase):

    def setUp(self):
        '''Create a collection to run the tests against.'''
        self.coleccion = Coleccion()
        # Open the session
        self.session = Session()
        # Create a Faker instance
        self.data_factory = Faker()
        # Seed Faker so that it generates the same data on every run
        Faker.seed(1000)
        # Generate 10 data tuples and create the songs
        self.data = []
        self.canciones = []
        for i in range(0, 10):
            self.data.append((
                self.data_factory.unique.name(),
                self.data_factory.random_int(0, 4),
                self.data_factory.random_int(0, 60),
                self.data_factory.unique.name()))
            self.canciones.append(
                Cancion(
                    titulo=self.data[-1][0],
                    minutos=self.data[-1][1],
                    segundos=self.data[-1][2],
                    compositor=self.data[-1][3],
                    albumes=[],
                    interpretes=[]))
            self.session.add(self.canciones[-1])
        # Persist the objects. The session is not closed in this setUp
        # so that the songs can be used in the tests.
        self.session.commit()

    def tearDown(self):
        self.session = Session()
        busqueda = self.session.query(Cancion).all()
        for cancion in busqueda:
            self.session.delete(cancion)
        self.session.commit()
        self.session.close()

    def test_constructor(self):
        for cancion, dato in zip(self.canciones, self.data):
            self.assertEqual(cancion.titulo, dato[0])
            self.assertEqual(cancion.minutos, dato[1])
            self.assertEqual(cancion.segundos, dato[2])
            self.assertEqual(cancion.compositor, dato[3])

    def test_agregar_cancion(self):
        '''Tests adding a song.'''
        self.data.append((
            self.data_factory.unique.name(),
            self.data_factory.random_int(0, 4),
            self.data_factory.random_int(0, 60),
            self.data_factory.unique.name()))
        resultado = self.coleccion.agregar_cancion(
            titulo=self.data[-1][0],
            minutos=self.data[-1][1],
            segundos=self.data[-1][2],
            compositor=self.data[-1][3])
        self.assertEqual(resultado, True)
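The setUp above seeds Faker (`Faker.seed(1000)`) so every run generates identical test data. The same determinism idea in a stdlib-only sketch using `random` (an analogy to illustrate seeded fixtures, not the Faker API itself; `sample_ints` is an illustrative helper name):

```python
import random


def sample_ints(seed, count=5):
    """Draw `count` pseudo-random ints in [0, 60] deterministically from `seed`."""
    rng = random.Random(seed)
    return [rng.randint(0, 60) for _ in range(count)]


# Re-seeding with the same value reproduces the exact same draws, which is
# what makes seeded test fixtures comparable across runs.
assert sample_ints(1000) == sample_ints(1000)
```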
# ims/migrations/0024_auto_20200314_0910.py (hisham2k9/IMS-and-CAPA, MIT license)
# Generated by Django 3.0.3 on 2020-03-14 03:40
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('ims', '0023_auto_20200313_2233'),
    ]

    operations = [
        migrations.RenameField(
            model_name='imsassignfiles',
            old_name='assdate',
            new_name='date',
        ),
        migrations.RenameField(
            model_name='imsassignfiles',
            old_name='assfile',
            new_name='file',
        ),
        migrations.RenameField(
            model_name='imsassignfiles',
            old_name='apost',
            new_name='post',
        ),
        migrations.RenameField(
            model_name='imsclosurefiles',
            old_name='clodate',
            new_name='date',
        ),
        migrations.RenameField(
            model_name='imsclosurefiles',
            old_name='clofile',
            new_name='file',
        ),
        migrations.RenameField(
            model_name='imsclosurefiles',
            old_name='cpost',
            new_name='post',
        ),
        migrations.RenameField(
            model_name='imsinvestigationfiles',
            old_name='invdate',
            new_name='date',
        ),
        migrations.RenameField(
            model_name='imsinvestigationfiles',
            old_name='invfile',
            new_name='file',
        ),
        migrations.RenameField(
            model_name='imsinvestigationfiles',
            old_name='ipost',
            new_name='post',
        ),
        migrations.RenameField(
            model_name='imssubmissionfiles',
            old_name='subdate',
            new_name='date',
        ),
        migrations.RenameField(
            model_name='imssubmissionfiles',
            old_name='subfile',
            new_name='file',
        ),
        migrations.RenameField(
            model_name='imssubmissionfiles',
            old_name='spost',
            new_name='post',
        ),
        migrations.RenameField(
            model_name='imsvalidationfiles',
            old_name='valdate',
            new_name='date',
        ),
        migrations.RenameField(
            model_name='imsvalidationfiles',
            old_name='valfile',
            new_name='file',
        ),
        migrations.RenameField(
            model_name='imsvalidationfiles',
            old_name='vpost',
            new_name='post',
        ),
    ]
| 27.348315 | 47 | 0.518077 | 193 | 2,434 | 6.284974 | 0.264249 | 0.259687 | 0.321517 | 0.370981 | 0.797197 | 0.797197 | 0.797197 | 0 | 0 | 0 | 0 | 0.020315 | 0.373048 | 2,434 | 88 | 48 | 27.659091 | 0.774574 | 0.018488 | 0 | 0.731707 | 1 | 0 | 0.183913 | 0.036028 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012195 | 0 | 0.04878 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
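The fifteen `RenameField` operations in `0024_auto_20200314_0910.py` all follow the same model-prefix → shared-name pattern. A compact sketch of generating them — a hypothetical refactor, with plain tuples standing in for `django.db.migrations.RenameField` so it runs without Django installed:

```python
# Hypothetical compact generation of the renames listed in
# ims/migrations/0024_auto_20200314_0910.py; plain (model, old, new) tuples
# stand in for migrations.RenameField so the sketch is self-contained.
renames = {
    'imsassignfiles':        [('assdate', 'date'), ('assfile', 'file'), ('apost', 'post')],
    'imsclosurefiles':       [('clodate', 'date'), ('clofile', 'file'), ('cpost', 'post')],
    'imsinvestigationfiles': [('invdate', 'date'), ('invfile', 'file'), ('ipost', 'post')],
    'imssubmissionfiles':    [('subdate', 'date'), ('subfile', 'file'), ('spost', 'post')],
    'imsvalidationfiles':    [('valdate', 'date'), ('valfile', 'file'), ('vpost', 'post')],
}
operations = [
    (model, old, new)
    for model, pairs in renames.items()
    for old, new in pairs
]
print(len(operations))  # 15 rename operations, one per RenameField above
```

In a real migration each tuple would become `migrations.RenameField(model_name=model, old_name=old, new_name=new)`.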
c3377155161e5497599aaf0abeced7f937a16ada | 165 | py | Python | eslearn/GUI/lc_ui2py_for_feature_engineering.py | ZitongLu1996/easylearn | c1ba6944541ad531b0ce828a34452341a460d366 | [
"MIT"
] | null | null | null | eslearn/GUI/lc_ui2py_for_feature_engineering.py | ZitongLu1996/easylearn | c1ba6944541ad531b0ce828a34452341a460d366 | [
"MIT"
] | null | null | null | eslearn/GUI/lc_ui2py_for_feature_engineering.py | ZitongLu1996/easylearn | c1ba6944541ad531b0ce828a34452341a460d366 | [
"MIT"
] | 1 | 2021-06-10T07:27:56.000Z | 2021-06-10T07:27:56.000Z | import os
cmd_str = r'pyuic5 -o easylearn_feature_engineering_gui.py D:/My_Codes/easylearn-fmri/eslearn/GUI/easylearn_feature_engineering_gui.ui'
os.system(cmd_str)
| 41.25 | 135 | 0.848485 | 28 | 165 | 4.678571 | 0.678571 | 0.091603 | 0.412214 | 0.458015 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00641 | 0.054545 | 165 | 3 | 136 | 55 | 0.833333 | 0 | 0 | 0 | 0 | 0.333333 | 0.739394 | 0.672727 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
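The `lc_ui2py_for_feature_engineering.py` one-liner shells out with a hardcoded absolute Windows path. A slightly more portable sketch of the same `pyuic5` invocation, using hypothetical relative filenames and an argument list instead of a raw command string:

```python
# Portable sketch of the pyuic5 invocation above; the .ui path here is a
# hypothetical relative path rather than the original absolute D:/ one.
ui_file = "easylearn_feature_engineering_gui.ui"
py_file = "easylearn_feature_engineering_gui.py"
cmd = ["pyuic5", "-o", py_file, ui_file]
print(" ".join(cmd))
# When PyQt5 is installed, execute it with:
#   import subprocess; subprocess.run(cmd, check=True)
```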
c33c5075d92eb8912bb96f65f013141a32557363 | 32,789 | py | Python | leetcode/3_SOL_sliding_window.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | leetcode/3_SOL_sliding_window.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | leetcode/3_SOL_sliding_window.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | class Solution:
def lengthOfLongestSubstring(self, s):
n = len(s)
i = j = ans = 0
_map = {}
while j < n:
if c[j] in _map:
if _map[s[j]] > i: i = _map[s[j]]
if j-i+1 > ans: ans = j-i+1
# ans = j - i + 1 if j - i + 1 > ans else ans
_map[s[j]] = j + 1
j += 1
# n = len(s)
# _set = {}
# ans = i = j = 0
# while i < n and j < n:
# if s[j] not in _set:
# _set[s[j]] = s[j]
# j += 1
# if (j - i > ans): ans = j - i
# else:
# del _set[s[i]]
# i += 1
# return ans
if __name__ == '__main__':
s = Solution()
a = s.lengthOfLongestSubstring('pwwkew')
print(a)
a = s.lengthOfLongestSubstring('abacbacbd')
print(a)
a = s.lengthOfLongestSubstring('bbbbbbb')
print(a)
# a = s.lengthOfLongestSubstring('bbbbbb')
# print(a)
a = s.lengthOfLongestSubstring('abcabcbb')
print(a)
b = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ 
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~ abcdefghijklmnopqrstuvwxyzABCD"
a = s.lengthOfLongestSubstring(b)
print(a)
| 780.690476 | 31,662 | 0.641953 | 470 | 32,789 | 44.055319 | 0.076596 | 1.946296 | 2.910461 | 3.868637 | 0.983097 | 0.976142 | 0.976142 | 0.976142 | 0.976142 | 0.976142 | 0 | 0.10229 | 0.025039 | 32,789 | 42 | 31,663 | 780.690476 | 0.54542 | 0.009637 | 0 | 0.208333 | 0 | 0 | 0.644957 | 0.6371 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0 | 0 | 0.083333 | 0.208333 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
c345d4e5cb26c79cdb2408531b98c60d5b9a1fea | 22,846 | py | Python | models/networks/fusions.py | valeoai/BEEF | f1c5f3708ba91f6402dd05814b76dca1d9012942 | [
"Apache-2.0"
] | 4 | 2021-05-31T16:53:35.000Z | 2021-11-30T03:03:34.000Z | models/networks/fusions.py | valeoai/BEEF | f1c5f3708ba91f6402dd05814b76dca1d9012942 | [
"Apache-2.0"
] | 3 | 2022-02-02T20:41:56.000Z | 2022-02-24T11:47:44.000Z | models/networks/fusions.py | valeoai/BEEF | f1c5f3708ba91f6402dd05814b76dca1d9012942 | [
"Apache-2.0"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
from . import mlp
def get_sizes_list(dim, chunks):
split_size = (dim + chunks - 1) // chunks
sizes_list = [split_size] * chunks
sizes_list[-1] = sizes_list[-1] - (sum(sizes_list) - dim) # Adjust last
assert sum(sizes_list) == dim
    if sizes_list[-1] < 0:
n_miss = sizes_list[-2] - sizes_list[-1]
sizes_list[-1] = sizes_list[-2]
for j in range(n_miss):
sizes_list[-j-1] -= 1
assert sum(sizes_list) == dim
assert min(sizes_list) > 0
return sizes_list
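For illustration, the splitting logic above can be exercised standalone (it has no torch dependency); the function below is copied verbatim from this file, only the example calls are new:

```python
def get_sizes_list(dim, chunks):
    # Split `dim` into `chunks` near-equal positive sizes summing to dim.
    split_size = (dim + chunks - 1) // chunks
    sizes_list = [split_size] * chunks
    sizes_list[-1] = sizes_list[-1] - (sum(sizes_list) - dim)  # adjust last
    assert sum(sizes_list) == dim
    if sizes_list[-1] < 0:
        n_miss = sizes_list[-2] - sizes_list[-1]
        sizes_list[-1] = sizes_list[-2]
        for j in range(n_miss):
            sizes_list[-j - 1] -= 1
        assert sum(sizes_list) == dim
        assert min(sizes_list) > 0
    return sizes_list

print(get_sizes_list(1600, 20))  # twenty equal chunks of 80
print(get_sizes_list(10, 3))     # [4, 4, 2]
print(get_sizes_list(5, 4))      # [2, 1, 1, 1]
```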
def get_chunks(x, sizes):
    out = []
    begin = 0
    for s in sizes:
        y = x.narrow(1, begin, s)
        out.append(y)
        begin += s
    return out
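`get_chunks` slices dimension 1 of a batch tensor into those sizes with `torch.narrow`; a list-based stand-in behaves the same way on a single row (`get_chunks_list` is an illustrative name, not part of this module):

```python
def get_chunks_list(x, sizes):
    # Stand-in for get_chunks: slice a flat list the way
    # x.narrow(1, begin, s) slices dim 1 of a 2-D tensor.
    out, begin = [], 0
    for s in sizes:
        out.append(x[begin:begin + s])
        begin += s
    return out

print(get_chunks_list(list(range(10)), [4, 4, 2]))
# [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```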
class Bilinear(nn.Module):
def __init__(self,
input_dims,
output_dim,
activ_output=None,
dropout_output=0.,
**kwargs):
super(Bilinear, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.activ_output = activ_output
self.dropout_output = dropout_output
# Modules
self.bilinear = nn.Bilinear(input_dims[0], input_dims[1], output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
z = self.bilinear(x[0], x[1])
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class Block(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1600,
chunks=20,
rank=15,
shared=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.,
pos_norm='before_cat'):
super(Block, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.mm_dim = mm_dim
self.chunks = chunks
self.rank = rank
self.shared = shared
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
        assert pos_norm in ['before_cat', 'after_cat']
self.pos_norm = pos_norm
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
if shared:
self.linear1 = self.linear0
else:
self.linear1 = nn.Linear(input_dims[1], mm_dim)
merge_linears0, merge_linears1 = [], []
self.sizes_list = get_sizes_list(mm_dim, chunks)
for size in self.sizes_list:
ml0 = nn.Linear(size, size*rank)
merge_linears0.append(ml0)
if self.shared:
ml1 = ml0
else:
ml1 = nn.Linear(size, size*rank)
merge_linears1.append(ml1)
self.merge_linears0 = nn.ModuleList(merge_linears0)
self.merge_linears1 = nn.ModuleList(merge_linears1)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
self.hidden_activations = {0:None}
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
bsize = x1.size(0)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
x0_chunks = get_chunks(x0, self.sizes_list)
x1_chunks = get_chunks(x1, self.sizes_list)
zs = []
for chunk_id, m0, m1 in zip(range(len(self.sizes_list)),
self.merge_linears0,
self.merge_linears1):
x0_c = x0_chunks[chunk_id]
x1_c = x1_chunks[chunk_id]
m = m0(x0_c) * m1(x1_c) # bsize x split_size*rank
m = m.view(bsize, self.rank, -1)
z = torch.sum(m, 1)
if self.pos_norm == 'before_cat':
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
zs.append(z)
z = torch.cat(zs,1)
if self.pos_norm == 'after_cat':
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
self.hidden_activations[0] = z
z = self.linear_out(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
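`Block` (and the other fusions below) applies power normalization, `torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))`, before L2-normalizing. Elementwise this is just the signed square root, as a scalar sketch shows (`signed_sqrt` is an illustrative name):

```python
import math

def signed_sqrt(z):
    # sqrt(relu(z)) - sqrt(relu(-z)) equals sign(z) * sqrt(|z|) elementwise
    return math.sqrt(z) if z >= 0 else -math.sqrt(-z)

print([signed_sqrt(v) for v in (4.0, -9.0, 0.0)])  # [2.0, -3.0, 0.0]
```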
class BlockTucker(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1600,
chunks=20,
shared=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.,
pos_norm='before_cat'):
super(BlockTucker, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.mm_dim = mm_dim
self.chunks = chunks
self.shared = shared
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
        assert pos_norm in ['before_cat', 'after_cat']
self.pos_norm = pos_norm
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
if self.shared:
self.linear1 = self.linear0
else:
self.linear1 = nn.Linear(input_dims[1], mm_dim)
self.sizes_list = get_sizes_list(mm_dim, chunks)
bilinears = []
for size in self.sizes_list:
bilinears.append(
nn.Bilinear(size, size, size)
)
self.bilinears = nn.ModuleList(bilinears)
self.linear_out = nn.Linear(self.mm_dim, self.output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
bsize = x1.size(0)
        if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
x0_chunks = get_chunks(x0, self.sizes_list)
x1_chunks = get_chunks(x1, self.sizes_list)
zs = []
for chunk_id, bilinear in enumerate(self.bilinears):
x0_c = x0_chunks[chunk_id]
x1_c = x1_chunks[chunk_id]
z = bilinear(x0_c, x1_c)
if self.pos_norm == 'before_cat':
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
zs.append(z)
z = torch.cat(zs, 1)
if self.pos_norm == 'after_cat':
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
z = self.linear_out(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class Mutan(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1600,
rank=15,
shared=False,
normalize=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.):
super(Mutan, self).__init__()
self.input_dims = input_dims
self.shared = shared
self.mm_dim = mm_dim
self.rank = rank
self.output_dim = output_dim
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
self.normalize = normalize
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
self.merge_linear0 = nn.Linear(mm_dim, mm_dim*rank)
if self.shared:
self.linear1 = self.linear0
self.merge_linear1 = self.merge_linear0
else:
self.linear1 = nn.Linear(input_dims[1], mm_dim)
self.merge_linear1 = nn.Linear(mm_dim, mm_dim*rank)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
m0 = self.merge_linear0(x0)
m1 = self.merge_linear1(x1)
m = m0 * m1
m = m.view(-1, self.rank, self.mm_dim)
z = torch.sum(m, 1)
if self.normalize:
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z, p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
z = self.linear_out(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
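Mutan's core is a rank-R approximation of a bilinear interaction: the two `mm_dim*rank` projections are multiplied elementwise, viewed as `(rank, mm_dim)`, and summed over the rank axis. A plain-list sketch for a single sample (`mutan_core` is an illustrative name; the module does this with `m.view(-1, self.rank, self.mm_dim)` and `torch.sum(m, 1)`):

```python
def mutan_core(m0, m1, rank, mm_dim):
    # Elementwise product, then sum over the rank axis; element (r, d)
    # of the viewed (rank, mm_dim) matrix sits at flat offset r * mm_dim + d.
    m = [a * b for a, b in zip(m0, m1)]
    return [sum(m[r * mm_dim + d] for r in range(rank)) for d in range(mm_dim)]

rank, mm_dim = 3, 4
z = mutan_core([1.0] * (rank * mm_dim),
               [float(i) for i in range(rank * mm_dim)],
               rank, mm_dim)
print(z)  # [12.0, 15.0, 18.0, 21.0]
```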
class Tucker(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1600,
shared=False,
normalize=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.):
super(Tucker, self).__init__()
self.input_dims = input_dims
self.shared = shared
self.mm_dim = mm_dim
self.output_dim = output_dim
self.normalize = normalize
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
        if shared:
            self.linear1 = self.linear0
        else:
            self.linear1 = nn.Linear(input_dims[1], mm_dim)
self.bilinear = nn.Bilinear(mm_dim, mm_dim, mm_dim)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z = self.bilinear(x0, x1)
if self.normalize:
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
z = self.linear_out(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class MLB(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1200,
activ_input='relu',
activ_output='relu',
normalize=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.):
super(MLB, self).__init__()
self.input_dims = input_dims
self.mm_dim = mm_dim
self.output_dim = output_dim
self.activ_input = activ_input
self.activ_output = activ_output
self.normalize = normalize
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
self.linear1 = nn.Linear(input_dims[1], mm_dim)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
if self.activ_input:
x0 = getattr(F, self.activ_input)(x0)
x1 = getattr(F, self.activ_input)(x1)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z = x0 * x1
if self.normalize:
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
z = self.linear_out(z)
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class MFB(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1200,
factor=2,
activ_input='relu',
activ_output='relu',
normalize=False,
dropout_input=0.,
dropout_pre_norm=0.,
dropout_output=0.):
super(MFB, self).__init__()
self.input_dims = input_dims
self.mm_dim = mm_dim
self.factor = factor
self.output_dim = output_dim
self.activ_input = activ_input
self.activ_output = activ_output
self.normalize = normalize
self.dropout_input = dropout_input
self.dropout_pre_norm = dropout_pre_norm
self.dropout_output = dropout_output
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim*factor)
self.linear1 = nn.Linear(input_dims[1], mm_dim*factor)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
if self.activ_input:
x0 = getattr(F, self.activ_input)(x0)
x1 = getattr(F, self.activ_input)(x1)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z = x0 * x1
if self.dropout_pre_norm > 0:
z = F.dropout(z, p=self.dropout_pre_norm, training=self.training)
z = z.view(z.size(0), self.mm_dim, self.factor)
z = z.sum(2)
if self.normalize:
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
z = self.linear_out(z)
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
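MFB's `z.view(z.size(0), self.mm_dim, self.factor).sum(2)` is sum pooling over consecutive groups of `factor` values. A single-sample list sketch (`mfb_pool` is an illustrative name):

```python
def mfb_pool(z, mm_dim, factor):
    # Mirror z.view(1, mm_dim, factor).sum(2) for one sample: output dim d
    # sums the consecutive slice z[d*factor : (d+1)*factor].
    return [sum(z[d * factor:(d + 1) * factor]) for d in range(mm_dim)]

print(mfb_pool([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], mm_dim=3, factor=2))
# [3.0, 7.0, 11.0]
```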
class MFH(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1200,
factor=2,
activ_input='relu',
activ_output='relu',
normalize=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.):
super(MFH, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.mm_dim = mm_dim
self.factor = factor
self.activ_input = activ_input
self.activ_output = activ_output
self.normalize = normalize
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
# Modules
self.linear0_0 = nn.Linear(input_dims[0], mm_dim*factor)
self.linear1_0 = nn.Linear(input_dims[1], mm_dim*factor)
self.linear0_1 = nn.Linear(input_dims[0], mm_dim*factor)
self.linear1_1 = nn.Linear(input_dims[1], mm_dim*factor)
self.linear_out = nn.Linear(mm_dim*2, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0_0(x[0])
x1 = self.linear1_0(x[1])
if self.activ_input:
x0 = getattr(F, self.activ_input)(x0)
x1 = getattr(F, self.activ_input)(x1)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z_0_skip = x0 * x1
        if self.dropout_pre_lin > 0:
z_0_skip = F.dropout(z_0_skip, p=self.dropout_pre_lin, training=self.training)
z_0 = z_0_skip.view(z_0_skip.size(0), self.mm_dim, self.factor)
z_0 = z_0.sum(2)
if self.normalize:
z_0 = torch.sqrt(F.relu(z_0)) - torch.sqrt(F.relu(-z_0))
z_0 = F.normalize(z_0, p=2)
#
x0 = self.linear0_1(x[0])
x1 = self.linear1_1(x[1])
if self.activ_input:
x0 = getattr(F, self.activ_input)(x0)
x1 = getattr(F, self.activ_input)(x1)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z_1 = x0 * x1 * z_0_skip
if self.dropout_pre_lin > 0:
z_1 = F.dropout(z_1, p=self.dropout_pre_lin, training=self.training)
z_1 = z_1.view(z_1.size(0), self.mm_dim, self.factor)
z_1 = z_1.sum(2)
if self.normalize:
z_1 = torch.sqrt(F.relu(z_1)) - torch.sqrt(F.relu(-z_1))
z_1 = F.normalize(z_1, p=2)
#
cat_dim = z_0.dim() - 1
z = torch.cat([z_0, z_1], cat_dim)
z = self.linear_out(z)
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class MCB(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=16000,
activ_output='relu',
dropout_output=0.):
super(MCB, self).__init__()
# compatible with pytorch 0.3 and 0.4, not 1.0
from . import compactbilinearpooling as cbp
self.input_dims = input_dims
self.output_dim = output_dim
self.mm_dim = mm_dim
self.activ_output = activ_output
self.dropout_output = dropout_output
# Modules
self.mcb = cbp.CompactBilinearPooling(input_dims[0], input_dims[1], mm_dim)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
z = self.mcb(x[0], x[1])
z = self.linear_out(z)
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class LinearSum(nn.Module):
def __init__(self,
input_dims,
output_dim,
mm_dim=1200,
activ_input='relu',
activ_output='relu',
normalize=False,
dropout_input=0.,
dropout_pre_lin=0.,
dropout_output=0.):
super(LinearSum, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.mm_dim = mm_dim
self.activ_input = activ_input
self.activ_output = activ_output
self.normalize = normalize
self.dropout_input = dropout_input
self.dropout_pre_lin = dropout_pre_lin
self.dropout_output = dropout_output
# Modules
self.linear0 = nn.Linear(input_dims[0], mm_dim)
self.linear1 = nn.Linear(input_dims[1], mm_dim)
self.linear_out = nn.Linear(mm_dim, output_dim)
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
x0 = self.linear0(x[0])
x1 = self.linear1(x[1])
if self.activ_input:
x0 = getattr(F, self.activ_input)(x0)
x1 = getattr(F, self.activ_input)(x1)
if self.dropout_input > 0:
x0 = F.dropout(x0, p=self.dropout_input, training=self.training)
x1 = F.dropout(x1, p=self.dropout_input, training=self.training)
z = x0 + x1
if self.normalize:
z = torch.sqrt(F.relu(z)) - torch.sqrt(F.relu(-z))
z = F.normalize(z,p=2)
if self.dropout_pre_lin > 0:
z = F.dropout(z, p=self.dropout_pre_lin, training=self.training)
z = self.linear_out(z)
if self.activ_output:
z = getattr(F, self.activ_output)(z)
if self.dropout_output > 0:
z = F.dropout(z, p=self.dropout_output, training=self.training)
return z
class ConcatMLP(nn.Module):
def __init__(self,
input_dims,
output_dim,
dimensions=[500,500],
activation='relu',
dropout=0.):
super(ConcatMLP, self).__init__()
self.input_dims = input_dims
self.output_dim = output_dim
self.input_dim = sum(input_dims)
self.dimensions = dimensions + [output_dim]
self.activation = activation
self.dropout = dropout
# Modules
self.mlp = mlp.MLP(
self.input_dim,
self.dimensions,
self.activation,
self.dropout)
self.hidden_activations = self.mlp.activ_layers
self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)
def forward(self, x):
if x[0].dim() == 3 and x[1].dim() == 2:
x[1] = x[1].unsqueeze(1).reshape_as(x[0])
if x[1].dim() == 3 and x[0].dim() == 2:
x[0] = x[0].unsqueeze(1).reshape_as(x[1])
z = torch.cat(x, dim=x[0].dim()-1)
z = self.mlp(z)
return z | 34.458522 | 91 | 0.55664 | 3,082 | 22,846 | 3.901363 | 0.044127 | 0.08508 | 0.036926 | 0.032518 | 0.846973 | 0.826098 | 0.804308 | 0.793912 | 0.782019 | 0.75682 | 0 | 0.028977 | 0.333844 | 22,846 | 663 | 92 | 34.458522 | 0.761088 | 0.007354 | 0 | 0.748175 | 0 | 0 | 0.006182 | 0 | 0 | 0 | 0 | 0 | 0.009124 | 1 | 0.043796 | false | 0 | 0.009124 | 0 | 0.096715 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c35a9d25f86d3cd20de1596a75076e8e7e0fb11a | 67,712 | py | Python | nova/tests/unit/api/openstack/compute/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/api/openstack/compute/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/api/openstack/compute/test_cells.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | # Copyright 2011-2012 OpenStack Foundation
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

import copy

from oslo_utils import timeutils
from webob import exc

from nova.api.openstack.compute import cells as cells_ext_v21
from nova.api.openstack import extensions
from nova.cells import rpcapi as cells_rpcapi
from nova import context
from nova import exception
from nova import rpc
from nova import test
from nova.tests.unit.api.openstack import fakes


class BaseCellsTest(test.NoDBTestCase):
    def setUp(self):
        super(BaseCellsTest, self).setUp()

        self.fake_cells = [
            dict(id=1, name='cell1', is_parent=True,
                 weight_scale=1.0, weight_offset=0.0,
                 transport_url='rabbit://bob:xxxx@r1.example.org/'),
            dict(id=2, name='cell2', is_parent=False,
                 weight_scale=1.0, weight_offset=0.0,
                 transport_url='rabbit://alice:qwerty@r2.example.org/')]

        self.fake_capabilities = [
            {'cap1': '0,1', 'cap2': '2,3'},
            {'cap3': '4,5', 'cap4': '5,6'}]

        def fake_cell_get(_self, context, cell_name):
            for cell in self.fake_cells:
                if cell_name == cell['name']:
                    return cell
            else:
                raise exception.CellNotFound(cell_name=cell_name)

        def fake_cell_create(_self, context, values):
            cell = dict(id=1)
            cell.update(values)
            return cell

        def fake_cell_update(_self, context, cell_id, values):
            cell = fake_cell_get(_self, context, cell_id)
            cell.update(values)
            return cell

        def fake_cells_api_get_all_cell_info(*args):
            return self._get_all_cell_info(*args)

        self.stubs.Set(cells_rpcapi.CellsAPI, 'cell_get', fake_cell_get)
        self.stubs.Set(cells_rpcapi.CellsAPI, 'cell_update', fake_cell_update)
        self.stubs.Set(cells_rpcapi.CellsAPI, 'cell_create', fake_cell_create)
        self.stubs.Set(cells_rpcapi.CellsAPI, 'get_cell_info_for_neighbors',
                       fake_cells_api_get_all_cell_info)

    def _get_all_cell_info(self, *args):
        def insecure_transport_url(url):
            transport_url = rpc.get_transport_url(url)
            transport_url.hosts[0].password = None
            return str(transport_url)

        cells = copy.deepcopy(self.fake_cells)
        cells[0]['transport_url'] = insecure_transport_url(
            cells[0]['transport_url'])
        cells[1]['transport_url'] = insecure_transport_url(
            cells[1]['transport_url'])
        for i, cell in enumerate(cells):
            cell['capabilities'] = self.fake_capabilities[i]
        return cells


class CellsTestV21(BaseCellsTest):
    cell_extension = 'os_compute_api:os-cells'
    bad_request = exception.ValidationError

    def _get_cell_controller(self, ext_mgr):
        return cells_ext_v21.CellsController()

    def _get_request(self, resource):
        return fakes.HTTPRequest.blank('/v2/fake/' + resource)

    def setUp(self):
        super(CellsTestV21, self).setUp()
        self.ext_mgr = self.mox.CreateMock(extensions.ExtensionManager)
        self.controller = self._get_cell_controller(self.ext_mgr)
        self.context = context.get_admin_context()
        self.flags(enable=True, group='cells')

    def test_index(self):
        req = self._get_request("cells")
        res_dict = self.controller.index(req)

        self.assertEqual(len(res_dict['cells']), 2)
        for i, cell in enumerate(res_dict['cells']):
            self.assertEqual(cell['name'], self.fake_cells[i]['name'])
            self.assertNotIn('capabilitiles', cell)
            self.assertNotIn('password', cell)

    def test_detail(self):
        req = self._get_request("cells/detail")
        res_dict = self.controller.detail(req)

        self.assertEqual(len(res_dict['cells']), 2)
        for i, cell in enumerate(res_dict['cells']):
            self.assertEqual(cell['name'], self.fake_cells[i]['name'])
            self.assertEqual(cell['capabilities'], self.fake_capabilities[i])
            self.assertNotIn('password', cell)

    def test_show_bogus_cell_raises(self):
        req = self._get_request("cells/bogus")
        self.assertRaises(exc.HTTPNotFound, self.controller.show, req, 'bogus')

    def test_get_cell_by_name(self):
        req = self._get_request("cells/cell1")
        res_dict = self.controller.show(req, 'cell1')
        cell = res_dict['cell'
op|']'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'name'"
op|']'
op|','
string|"'cell1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'rpc_host'"
op|']'
op|','
string|"'r1.example.org'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'password'"
op|','
name|'cell'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_cell_delete
dedent|''
name|'def'
name|'_cell_delete'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'call_info'
op|'='
op|'{'
string|"'delete_called'"
op|':'
number|'0'
op|'}'
newline|'\n'
nl|'\n'
DECL|function|fake_cell_delete
name|'def'
name|'fake_cell_delete'
op|'('
name|'inst'
op|','
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell_name'
op|','
string|"'cell999'"
op|')'
newline|'\n'
name|'call_info'
op|'['
string|"'delete_called'"
op|']'
op|'+='
number|'1'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'stubs'
op|'.'
name|'Set'
op|'('
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|','
string|"'cell_delete'"
op|','
name|'fake_cell_delete'
op|')'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell999"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'delete'
op|'('
name|'req'
op|','
string|"'cell999'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'call_info'
op|'['
string|"'delete_called'"
op|']'
op|','
number|'1'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_delete
dedent|''
name|'def'
name|'test_cell_delete'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Test cell delete with just cell policy'
nl|'\n'
indent|' '
name|'rules'
op|'='
op|'{'
string|'"default"'
op|':'
string|'"is_admin:true"'
op|','
nl|'\n'
name|'self'
op|'.'
name|'cell_extension'
op|':'
string|'"is_admin:true"'
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'policy'
op|'.'
name|'set_rules'
op|'('
name|'rules'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_cell_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_delete_with_delete_policy
dedent|''
name|'def'
name|'test_cell_delete_with_delete_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_cell_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_delete_bogus_cell_raises
dedent|''
name|'def'
name|'test_delete_bogus_cell_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
DECL|function|fake_cell_delete
indent|' '
name|'def'
name|'fake_cell_delete'
op|'('
name|'inst'
op|','
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
number|'0'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'stubs'
op|'.'
name|'Set'
op|'('
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|','
string|"'cell_delete'"
op|','
name|'fake_cell_delete'
op|')'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell999"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exc'
op|'.'
name|'HTTPNotFound'
op|','
name|'self'
op|'.'
name|'controller'
op|'.'
name|'delete'
op|','
name|'req'
op|','
nl|'\n'
string|"'cell999'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_delete_fails_for_invalid_policy
dedent|''
name|'def'
name|'test_cell_delete_fails_for_invalid_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
DECL|function|fake_cell_delete
indent|' '
name|'def'
name|'fake_cell_delete'
op|'('
name|'inst'
op|','
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'stubs'
op|'.'
name|'Set'
op|'('
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|','
string|"'cell_delete'"
op|','
name|'fake_cell_delete'
op|')'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell999"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|'"nova.context"'
op|']'
op|'.'
name|'is_admin'
op|'='
name|'False'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'PolicyNotAuthorized'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'delete'
op|','
name|'req'
op|','
string|"'cell999'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|_cell_create_parent
dedent|''
name|'def'
name|'_cell_create_parent'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'meow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'fubar'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'res_dict'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|'('
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
name|'cell'
op|'='
name|'res_dict'
op|'['
string|"'cell'"
op|']'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'name'"
op|']'
op|','
string|"'meow'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'username'"
op|']'
op|','
string|"'fred'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'rpc_host'"
op|']'
op|','
string|"'r3.example.org'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'type'"
op|']'
op|','
string|"'parent'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'password'"
op|','
name|'cell'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'is_parent'"
op|','
name|'cell'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_parent
dedent|''
name|'def'
name|'test_cell_create_parent'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Test create with just cells policy'
nl|'\n'
indent|' '
name|'rules'
op|'='
op|'{'
string|'"default"'
op|':'
string|'"is_admin:true"'
op|','
nl|'\n'
name|'self'
op|'.'
name|'cell_extension'
op|':'
string|'"is_admin:true"'
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'policy'
op|'.'
name|'set_rules'
op|'('
name|'rules'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_cell_create_parent'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_parent_with_create_policy
dedent|''
name|'def'
name|'test_cell_create_parent_with_create_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_cell_create_parent'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_cell_create_child
dedent|''
name|'def'
name|'_cell_create_child'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'meow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'fubar'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'child'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'res_dict'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|'('
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
name|'cell'
op|'='
name|'res_dict'
op|'['
string|"'cell'"
op|']'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'name'"
op|']'
op|','
string|"'meow'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'username'"
op|']'
op|','
string|"'fred'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'rpc_host'"
op|']'
op|','
string|"'r3.example.org'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'type'"
op|']'
op|','
string|"'child'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'password'"
op|','
name|'cell'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'is_parent'"
op|','
name|'cell'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_child
dedent|''
name|'def'
name|'test_cell_create_child'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Test create with just cells policy'
nl|'\n'
indent|' '
name|'rules'
op|'='
op|'{'
string|'"default"'
op|':'
string|'"is_admin:true"'
op|','
nl|'\n'
name|'self'
op|'.'
name|'cell_extension'
op|':'
string|'"is_admin:true"'
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'policy'
op|'.'
name|'set_rules'
op|'('
name|'rules'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_cell_create_child'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_child_with_create_policy
dedent|''
name|'def'
name|'test_cell_create_child_with_create_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_cell_create_child'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_no_name_raises
dedent|''
name|'def'
name|'test_cell_create_no_name_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'username'"
op|':'
string|"'moocow'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_empty_string_raises
dedent|''
name|'def'
name|'test_cell_create_name_empty_string_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"''"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_invalid_character_raises
dedent|''
name|'def'
name|'test_cell_create_name_with_invalid_character_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moo\\x00cow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_dot_raises
dedent|''
name|'def'
name|'test_cell_create_name_with_dot_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moo.cow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
nl|'\n'
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_exclamation_point_raises
dedent|''
name|'def'
name|'test_cell_create_name_with_exclamation_point_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moo!cow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
nl|'\n'
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_at_raises
dedent|''
name|'def'
name|'test_cell_create_name_with_at_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moo@cow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
nl|'\n'
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_leading_trailing_spaces
dedent|''
name|'def'
name|'test_cell_create_name_with_leading_trailing_spaces'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"' moocow '"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
nl|'\n'
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_leading_trailing_spaces_compat_mode
dedent|''
name|'def'
name|'test_cell_create_name_with_leading_trailing_spaces_compat_mode'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"' moocow '"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'req'
op|'.'
name|'set_legacy_v2'
op|'('
op|')'
newline|'\n'
name|'resp'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|'('
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'moocow'"
op|','
name|'resp'
op|'['
string|"'cell'"
op|']'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_name_with_invalid_type_raises
dedent|''
name|'def'
name|'test_cell_create_name_with_invalid_type_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moocow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'invalid'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_fails_for_invalid_policy
dedent|''
name|'def'
name|'test_cell_create_fails_for_invalid_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'fake'"
op|'}'
op|'}'
newline|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'.'
name|'is_admin'
op|'='
name|'False'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'PolicyNotAuthorized'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_rpc_port_with_string
dedent|''
name|'def'
name|'test_cell_create_rpc_port_with_string'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'rpc_port'"
op|':'
string|"'123'"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|'('
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_rpc_port_with_null
dedent|''
name|'def'
name|'test_cell_create_rpc_port_with_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'rpc_port'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_create_rpc_port_empty_string_raises
dedent|''
name|'def'
name|'test_cell_create_rpc_port_empty_string_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'moocow'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'fred'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'secret'"
op|','
nl|'\n'
string|"'rpc_host'"
op|':'
string|"'r3.example.org'"
op|','
nl|'\n'
string|"'rpc_port'"
op|':'
string|"''"
op|','
nl|'\n'
string|"'type'"
op|':'
string|"'parent'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_cell_update
dedent|''
name|'def'
name|'_cell_update'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'username'"
op|':'
string|"'zeb'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'sneaky'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell1"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'res_dict'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'update'
op|'('
name|'req'
op|','
string|"'cell1'"
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
name|'cell'
op|'='
name|'res_dict'
op|'['
string|"'cell'"
op|']'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'name'"
op|']'
op|','
string|"'cell1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'rpc_host'"
op|']'
op|','
string|"'r1.example.org'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell'
op|'['
string|"'username'"
op|']'
op|','
string|"'zeb'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|"'password'"
op|','
name|'cell'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_update
dedent|''
name|'def'
name|'test_cell_update'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Test cell update with just cell policy'
nl|'\n'
indent|' '
name|'rules'
op|'='
op|'{'
string|'"default"'
op|':'
string|'"is_admin:true"'
op|','
nl|'\n'
name|'self'
op|'.'
name|'cell_extension'
op|':'
string|'"is_admin:true"'
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'policy'
op|'.'
name|'set_rules'
op|'('
name|'rules'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_cell_update'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_update_with_update_policy
dedent|''
name|'def'
name|'test_cell_update_with_update_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_cell_update'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_update_fails_for_invalid_policy
dedent|''
name|'def'
name|'test_cell_update_fails_for_invalid_policy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'got_changed'"
op|'}'
op|'}'
newline|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell1"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'.'
name|'is_admin'
op|'='
name|'False'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'exception'
op|'.'
name|'PolicyNotAuthorized'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|','
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_update_empty_name_raises
dedent|''
name|'def'
name|'test_cell_update_empty_name_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"''"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'zeb'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'sneaky'"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell1"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'update'
op|','
name|'req'
op|','
string|"'cell1'"
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_cell_update_empty_rpc_port_raises
dedent|''
name|'def'
name|'test_cell_update_empty_rpc_port_raises'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
string|"'cell'"
op|':'
op|'{'
string|"'name'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'username'"
op|':'
string|"'zeb'"
op|','
nl|'\n'
string|"'password'"
op|':'
string|"'sneaky'"
op|','
nl|'\n'
string|"'rpc_port'"
op|':'
string|"''"
op|'}'
op|'}'
newline|'\n'
nl|'\n'
name|'req'
op|'='
name|'self'
op|'.'
name|'_get_request'
op|'('
string|'"cells/cell1"'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
op|'='
name|'self'
op|'.'
name|'context'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'bad_request'
op|','
nl|'\n'
                          self.controller.update, req, 'cell1', body=body)

    def test_cell_update_invalid_type_raises(self):
        body = {'cell': {'username': 'zeb',
                         'type': 'invalid',
                         'password': 'sneaky'}}

        req = self._get_request("cells/cell1")
        req.environ['nova.context'] = self.context
        self.assertRaises(self.bad_request,
                          self.controller.update, req, 'cell1', body=body)

    def test_cell_update_without_type_specified(self):
        body = {'cell': {'username': 'wingwj'}}

        req = self._get_request("cells/cell1")
        req.environ['nova.context'] = self.context
        res_dict = self.controller.update(req, 'cell1', body=body)
        cell = res_dict['cell']

        self.assertEqual(cell['name'], 'cell1')
        self.assertEqual(cell['rpc_host'], 'r1.example.org')
        self.assertEqual(cell['username'], 'wingwj')
        self.assertEqual(cell['type'], 'parent')

    def test_cell_update_with_type_specified(self):
        body1 = {'cell': {'username': 'wingwj', 'type': 'child'}}
        body2 = {'cell': {'username': 'wingwj', 'type': 'parent'}}

        req1 = self._get_request("cells/cell1")
        req1.environ['nova.context'] = self.context
        res_dict1 = self.controller.update(req1, 'cell1', body=body1)
        cell1 = res_dict1['cell']

        req2 = self._get_request("cells/cell2")
        req2.environ['nova.context'] = self.context
        res_dict2 = self.controller.update(req2, 'cell2', body=body2)
        cell2 = res_dict2['cell']

        self.assertEqual(cell1['name'], 'cell1')
        self.assertEqual(cell1['rpc_host'], 'r1.example.org')
        self.assertEqual(cell1['username'], 'wingwj')
        self.assertEqual(cell1['type'], 'child')

        self.assertEqual(cell2['name'], 'cell2')
        self.assertEqual(cell2['rpc_host'], 'r2.example.org')
        self.assertEqual(cell2['username'], 'wingwj')
        self.assertEqual(cell2['type'], 'parent')

    def test_cell_update_rpc_port_with_string(self):
        body = {'cell': {'name': 'fake',
                         'username': 'fred',
                         'password': 'secret',
                         'rpc_host': 'r3.example.org',
                         'rpc_port': '123',
                         'type': 'parent'}}

        req = self._get_request("cells")
        req.environ['nova.context'] = self.context
        self.controller.update(req, 'cell1', body=body)

    def test_cell_update_rpc_port_with_null(self):
        body = {'cell': {'name': 'fake',
                         'username': 'fred',
                         'password': 'secret',
                         'rpc_host': 'r3.example.org',
                         'rpc_port': None,
                         'type': 'parent'}}

        req = self._get_request("cells")
        req.environ['nova.context'] = self.context
        self.assertRaises(self.bad_request,
                          self.controller.update, req, 'cell1', body=body)

    def test_cell_update_rpc_port_empty_string_raises(self):
        body = {'cell': {'name': 'moocow',
                         'username': 'fred',
                         'password': 'secret',
                         'rpc_host': 'r3.example.org',
                         'rpc_port': '',
                         'type': 'parent'}}

        req = self._get_request("cells")
        req.environ['nova.context'] = self.context
        self.assertRaises(self.bad_request,
                          self.controller.update, req, 'cell1', body=body)

    def test_cell_info(self):
        caps = ['cap1=a;b', 'cap2=c;d']
        self.flags(name='darksecret', capabilities=caps, group='cells')

        req = self._get_request("cells/info")
        res_dict = self.controller.info(req)
        cell = res_dict['cell']
        cell_caps = cell['capabilities']

        self.assertEqual(cell['name'], 'darksecret')
        self.assertEqual(cell_caps['cap1'], 'a;b')
        self.assertEqual(cell_caps['cap2'], 'c;d')

    def test_show_capacities(self):
        if (self.cell_extension == 'compute_extension:cells'):
            self.ext_mgr.is_loaded('os-cell-capacities').AndReturn(True)
        self.mox.StubOutWithMock(self.controller.cells_rpcapi,
                                 'get_capacities')
        response = {"ram_free":
                        {"units_by_mb": {"8192": 0, "512": 13,
                                         "4096": 1, "2048": 3, "16384": 0},
                         "total_mb": 7680},
                    "disk_free":
                        {"units_by_mb": {"81920": 11, "20480": 46,
                                         "40960": 23, "163840": 5, "0": 0},
                         "total_mb": 1052672}
                    }
        self.controller.cells_rpcapi.get_capacities(
            self.context, cell_name=None).AndReturn(response)
        self.mox.ReplayAll()
        req = self._get_request("cells/capacities")
        req.environ["nova.context"] = self.context
        res_dict = self.controller.capacities(req)
        self.assertEqual(response, res_dict['cell']['capacities'])

    def test_show_capacity_fails_with_non_admin_context(self):
        if (self.cell_extension == 'compute_extension:cells'):
            self.ext_mgr.is_loaded('os-cell-capacities').AndReturn(True)
        rules = {self.cell_extension: "is_admin:true"}
        self.policy.set_rules(rules)

        self.mox.ReplayAll()
        req = self._get_request("cells/capacities")
        req.environ["nova.context"] = self.context
        req.environ["nova.context"].is_admin = False
        self.assertRaises(exception.PolicyNotAuthorized,
                          self.controller.capacities, req)

    def test_show_capacities_for_invalid_cell(self):
        if (self.cell_extension == 'compute_extension:cells'):
            self.ext_mgr.is_loaded('os-cell-capacities').AndReturn(True)
        self.mox.StubOutWithMock(self.controller.cells_rpcapi,
                                 'get_capacities')
        self.controller.cells_rpcapi.get_capacities(
            self.context, cell_name="invalid_cell").AndRaise(
                exception.CellNotFound(cell_name="invalid_cell"))
        self.mox.ReplayAll()
        req = self._get_request("cells/invalid_cell/capacities")
        req.environ["nova.context"] = self.context
        self.assertRaises(exc.HTTPNotFound,
                          self.controller.capacities, req, "invalid_cell")

    def test_show_capacities_for_cell(self):
        if (self.cell_extension == 'compute_extension:cells'):
            self.ext_mgr.is_loaded('os-cell-capacities').AndReturn(True)
        self.mox.StubOutWithMock(self.controller.cells_rpcapi,
                                 'get_capacities')
        response = {"ram_free":
                        {"units_by_mb": {"8192": 0, "512": 13,
                                         "4096": 1, "2048": 3, "16384": 0},
                         "total_mb": 7680},
                    "disk_free":
                        {"units_by_mb": {"81920": 11, "20480": 46,
                                         "40960": 23, "163840": 5, "0": 0},
                         "total_mb": 1052672}
                    }
        self.controller.cells_rpcapi.get_capacities(
            self.context, cell_name='cell_name').AndReturn(response)
        self.mox.ReplayAll()
        req = self._get_request("cells/capacities")
        req.environ["nova.context"] = self.context
        res_dict = self.controller.capacities(req, 'cell_name')
        self.assertEqual(response, res_dict['cell']['capacities'])

    def test_sync_instances(self):
        call_info = {}

        def sync_instances(self, context, **kwargs):
            call_info['project_id'] = kwargs.get('project_id')
            call_info['updated_since'] = kwargs.get('updated_since')
            call_info['deleted'] = kwargs.get('deleted')

        self.stubs.Set(cells_rpcapi.CellsAPI, 'sync_instances',
                       sync_instances)

        req = self._get_request("cells/sync_instances")
        req.environ['nova.context'] = self.context
        body = {}
        self.controller.sync_instances(req, body=body)
        self.assertIsNone(call_info['project_id'])
        self.assertIsNone(call_info['updated_since'])

        body = {'project_id': 'test-project'}
        self.controller.sync_instances(req, body=body)
        self.assertEqual(call_info['project_id'], 'test-project')
        self.assertIsNone(call_info['updated_since'])

        expected = timeutils.utcnow().isoformat()
        if not expected.endswith("+00:00"):
            expected += "+00:00"

        body = {'updated_since': expected}
        self.controller.sync_instances(req, body=body)
        self.assertIsNone(call_info['project_id'])
        self.assertEqual(call_info['updated_since'], expected)

        body = {'updated_since': 'skjdfkjsdkf'}
        self.assertRaises(self.bad_request,
                          self.controller.sync_instances, req, body=body)

        body = {'deleted': False}
        self.controller.sync_instances(req, body=body)
        self.assertIsNone(call_info['project_id'])
        self.assertIsNone(call_info['updated_since'])
        self.assertFalse(call_info['deleted'])

        body = {'deleted': 'False'}
        self.controller.sync_instances(req, body=body)
        self.assertIsNone(call_info['project_id'])
        self.assertIsNone(call_info['updated_since'])
        self.assertFalse(call_info['deleted'])

        body = {'deleted': 'True'}
        self.controller.sync_instances(req, body=body)
        self.assertIsNone(call_info['project_id'])
        self.assertIsNone(call_info['updated_since'])
        self.assertTrue(call_info['deleted'])

        body = {'deleted': 'foo'}
        self.assertRaises(self.bad_request,
                          self.controller.sync_instances, req, body=body)

        body = {'foo': 'meow'}
        self.assertRaises(self.bad_request,
                          self.controller.sync_instances, req, body=body)

    def test_sync_instances_fails_for_invalid_policy(self):
        def sync_instances(self, context, **kwargs):
            pass

        self.stubs.Set(cells_rpcapi.CellsAPI, 'sync_instances',
                       sync_instances)

        req = self._get_request("cells/sync_instances")
        req.environ['nova.context'] = self.context
        req.environ['nova.context'].is_admin = False

        body = {}
        self.assertRaises(exception.PolicyNotAuthorized,
                          self.controller.sync_instances, req, body=body)

    def test_cells_disabled(self):
        self.flags(enable=False, group='cells')

        req = self._get_request("cells")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.index, req)

        req = self._get_request("cells/detail")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.detail, req)

        req = self._get_request("cells/cell1")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.show, req)

        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.delete, req, 'cell999')

        req = self._get_request("cells/cells")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.create, req, {})

        req = self._get_request("cells/capacities")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.capacities, req)

        req = self._get_request("cells/sync_instances")
        self.assertRaises(exc.HTTPNotImplemented,
                          self.controller.sync_instances, req, {})

# koalixcrm/djangoUserExtension/models.py
# (repo: Cataldir/koalixcrm, license: BSD-3-Clause)
] | 124 | 2015-02-28T20:56:37.000Z | 2019-12-13T18:15:35.000Z | # -*- coding: utf-8 -*-
from koalixcrm.djangoUserExtension.user_extension.document_template import *
from koalixcrm.djangoUserExtension.user_extension.template_set import *
from koalixcrm.djangoUserExtension.user_extension.user_extension import *
from koalixcrm.djangoUserExtension.user_extension.text_paragraph import *

# openprocurement/auctions/core/tests/plugins/awarding/v3_1/tests/blanks/chronograph_blanks.py
# (repo: EBRD-ProzorroSale/openprocurement.auctions.core, license: Apache-2.0)
] | 12 | 2016-09-05T12:07:48.000Z | 2019-02-26T09:24:17.000Z | # -*- coding: utf-8 -*-
from openprocurement.auctions.core.utils import get_related_contract_of_award
# AuctionAwardSwitchResourceTest
def not_switch_verification_to_unsuccessful(self):
auction = self.db.get(self.auction_id)
auction['awards'][0]['verificationPeriod']['endDate'] = auction['awards'][0]['verificationPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(auction['awards'][0]['status'], 'pending')
self.assertEqual(auction['awards'][1]['status'], 'pending.waiting')
self.assertEqual(auction['status'], 'active.qualification')
self.assertNotIn('endDate', auction['awardPeriod'])
def not_switch_active_to_unsuccessful(self):
response = self.app.post('/auctions/{}/awards/{}/documents?acc_token={}'.format(
self.auction_id, self.award_id, self.auction_token), upload_files=[('file', 'auction_protocol.pdf', 'content')])
self.assertEqual(response.status, '201 Created')
self.assertEqual(response.content_type, 'application/json')
doc_id = response.json["data"]['id']
response = self.app.patch_json(
'/auctions/{}/awards/{}/documents/{}?acc_token={}'.format(self.auction_id, self.award_id, doc_id,
self.auction_token), {"data": {
"description": "auction protocol",
"documentType": 'auctionProtocol'
}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json["data"]["documentType"], 'auctionProtocol')
response = self.app.patch_json('/auctions/{}/awards/{}?acc_token={}'.format(
self.auction_id, self.award_id, self.auction_token
), {"data": {"status": "active"}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json['data']["status"], "active")
auction = self.db.get(self.auction_id)
related_contract = get_related_contract_of_award(auction['awards'][0]['id'], auction)
related_contract['signingPeriod']['endDate'] = related_contract['signingPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
auction = response.json['data']
self.assertEqual(response.status, '200 OK')
self.assertEqual(auction['awards'][0]['status'], 'active')
self.assertEqual(auction['contracts'][0]['status'], 'pending')
self.assertEqual(auction['awards'][1]['status'], 'pending.waiting')
self.assertEqual(auction['status'], 'active.awarded')
self.assertIn('endDate', auction['awardPeriod'])
def switch_admission_to_unsuccessful(self):
auction = self.db.get(self.auction_id)
auction['awards'][0]['admissionPeriod']['endDate'] = auction['awards'][0]['admissionPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(response.status, '200 OK')
self.assertEqual(auction['awards'][0]['status'], 'unsuccessful')
self.assertEqual(auction['status'], 'unsuccessful')
self.assertIn('endDate', auction['awardPeriod'])
# AuctionDontSwitchSuspendedAuctionResourceTest
def switch_suspended_verification_to_unsuccessful(self):
auction = self.db.get(self.auction_id)
auction['awards'][0]['verificationPeriod']['endDate'] = auction['awards'][0]['verificationPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('administrator', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'suspended': True}})
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(response.status, '200 OK')
self.assertEqual(auction['awards'][0]['status'], 'pending')
self.assertEqual(auction['awards'][1]['status'], 'pending.waiting')
self.app.authorization = ('Basic', ('administrator', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'suspended': False}})
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(auction['awards'][0]['status'], 'unsuccessful')
self.assertEqual(auction['awards'][1]['status'], 'pending')
self.assertEqual(auction['status'], 'active.qualification')
self.assertNotIn('endDate', auction['awardPeriod'])
def switch_suspended_active_to_unsuccessful(self):
response = self.app.post('/auctions/{}/awards/{}/documents?acc_token={}'.format(
self.auction_id, self.award_id, self.auction_token), upload_files=[('file', 'auction_protocol.pdf', 'content')])
self.assertEqual(response.status, '201 Created')
self.assertEqual(response.content_type, 'application/json')
doc_id = response.json["data"]['id']
response = self.app.patch_json(
'/auctions/{}/awards/{}/documents/{}?acc_token={}'.format(self.auction_id, self.award_id, doc_id,
self.auction_token), {"data": {
"description": "auction protocol",
"documentType": 'auctionProtocol'
}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json["data"]["documentType"], 'auctionProtocol')
response = self.app.patch_json('/auctions/{}/awards/{}'.format(self.auction_id, self.award_id),
{"data": {"status": "active"}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json['data']["status"], "active")
auction = self.db.get(self.auction_id)
related_contract = get_related_contract_of_award(auction['awards'][0]['id'], auction)
related_contract['signingPeriod']['endDate'] = related_contract['signingPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('administrator', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'suspended': True}})
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(auction['awards'][0]['status'], 'active')
self.assertEqual(auction['contracts'][0]['status'], 'pending')
self.assertEqual(auction['awards'][1]['status'], 'pending.waiting')
self.assertEqual(auction['status'], 'active.awarded')
self.assertIn('endDate', auction['awardPeriod'])
self.app.authorization = ('Basic', ('administrator', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'suspended': False}})
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
auction = response.json['data']
self.assertEqual(response.status, '200 OK')
self.assertEqual(auction['awards'][0]['status'], 'unsuccessful')
self.assertEqual(auction['contracts'][0]['status'], 'cancelled')
self.assertEqual(auction['awards'][1]['status'], 'pending')
self.assertEqual(auction['status'], 'active.qualification')
self.assertNotIn('endDate', auction['awardPeriod'])
# AuctionAwardSwitch2ResourceTest
def switch_verification_to_unsuccessful_2(self):
auction = self.db.get(self.auction_id)
auction['awards'][0]['verificationPeriod']['endDate'] = auction['awards'][0]['verificationPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
self.assertEqual(response.status, '200 OK')
auction = response.json['data']
self.assertEqual(auction['awards'][0]['status'], 'unsuccessful')
if 'Insider' not in auction['procurementMethodType']:
self.assertEqual(auction['awards'][1]['status'], 'unsuccessful')
self.assertEqual(auction['status'], 'unsuccessful')
self.assertIn('endDate', auction['awardPeriod'])
def switch_active_to_unsuccessful_2(self):
response = self.app.post('/auctions/{}/awards/{}/documents?acc_token={}'.format(
self.auction_id, self.award_id, self.auction_token), upload_files=[('file', 'auction_protocol.pdf', 'content')])
self.assertEqual(response.status, '201 Created')
self.assertEqual(response.content_type, 'application/json')
doc_id = response.json["data"]['id']
response = self.app.patch_json(
'/auctions/{}/awards/{}/documents/{}?acc_token={}'.format(self.auction_id, self.award_id, doc_id,
self.auction_token), {"data": {
"description": "auction protocol",
"documentType": 'auctionProtocol'
}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json["data"]["documentType"], 'auctionProtocol')
response = self.app.patch_json('/auctions/{}/awards/{}'.format(self.auction_id, self.award_id),
{"data": {"status": "active"}})
self.assertEqual(response.status, '200 OK')
self.assertEqual(response.content_type, 'application/json')
self.assertEqual(response.json['data']["status"], "active")
auction = self.db.get(self.auction_id)
related_contract = get_related_contract_of_award(auction['awards'][0]['id'], auction)
related_contract['signingPeriod']['endDate'] = related_contract['signingPeriod']['startDate']
self.db.save(auction)
self.app.authorization = ('Basic', ('chronograph', ''))
response = self.app.patch_json('/auctions/{}'.format(self.auction_id), {'data': {'id': self.auction_id}})
auction = response.json['data']
self.assertEqual(response.status, '200 OK')
self.assertEqual(auction['awards'][0]['status'], 'unsuccessful')
self.assertEqual(auction['contracts'][0]['status'], 'cancelled')
if 'Insider' not in auction['procurementMethodType']:
self.assertEqual(auction['awards'][1]['status'], 'unsuccessful')
self.assertEqual(auction['status'], 'unsuccessful')
self.assertIn('endDate', auction['awardPeriod'])
# dfirtrack_main/tests/taskname/test_taskname_views.py (blackhatethicalhacking/dfirtrack, MIT license)
from django.contrib.auth.models import User
from django.test import TestCase
from dfirtrack_main.models import Taskname
import urllib.parse
class TasknameViewTestCase(TestCase):
""" taskname view tests """
@classmethod
def setUpTestData(cls):
# create object
Taskname.objects.create(taskname_name='taskname_1')
# create user
test_user = User.objects.create_user(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
def test_tasknames_list_not_logged_in(self):
""" test list view """
# create url
destination = '/login/?next=' + urllib.parse.quote('/tasknames/', safe='')
# get response
response = self.client.get('/tasknames/', follow=True)
# compare
self.assertRedirects(response, destination, status_code=302, target_status_code=200)
def test_tasknames_list_logged_in(self):
""" test list view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/')
# compare
self.assertEqual(response.status_code, 200)
def test_tasknames_list_template(self):
""" test list view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/')
# compare
self.assertTemplateUsed(response, 'dfirtrack_main/taskname/tasknames_list.html')
def test_tasknames_list_get_user_context(self):
""" test list view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/')
# compare
self.assertEqual(str(response.context['user']), 'testuser_taskname')
def test_tasknames_detail_not_logged_in(self):
""" test detail view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# create url
destination = '/login/?next=' + urllib.parse.quote('/tasknames/' + str(taskname_1.taskname_id), safe='')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id), follow=True)
# compare
self.assertRedirects(response, destination, status_code=302, target_status_code=200)
def test_tasknames_detail_logged_in(self):
""" test detail view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id))
# compare
self.assertEqual(response.status_code, 200)
def test_tasknames_detail_template(self):
""" test detail view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id))
# compare
self.assertTemplateUsed(response, 'dfirtrack_main/taskname/tasknames_detail.html')
def test_tasknames_detail_get_user_context(self):
""" test detail view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id))
# compare
self.assertEqual(str(response.context['user']), 'testuser_taskname')
def test_tasknames_add_not_logged_in(self):
""" test add view """
# create url
destination = '/login/?next=' + urllib.parse.quote('/tasknames/add/', safe='')
# get response
response = self.client.get('/tasknames/add/', follow=True)
# compare
self.assertRedirects(response, destination, status_code=302, target_status_code=200)
def test_tasknames_add_logged_in(self):
""" test add view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/add/')
# compare
self.assertEqual(response.status_code, 200)
def test_tasknames_add_template(self):
""" test add view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/add/')
# compare
self.assertTemplateUsed(response, 'dfirtrack_main/taskname/tasknames_add.html')
def test_tasknames_add_get_user_context(self):
""" test add view """
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/add/')
# compare
self.assertEqual(str(response.context['user']), 'testuser_taskname')
def test_tasknames_edit_not_logged_in(self):
""" test edit view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# create url
destination = '/login/?next=' + urllib.parse.quote('/tasknames/' + str(taskname_1.taskname_id) + '/edit/', safe='')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id) + '/edit/', follow=True)
# compare
self.assertRedirects(response, destination, status_code=302, target_status_code=200)
def test_tasknames_edit_logged_in(self):
""" test edit view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id) + '/edit/')
# compare
self.assertEqual(response.status_code, 200)
def test_tasknames_edit_template(self):
""" test edit view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id) + '/edit/')
# compare
self.assertTemplateUsed(response, 'dfirtrack_main/taskname/tasknames_edit.html')
def test_tasknames_edit_get_user_context(self):
""" test edit view """
# get object
taskname_1 = Taskname.objects.get(taskname_name='taskname_1')
# login testuser
login = self.client.login(username='testuser_taskname', password='7xajmDLqQh1hs8i5PAx7')
# get response
response = self.client.get('/tasknames/' + str(taskname_1.taskname_id) + '/edit/')
# compare
self.assertEqual(str(response.context['user']), 'testuser_taskname')
# tests/plantcv/transform/test_color_correction.py (ygarrot/plantcv, MIT license)
import pytest
import os
import cv2
import numpy as np
from plantcv.plantcv.transform import (get_color_matrix, get_matrix_m, calc_transformation_matrix, apply_transformation_matrix,
save_matrix, load_matrix, correct_color, create_color_card_mask, quick_color_check,
find_color_card)
from plantcv.plantcv import outputs
def test_get_color_matrix(transform_test_data):
"""Test for PlantCV."""
# load in target_matrix
matrix_compare = transform_test_data.load_npz(transform_test_data.target_matrix_file)
# Read in rgb_img and gray-scale mask
rgb_img = cv2.imread(transform_test_data.target_img)
mask = cv2.imread(transform_test_data.colorcard_mask, -1)
# The result should be a len(np.unique(mask))-1 x 4 matrix
_, matrix = get_color_matrix(rgb_img, mask)
assert np.array_equal(matrix, matrix_compare)
def test_get_color_matrix_img(transform_test_data):
"""Test for PlantCV."""
# Read in two gray-scale images
rgb_img = cv2.imread(transform_test_data.colorcard_mask, -1)
mask = cv2.imread(transform_test_data.colorcard_mask, -1)
# The input for rgb_img needs to be an RGB image
with pytest.raises(RuntimeError):
_, _ = get_color_matrix(rgb_img, mask)
def test_get_color_matrix_mask(transform_test_data):
"""Test for PlantCV."""
# Read in two gray-scale images
rgb_img = cv2.imread(transform_test_data.target_img)
mask = cv2.imread(transform_test_data.colorcard_mask)
# The input for mask needs to be a single-channel gray-scale image
with pytest.raises(RuntimeError):
_, _ = get_color_matrix(rgb_img, mask)
def test_get_matrix_m(transform_test_data):
"""Test for PlantCV."""
# load in comparison matrices
matrix_compare_m = transform_test_data.load_npz(transform_test_data.matrix_m1_file)
matrix_compare_b = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
# read in matrices
t_matrix = transform_test_data.load_npz(transform_test_data.target_matrix_file)
s_matrix = transform_test_data.load_npz(transform_test_data.source1_matrix_file)
# apply matrices to function
_, matrix_m, matrix_b = get_matrix_m(t_matrix, s_matrix)
matrix_compare_m = np.rint(matrix_compare_m)
matrix_compare_b = np.rint(matrix_compare_b)
matrix_m = np.rint(matrix_m)
matrix_b = np.rint(matrix_b)
assert np.array_equal(matrix_m, matrix_compare_m) and np.array_equal(matrix_b, matrix_compare_b)
def test_get_matrix_m_unequal_data(transform_test_data):
"""Test for PlantCV."""
# load in comparison matrices
matrix_compare_m = transform_test_data.load_npz(transform_test_data.matrix_m2_file)
matrix_compare_b = transform_test_data.load_npz(transform_test_data.matrix_b2_file)
# read in matrices
t_matrix = transform_test_data.load_npz(transform_test_data.target_matrix_file)
s_matrix = transform_test_data.load_npz(transform_test_data.source2_matrix_file)
# apply matrices to function
_, matrix_m, matrix_b = get_matrix_m(t_matrix, s_matrix)
matrix_compare_m = np.rint(matrix_compare_m)
matrix_compare_b = np.rint(matrix_compare_b)
matrix_m = np.rint(matrix_m)
matrix_b = np.rint(matrix_b)
assert np.array_equal(matrix_m, matrix_compare_m) and np.array_equal(matrix_b, matrix_compare_b)
def test_calc_transformation_matrix(transform_test_data):
"""Test for PlantCV."""
# load in comparison matrices
matrix_compare = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# read in matrices
matrix_m = transform_test_data.load_npz(transform_test_data.matrix_m1_file)
matrix_b = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
# apply to function
_, matrix_t = calc_transformation_matrix(matrix_m, matrix_b)
matrix_t = np.rint(matrix_t)
matrix_compare = np.rint(matrix_compare)
assert np.array_equal(matrix_t, matrix_compare)
def test_calc_transformation_matrix_b_incorrect(transform_test_data):
"""Test for PlantCV."""
# read in matrices
matrix_m = transform_test_data.load_npz(transform_test_data.matrix_m1_file)
matrix_b = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
matrix_b = np.asmatrix(matrix_b, float)
with pytest.raises(RuntimeError):
_, _ = calc_transformation_matrix(matrix_m, matrix_b.T)
def test_calc_transformation_matrix_not_mult(transform_test_data):
"""Test for PlantCV."""
# read in matrices
matrix_m = transform_test_data.load_npz(transform_test_data.matrix_m1_file)
matrix_b = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
with pytest.raises(RuntimeError):
_, _ = calc_transformation_matrix(matrix_m, matrix_b[:3])
def test_calc_transformation_matrix_not_mat(transform_test_data):
"""Test for PlantCV."""
# read in matrices
matrix_m = transform_test_data.load_npz(transform_test_data.matrix_m1_file)
matrix_b = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
with pytest.raises(RuntimeError):
_, _ = calc_transformation_matrix(matrix_m[:, 1], matrix_b[:, 1])
def test_apply_transformation(transform_test_data):
"""Test for PlantCV."""
# load corrected image to compare
corrected_compare = cv2.imread(transform_test_data.source_corrected)
# read in matrices
matrix_t = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# read in images
target_img = cv2.imread(transform_test_data.target_img)
source_img = cv2.imread(transform_test_data.source1_img)
corrected_img = apply_transformation_matrix(source_img, target_img, matrix_t)
# assert source and corrected have same shape
assert np.array_equal(corrected_img, corrected_compare)
def test_apply_transformation_incorrect_t(transform_test_data):
"""Test for PlantCV."""
# read in matrices
matrix_t = transform_test_data.load_npz(transform_test_data.matrix_b1_file)
# read in images
target_img = cv2.imread(transform_test_data.target_img)
source_img = cv2.imread(transform_test_data.source1_img)
with pytest.raises(RuntimeError):
_ = apply_transformation_matrix(source_img, target_img, matrix_t)
def test_apply_transformation_incorrect_img(transform_test_data):
"""Test for PlantCV."""
# read in matrices
matrix_t = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# read in images
target_img = cv2.imread(transform_test_data.target_img)
source_img = cv2.imread(transform_test_data.colorcard_mask, -1)
with pytest.raises(RuntimeError):
_ = apply_transformation_matrix(source_img, target_img, matrix_t)
def test_save_matrix(transform_test_data, tmpdir):
"""Test for PlantCV."""
# Create a test tmp directory
cache_dir = tmpdir.mkdir("cache")
# read in matrix
matrix_t = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# .npz filename
filename = os.path.join(cache_dir, 'test.npz')
save_matrix(matrix_t, filename)
assert os.path.exists(filename) is True
def test_save_matrix_incorrect_filename(transform_test_data):
"""Test for PlantCV."""
# read in matrix
matrix_t = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# .npz filename
filename = "test"
with pytest.raises(RuntimeError):
save_matrix(matrix_t, filename)
def test_load_matrix(transform_test_data):
"""Test for PlantCV."""
# read in matrix_t
matrix_t = transform_test_data.load_npz(transform_test_data.transformation_matrix_file)
# test load function with matrix_t
matrix_t_loaded = load_matrix(transform_test_data.transformation_matrix_file)
assert np.array_equal(matrix_t, matrix_t_loaded)
def test_correct_color(transform_test_data, tmpdir):
"""Test for PlantCV."""
# Create a test tmp directory
cache_dir = tmpdir.mkdir("cache")
# load corrected image to compare
corrected_compare = cv2.imread(transform_test_data.source_corrected)
# Read in target, source, and gray-scale mask
target_img = cv2.imread(transform_test_data.target_img)
source_img = cv2.imread(transform_test_data.source1_img)
mask = cv2.imread(transform_test_data.colorcard_mask, -1)
_, _, _, corrected_img = correct_color(target_img, mask, source_img, mask, cache_dir)
# assert source and corrected have same shape
assert all([np.array_equal(corrected_img, corrected_compare),
os.path.exists(os.path.join(cache_dir, "target_matrix.npz")) is True,
os.path.exists(os.path.join(cache_dir, "source_matrix.npz")) is True,
os.path.exists(os.path.join(cache_dir, "transformation_matrix.npz")) is True])
def test_correct_color_output_dne(transform_test_data, tmpdir):
"""Test for PlantCV."""
# Create a test tmp directory
tmp_dir = tmpdir.mkdir("cache")
cache_dir = os.path.join(tmp_dir, "outputs")
# load corrected image to compare
corrected_compare = cv2.imread(transform_test_data.source_corrected)
# Read in target, source, and gray-scale mask
target_img = cv2.imread(transform_test_data.target_img)
source_img = cv2.imread(transform_test_data.source1_img)
mask = cv2.imread(transform_test_data.colorcard_mask, -1)
_, _, _, corrected_img = correct_color(target_img, mask, source_img, mask, cache_dir)
# assert source and corrected have same shape
assert all([np.array_equal(corrected_img, corrected_compare),
os.path.exists(os.path.join(cache_dir, "target_matrix.npz")) is True,
os.path.exists(os.path.join(cache_dir, "source_matrix.npz")) is True,
os.path.exists(os.path.join(cache_dir, "transformation_matrix.npz")) is True])
def test_create_color_card_mask(transform_test_data):
"""Test for PlantCV."""
# Load target image
rgb_img = cv2.imread(transform_test_data.target_img)
mask = create_color_card_mask(rgb_img=rgb_img, radius=6, start_coord=(166, 166), spacing=(21, 21), nrows=6, ncols=4,
exclude=[20, 0])
assert all([i == j] for i, j in zip(np.unique(mask), np.array([0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110,
120, 130, 140, 150, 160, 170, 180, 190, 200, 210,
220], dtype=np.uint8)))
def test_quick_color_check(transform_test_data):
"""Test for PlantCV."""
# Load target image
target_matrix = transform_test_data.load_npz(transform_test_data.target_matrix_file)
source_matrix = transform_test_data.load_npz(transform_test_data.source1_matrix_file)
quick_color_check(target_matrix, source_matrix, num_chips=22)
assert True
def test_find_color_card(transform_test_data):
"""Test for PlantCV."""
# Load rgb image
rgb_img = cv2.imread(transform_test_data.target_img)
_, start, space = find_color_card(rgb_img=rgb_img, threshold_type='adaptgauss', blurry=False, threshvalue=90)
assert start == (210, 212) and space == (8, 8)
def test_find_color_card_optional_parameters(transform_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
# Load rgb image
rgb_img = cv2.imread(transform_test_data.colorcard_img)
# Test with threshold ='normal'
_, _, _ = find_color_card(rgb_img=rgb_img, threshold_type='normal', blurry=True, background='light',
threshvalue=90, label="prefix")
assert int(outputs.observations["prefix"]["color_chip_size"]["value"]) == 15626
def test_find_color_card_otsu(transform_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
# Load rgb image
rgb_img = cv2.imread(transform_test_data.colorcard_img)
# Test with threshold ='normal'
_, _, _ = find_color_card(rgb_img=rgb_img, threshold_type='otsu', blurry=True, background='light',
threshvalue=90, label="prefix")
assert int(outputs.observations["prefix"]["color_chip_size"]["value"]) == 15132
def test_find_color_card_optional_size_parameters(transform_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
# Load rgb image
rgb_img = cv2.imread(transform_test_data.colorcard_img)
_, _, _ = find_color_card(rgb_img=rgb_img, record_chip_size="mean")
assert int(outputs.observations["default"]["color_chip_size"]["value"]) == 15515
def test_find_color_card_optional_size_parameters_none(transform_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
# Load rgb image
rgb_img = cv2.imread(transform_test_data.colorcard_img)
_, _, _ = find_color_card(rgb_img=rgb_img, record_chip_size=None)
assert outputs.observations.get("default") is None
def test_find_color_card_bad_record_chip_size(transform_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
# Load rgb image
rgb_img = cv2.imread(transform_test_data.target_img)
_, _, _ = find_color_card(rgb_img=rgb_img, record_chip_size='averageeeed')
assert outputs.observations["default"]["color_chip_size"]["value"] is None
def test_find_color_card_bad_thresh_input(transform_test_data):
"""Test for PlantCV."""
# Load rgb image
rgb_img = cv2.imread(transform_test_data.target_img)
with pytest.raises(RuntimeError):
_, _, _ = find_color_card(rgb_img=rgb_img, threshold_type='gaussian')
def test_find_color_card_bad_background_input(transform_test_data):
"""Test for PlantCV."""
# Load rgb image
rgb_img = cv2.imread(transform_test_data.target_img)
with pytest.raises(RuntimeError):
_, _, _ = find_color_card(rgb_img=rgb_img, background='lite')
def test_find_color_card_none_found(transform_test_data):
"""Test for PlantCV."""
# Load rgb image
rgb_img = cv2.imread(transform_test_data.target_img)
with pytest.raises(RuntimeError):
_, _, _ = find_color_card(rgb_img=rgb_img, threshold_type="otsu")
# tools/pythonpkg/tests/fast/types/test_nested.py (AldoMyrtaj/duckdb, MIT license)
import duckdb
class TestNested(object):
def test_lists(self, duckdb_cursor):
duckdb_conn = duckdb.connect()
result = duckdb_conn.execute("SELECT LIST_VALUE(1, 2, 3, 4) ").fetchall()
assert result == [([1, 2, 3, 4],)]
result = duckdb_conn.execute("SELECT LIST_VALUE() ").fetchall()
assert result == [([],)]
result = duckdb_conn.execute("SELECT LIST_VALUE(1, 2, 3, NULL) ").fetchall()
assert result == [([1, 2, 3, None],)]
def test_nested_lists(self, duckdb_cursor):
duckdb_conn = duckdb.connect()
result = duckdb_conn.execute("SELECT LIST_VALUE(LIST_VALUE(1, 2, 3, 4), LIST_VALUE(1, 2, 3, 4)) ").fetchall()
assert result == [([[1, 2, 3, 4], [1, 2, 3, 4]],)]
result = duckdb_conn.execute("SELECT LIST_VALUE(LIST_VALUE(1, 2, 3, 4), LIST_VALUE(1, 2, 3, NULL)) ").fetchall()
assert result == [([[1, 2, 3, 4], [1, 2, 3, None]],)]
def test_struct(self, duckdb_cursor):
duckdb_conn = duckdb.connect()
result = duckdb_conn.execute("SELECT STRUCT_PACK(a := 42, b := 43)").fetchall()
assert result == [({'a': 42, 'b': 43},)]
result = duckdb_conn.execute("SELECT STRUCT_PACK(a := 42, b := NULL)").fetchall()
assert result == [({'a': 42, 'b': None},)]
def test_nested_struct(self, duckdb_cursor):
duckdb_conn = duckdb.connect()
result = duckdb_conn.execute("SELECT STRUCT_PACK(a := 42, b := LIST_VALUE(10, 9, 8, 7))").fetchall()
assert result == [({'a': 42, 'b': [10,9,8,7]},)]
result = duckdb_conn.execute("SELECT STRUCT_PACK(a := 42, b := LIST_VALUE(10, 9, 8, NULL))").fetchall()
assert result == [({'a': 42, 'b': [10,9,8,None]},)]
def test_map(self, duckdb_cursor):
duckdb_conn = duckdb.connect()
result = duckdb_conn.execute("select MAP(LIST_VALUE(1, 2, 3, 4),LIST_VALUE(10, 9, 8, 7))").fetchall()
assert result == [({'key': [1,2,3,4], 'value': [10,9,8,7]},)]
result = duckdb_conn.execute("select MAP(LIST_VALUE(1, 2, 3, 4),LIST_VALUE(10, 9, 8, NULL))").fetchall()
assert result == [({'key': [1,2,3,4], 'value': [10,9,8,None]},)]
result = duckdb_conn.execute("SELECT MAP() ").fetchall()
assert result == [({'key': [], 'value': []},)]
# keras/applications/inception_resnet_v2.py (ikingye/keras, MIT license)
from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2
from tensorflow.keras.applications.inception_resnet_v2 import decode_predictions
from tensorflow.keras.applications.inception_resnet_v2 import preprocess_input

# autogl/datasets/utils/_split_edges/__init__.py (dedsec-9/AutoGL, MIT)
from .split_edges import split_edges

# wazimap_ng/datasets/tasks/__init__.py (arghyaiitb/wazimap-ng, Apache-2.0)
from .process_uploaded_file import process_uploaded_file
from .indicator_data_extraction import indicator_data_extraction
from .delete_data import delete_data

# frimcla/shallowmodels/classificationModelsMultiClass.py (ManuGar/ObjectClassificationByTransferLearning, MIT)
from scipy.stats import randint as sp_randint
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, ExtraTreesClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.multiclass import OneVsRestClassifier
from skmultilearn.problem_transform import BinaryRelevance
from skmultilearn.problem_transform import ClassifierChain
from skmultilearn.problem_transform import LabelPowerset
from skmultilearn.adapt import MLTSVM
from skmultilearn.adapt import MLkNN
class classifierModel:
def getModel(self):
pass
def getParams(self):
pass
def getNIterations(self):
pass
def setParams(self, params):
pass
def setNIterations(self,nIterations):
pass
class RandomForest(classifierModel):
    def __init__(self, random_state=84, n_estimators=20,
                 # BinaryRelevance exposes its wrapped estimator as "classifier",
                 # so the search-grid keys need the classifier__ prefix
                 params={"classifier__max_depth": [3, None],
                         "classifier__max_features": [1, 3, 10],
                         "classifier__min_samples_leaf": [1, 3, 10],
                         "classifier__bootstrap": [True, False],
                         "classifier__criterion": ["gini", "entropy"]}, niterations=10):
self.model = BinaryRelevance(RandomForestClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class SVM(classifierModel):
    def __init__(self, random_state=84,
                 params={'classifier__C': [1, 10, 100, 1000],
                         'classifier__gamma': [0.001, 0.0001],
                         'classifier__kernel': ['rbf', 'linear']},
                 niterations=10):
self.model = BinaryRelevance(SVC(random_state=random_state))
self.params = params
self.niterations= niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class KNN(classifierModel):
    def __init__(self, params={'classifier__n_neighbors': range(5, 27, 2)}, niterations=10):
self.model = BinaryRelevance(KNeighborsClassifier())
self.params = params
        self.niterations = niterations
    def getModel(self):
        return self.model
    def getParams(self):
        return self.params
    def getNIterations(self):
        return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class LogRegression(classifierModel):
    def __init__(self, rdm_state=84, params={"classifier__C": [0.1, 1.0, 10.0, 100.0, 1000.0, 10000.0]},
niterations=5):
self.model = BinaryRelevance(LogisticRegression(random_state=rdm_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class MultiLayerPerceptron(classifierModel):
    def __init__(self, random_state=84,
                 params={'classifier__activation': ['identity', 'logistic', 'tanh', 'relu'],
                         'classifier__solver': ['lbfgs', 'sgd', 'adam'],
                         # sp_randint draws integers only, so the float alpha grid is listed explicitly
                         'classifier__alpha': [0.0001, 0.001, 0.01, 0.1, 1.0],
                         'classifier__learning_rate': ['constant', 'invscaling', 'adaptive'],
                         'classifier__momentum': [0.9, 0.95, 0.99]},
                 niterations=5):
self.model = BinaryRelevance(MLPClassifier(random_state=random_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class GradientBoost(classifierModel):
def __init__(self, random_state=84, n_estimators=20,
params={"classifier__max_depth": [3, None],
"classifier__max_features": [1, 3, 10],
"classifier__min_samples_leaf": [1, 3, 10]},
niterations=10):
self.model = BinaryRelevance(GradientBoostingClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ExtraTrees(classifierModel):
    def __init__(self, random_state=84, n_estimators=20,
                 params={'classifier__n_estimators': [250, 500, 1000, 1500],
                         'classifier__min_samples_split': [2, 4, 8]}, niterations=10):
self.model = BinaryRelevance(ExtraTreesClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccRandomForest(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params = {"classifier__max_depth": [3, None],
"classifier__max_features": [1, 3, 10],
"classifier__min_samples_leaf": [1, 3, 10],
"classifier__bootstrap": [True, False],
"classifier__criterion": ["gini", "entropy"]}, niterations=10):
self.model = ClassifierChain(RandomForestClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccSVM(classifierModel):
def __init__(self, random_state=84, params={'classifier__C': [1, 10, 100, 1000],
'classifier__gamma': [0.001, 0.0001],
'classifier__kernel': ['rbf', 'linear']},
niterations=10):
self.model = ClassifierChain(SVC(random_state=random_state))
self.params = params
self.niterations= niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccKNN(classifierModel):
def __init__(self,params={'classifier__n_neighbors': range(5, 27,2)}, niterations=10):
self.model = ClassifierChain(KNeighborsClassifier())
self.params = params
        self.niterations = niterations
    def getModel(self):
        return self.model
    def getParams(self):
        return self.params
    def getNIterations(self):
        return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccLogRegression(classifierModel):
def __init__(self, rdm_state=84,params={"classifier__C": [0.1, 1.0, 10.0, 100.0, 1000.0, 10000.0]},
niterations=5):
self.model = ClassifierChain(LogisticRegression(random_state=rdm_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccMultiLayerPerceptron(classifierModel):
    def __init__(self, random_state=84,
                 params={'classifier__activation': ['identity', 'logistic', 'tanh', 'relu'],
                         'classifier__solver': ['lbfgs', 'sgd', 'adam'],
                         # sp_randint draws integers only, so the float alpha grid is listed explicitly
                         'classifier__alpha': [0.0001, 0.001, 0.01, 0.1, 1.0],
                         'classifier__learning_rate': ['constant', 'invscaling', 'adaptive'],
                         'classifier__momentum': [0.9, 0.95, 0.99]},
                 niterations=5):
self.model = ClassifierChain(MLPClassifier(random_state=random_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccGradientBoost(classifierModel):
def __init__(self, random_state=84, n_estimators=20,
params={"classifier__max_depth": [3, None],
"classifier__max_features": [1, 3, 10],
"classifier__min_samples_leaf": [1, 3, 10]},
niterations=10):
self.model = ClassifierChain(GradientBoostingClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class ccExtraTrees(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params={'classifier__n_estimators': [250, 500, 1000, 1500],
'classifier__min_samples_split': [2, 4, 8]}, niterations=10):
self.model = ClassifierChain(ExtraTreesClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpRandomForest(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params = {"classifier__max_depth": [3, None],
"classifier__max_features": [1, 3, 10],
"classifier__min_samples_leaf": [1, 3, 10],
"classifier__bootstrap": [True, False],
"classifier__criterion": ["gini", "entropy"]}, niterations=10):
self.model = LabelPowerset(RandomForestClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpSVM(classifierModel):
def __init__(self, random_state=84, params={'classifier__C': [1, 10, 100, 1000],
'classifier__gamma': [0.001, 0.0001],
'classifier__kernel': ['rbf', 'linear']},
niterations=10):
self.model = LabelPowerset(SVC(random_state=random_state))
self.params = params
self.niterations= niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpKNN(classifierModel):
def __init__(self,params={'classifier__n_neighbors': range(5, 27,2)}, niterations=10):
self.model = LabelPowerset(KNeighborsClassifier())
self.params = params
        self.niterations = niterations
    def getModel(self):
        return self.model
    def getParams(self):
        return self.params
    def getNIterations(self):
        return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpLogRegression(classifierModel):
def __init__(self, rdm_state=84,params={"classifier__C": [0.1, 1.0, 10.0, 100.0, 1000.0, 10000.0]},
niterations=5):
self.model = LabelPowerset(LogisticRegression(random_state=rdm_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpMultiLayerPerceptron(classifierModel):
    def __init__(self, random_state=84,
                 params={'classifier__activation': ['identity', 'logistic', 'tanh', 'relu'],
                         'classifier__solver': ['lbfgs', 'sgd', 'adam'],
                         # sp_randint draws integers only, so the float alpha grid is listed explicitly
                         'classifier__alpha': [0.0001, 0.001, 0.01, 0.1, 1.0],
                         'classifier__learning_rate': ['constant', 'invscaling', 'adaptive'],
                         'classifier__momentum': [0.9, 0.95, 0.99]},
                 niterations=5):
self.model = LabelPowerset(MLPClassifier(random_state=random_state))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpGradientBoost(classifierModel):
def __init__(self, random_state=84, n_estimators=20,
params={"classifier__max_depth": [3, None],
"classifier__max_features": [1, 3, 10],
"classifier__min_samples_leaf": [1, 3, 10]},
niterations=10):
self.model = LabelPowerset(GradientBoostingClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class lpExtraTrees(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params={'classifier__n_estimators': [250, 500, 1000, 1500],
'classifier__min_samples_split': [2, 4, 8]}, niterations=10):
self.model = LabelPowerset(ExtraTreesClassifier(random_state=random_state,n_estimators=n_estimators))
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class mMLkNN(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params={'k': range(5,27,2),
's': [0.5, 0.7, 1.0]}, niterations=10):
self.model = MLkNN()
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
class mMLTSVM(classifierModel):
def __init__(self, random_state=84, n_estimators=20,params={'c_k': [2**i for i in range(-7, 7, 2)]}, niterations=10):
self.model = MLTSVM()
self.params = params
self.niterations = niterations
def getModel(self):
return self.model
def getParams(self):
return self.params
def getNIterations(self):
return self.niterations
def setParams(self, params):
self.params = params
def setNIterations(self, nIterations):
self.niterations = nIterations
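Every wrapper class in this module repeats the same five getter/setter methods verbatim. A hedged refactor sketch of how one shared base class could carry that boilerplate, with a placeholder string standing in for the real `BinaryRelevance(KNeighborsClassifier())` so the sketch stays dependency-free (class names `ClassifierModelBase`/`KNNSketch` are mine):

```python
class ClassifierModelBase:
    """Stores a wrapped model, its hyperparameter grid, and the number of
    random-search iterations; subclasses only supply their defaults."""
    def __init__(self, model, params, niterations):
        self.model = model
        self.params = params
        self.niterations = niterations

    def getModel(self):
        return self.model

    def getParams(self):
        return self.params

    def getNIterations(self):
        return self.niterations

    def setParams(self, params):
        self.params = params

    def setNIterations(self, nIterations):
        self.niterations = nIterations


class KNNSketch(ClassifierModelBase):
    # In the real module the model would be BinaryRelevance(KNeighborsClassifier());
    # a placeholder string keeps this sketch runnable without scikit-multilearn.
    def __init__(self, params=None, niterations=10):
        if params is None:
            params = {'classifier__n_neighbors': range(5, 27, 2)}
        super().__init__('binary-relevance-knn', params, niterations)
```

Each of the twenty concrete classes would then shrink to an `__init__` that passes its model and default grid up, which also removes the risk of attribute typos such as `niteraciones` drifting out of sync with the shared getters.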

# ml/code/svm/classifiers/HP_KNN_LCS.py (cyberdeception/deepdig, Apache-2.0)
import config
from KNN_LCS import KNN_LCS
import numpy as np
from Trace import Trace
from EventTrace import EventTrace
class HP_KNN_LCS:
@staticmethod
def traceToInstance( trace ):
instance = []
if isinstance(trace, Trace): # if pcap files, get packet lengths
if trace.getPacketCount()==0:
instance.append(0)
instance.append('webpage'+str(trace.getId()))
return instance
for packet in trace.getPackets():
instance.append(packet.getLength())
elif isinstance(trace, EventTrace): # if sysdig (scap files), get syscalls
if trace.getEventCount() == 0:
instance.append(0)
instance.append('webpage'+str(trace.getId()))
return instance
for event in trace.getEvents():
# IO
#if event.getSystemcallIndex() == 0 or event.getSystemcallIndex() == 1: # commented as done in reading sysdig file
instance.append(event.getSystemcallIndex())
# Label
instance.append('webpage'+str(trace.getId()))
return instance
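The method above flattens a trace into one feature vector (packet lengths for pcap traces, syscall indices for sysdig traces) and appends a `webpageN` label as the final element; an empty trace becomes `[0, label]`. A standalone sketch of that instance layout (the function name `to_instance` is mine):

```python
def to_instance(values, trace_id):
    """Build a KNN_LCS-style instance: feature values followed by the label.
    Empty traces are encoded as a single zero, mirroring traceToInstance."""
    instance = list(values) if values else [0]
    instance.append('webpage' + str(trace_id))
    return instance
```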
@staticmethod
def classify( runID, trainingSet, testingSet ):
#testing
#print trainingSet
#print "\n"
#print testingSet
Xtrain = []
Ytrain = []
for instance in trainingSet:
Xtrain.append(instance[:-1])
Ytrain.append(instance[-1]) # label
Xtest = []
Ytest = []
for instance in testingSet:
Xtest.append(instance[:-1])
Ytest.append(instance[-1]) # label
knn_lcs_obj = KNN_LCS(Xtrain, Xtest, Ytrain, Ytest, neighbors=config.NUM_NEIGHBORS)
[accuracy,debugInfo] = knn_lcs_obj.getAccuracyDebugInfo()
return [accuracy,debugInfo]
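`classify` recovers features and labels by slicing the last element off each instance. The same split can be written as two list comprehensions (helper name `split_features_labels` is mine, a sketch rather than the project's API):

```python
def split_features_labels(instances):
    """Separate KNN_LCS instances into feature vectors and labels;
    the label is always the final element of each instance."""
    X = [inst[:-1] for inst in instances]
    y = [inst[-1] for inst in instances]
    return X, y
```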
'''
Xtrain = np.array([[1,2,6,5,4,8], [2,1,6,5,4,4], [2,1,6,5], [2,1,6,5,4,4,7,6], [2,1,6,5,4],[2,3,6,5,4,4]])
Xtest = np.array([[2,1,6,5,4,4],[2,1,6,5,4]])
#Ytrain = np.array([[0], [0], [0], [1], [1], [1]])
#Ytrain = [0, 0, 0, 1, 1, 1]
Ytrain = ['webpage0', 'webpage0', 'webpage0', 'webpage1', 'webpage1', 'webpage1']
#Ytest = np.array([[1], [0]])
Ytest = ['webpage0', 'webpage0']
knn_lcs_obj = KNN_LCS(Xtrain, Xtest, Ytrain, Ytest, neighbors=3)
#knn_lcs_obj.calcHP_KNN_LCS()
print knn_lcs_obj.getAccuracyDebugInfo()
Xtrain = np.array([ [148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 788, 100, 100, 1500, 484, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 580, 1300, 500, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 100, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 252, 884, 532, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 100, 1500, 1500, 1500, 884, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 252, 1500, 92, 1500, 1500, 1364, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 100, 1076, 676, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 140, 1500, 1500, 1500, 876, 1500, 1500, 1500, 1500, 212, 1500, 252, 100, 564, 196, 148, 84, 100, 148, 100, 1500, 508, 84, 1500, 972, 1012, 1492, 548, 388, 628, 580, 1028, 84, 1500, 1500, 692, 1500, 572, 1500, 1500, 180, 1156, 500, 1500, 148, 380, 100, 1124, 116, 148, 1500, 92, 1500, 1500, 1500, 1404, 340, 1500, 1500, 292, 148, 1380, 100, 1108, 484, 148, 84, 100, 1460, 84, 148, 1500, 908, 532, 1500, 524, 836, 100, 580, 84, 148, 100, 148, 100, 148, 100, 1500, 1180, 148, 116, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 
92, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 908, 1500, 1500, 100, 1500, 700, 1500, 1500, 212, 1500, 108, 148, 1500, 108, 1500, 1500, 1500, 220, 1500, 1500, 164, 596, 596, 148, 100, 1500, 156, 148, 196, 612, 148, 788, 132, 1500, 1500, 468, 84, 148, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1388, 452, 1460, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 956, 100, 1500, 92, 980, 516, 1076, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 892, 516, 484, 1500, 1500, 676, 644, 116, 580, 84, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 92, 1500, 1180, 948, 468, 500, 1236, 500, 1284, 1500, 1084, 500, 1500, 500, 92, 1500, 1388, 964, 1500, 92, 1500, 1500, 1500, 796, 964, 1500, 92, 1500, 1500, 1500, 1500, 1500, 92, 1124, 1500, 92, 1500, 1148, 100, 1500, 1028, 92, 1500, 1500, 1108, 500, 772, 500, 980, 1500, 92, 1500, 1500, 1500, 172, 516, 660, 484, 388, 500, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 388, 1500, 1500, 132, 500, 420, 500, 756, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 140, 100, 1500, 1500, 1500, 172, 948, 772, 532, 772, 1500, 1500, 276, 500, 1500, 1500, 100, 628, 500, 1500, 500, 92, 1500, 828, 532, 1500, 92, 1500, 1500, 84, 484, 756, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 836, 1500, 268, 980, 388, 1500, 364, 500, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1436, 1500, 492, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1500, 452, 148, 100, 532, 820, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 580, 1500, 92, 1500, 1500, 1476, 148, 100, 596, 1028, 84, 84, 148, 84, 84, 100, 580, 804, 148, 84, 100, 148, 100, 1108, 84, 1396, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1308, 1500, 92, 1500, 940, 596, 1500, 596, 
92, 1500, 1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 84, 148, 100, 148, 100, 1092, 340, 1500, 1500, 740, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 1500, 1364, 148, 100, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 420, 484, 1500, 1500, 1500, 1500, 1412, 1500, 1500, 1500, 140, 100, 1268, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1364, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 692, 500, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 844, 1500, 100, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1404, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 172, 1500, 652, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 148, 100, 1476, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 100, 1500, 1500, 1500, 180, 1500, 1500, 1500, 876, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 180, 1500, 700, 964, 1332, 532, 1316, 596, 84, 148, 1492, 100, 148, 100, 1500, 988, 148, 1500, 92, 1500, 1500, 1500, 1356, 1500, 1500, 180, 1252, 148, 100, 996, 484, 148, 84, 1500, 1500, 148, 1500, 380, 1380, 1172, 100, 148, 100, 1428, 1500, 148, 92, 1500, 1500, 596, 1500, 556, 148, 100, 1500, 716, 148, 180, 1500, 1388, 1500, 1500, 564, 1500, 108, 1500, 60, 100, 1500, 172, 148, 116, 1500, 1500, 132, 1500, 1500, 1348, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 132, 1500, 1420, 100, 148, 
516, 452, 148, 100, 1500, 156, 148, 196, 564, 100, 148, 788, 132, 1500, 1436, 148, 180, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1004, 100, 1444, 1500, 92, 1500, 1500, 1500, 1100, 1500, 796, 100, 484, 516, 84, 116, 452, 1500, 1500, 1500, 1500, 1500, 1500, 916, 516, 1500, 92, 1500, 1260, 516, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1012, 148, 100, 580, 1108, 148, 84, 1500, 1500, 164, 1500, 108, 1500, 892, 1500, 172, 116, 1500, 1500, 996, 1500, 108, 948, 1500, 940, 1236, 100, 980, 948, 1500, 92, 1500, 860, 964, 1500, 92, 1500, 1500, 1396, 1236, 948, 1500, 92, 1500, 1500, 884, 1500, 204, 980, 1500, 92, 1500, 1500, 1500, 1500, 1500, 684, 964, 772, 148, 1060, 100, 1500, 60, 980, 1500, 1500, 132, 1500, 1500, 612, 932, 388, 500, 388, 500, 420, 500, 756, 500, 772, 532, 772, 500, 1500, 92, 1500, 236, 1500, 1500, 660, 516, 1500, 500, 92, 1500, 924, 484, 1500, 92, 1500, 1500, 84, 548, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 388, 148, 100, 1500, 60, 884, 500, 388, 500, 1500, 364, 532, 148, 868, 1076, 420, 148, 100, 148, 100, 1500, 540, 1500, 492, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 92, 1500, 1436, 1500, 1500, 1500, 316, 1500, 1500, 1500, 172, 1500, 1500, 932, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1220, 116, 84, 660, 1500, 92, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 100, 148, 100, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1172, 1500, 668, 596, 1028, 84, 84, 84, 84, 148, 100, 1060, 756, 836, 116, 84, 548, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 1500, 452, 148, 100, 148, 100, 1044, 724, 724, 596, 1500, 92, 1500, 1500, 692, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1500, 1500, 1500, 788, 1140, 1500, 92, 1500, 1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 188, 1500, 92, 
1500, 1500, 1500, 1292, 148, 100, 148, 100, 1092, 628, 1500, 1500, 420, 84, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 500, 1476, 148, 1476, 1476, 1476, 1500, 1420, 100, 484, 1500, 252, 484, 1500, 268, 148, 100, 468, 1500, 156, 468, 692, 468, 1500, 108, 1500, 1500, 1492, 148, 100, 148, 100, 916, 564, 1500, 764, 468, 1500, 484, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1100, 1500, 652, 484, 1500, 1196, 484, 1500, 484, 620, 1500, 732, 484, 1500, 484, 732, 1500, 484, 860, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1332, 484, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1492, 500, 740, 500, 1500, 92, 1500, 1500, 1428, 500, 148, 100, 500, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 644, 756, 84, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 204, 100, 1284, 84, 500, 1092, 84, 84, 84, 84, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 604, 100, 1300, 772, 84, 84, 84, 84, 1500, 92, 1500, 92, 1500, 1500, 132, 916, 500, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 876, 484, 1172, 1500, 1500, 132, 1500, 1500, 1300, 996, 1476, 1500, 556, 1500, 316, 948, 868, 1500, 332]
,[148, 100, 500, 1500, 1404, 1500, 1404, 148, 1500, 1276, 100, 484, 1500, 204, 484, 1500, 220, 148, 100, 468, 1500, 156, 468, 644, 468, 1500, 1500, 1500, 1452, 148, 100, 148, 100, 916, 500, 1500, 764, 468, 1500, 484, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 964, 1500, 652, 916, 1500, 92, 1124, 1500, 620, 916, 1500, 700, 1500, 732, 484, 1500, 812, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1364, 148, 100, 1500, 172, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1500, 108, 804, 964, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 332, 1500, 1500, 564, 1500, 1500, 212, 1500, 1500, 1500, 588, 1500, 92, 1500, 1500, 1500, 1292, 500, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 916, 500, 1500, 500, 92, 1500, 1500, 1500, 716, 1500, 1500, 132, 1500, 1500, 1364, 484, 1172, 1124, 116, 84, 148, 100, 500, 756, 148, 84, 100, 916, 84, 772, 84, 1060, 84, 1500, 1500, 452, 1500, 300, 948, 1500, 332, 772]]
)
print Xtrain
Xtrain = [ [148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 788, 100, 100, 1500, 484, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 580, 1300, 500, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 100, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 252, 884, 532, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 100, 1500, 1500, 1500, 884, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 252, 1500, 92, 1500, 1500, 1364, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 100, 1076, 676, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 140, 1500, 1500, 1500, 876, 1500, 1500, 1500, 1500, 212, 1500, 252, 100, 564, 196, 148, 84, 100, 148, 100, 1500, 508, 84, 1500, 972, 1012, 1492, 548, 388, 628, 580, 1028, 84, 1500, 1500, 692, 1500, 572, 1500, 1500, 180, 1156, 500, 1500, 148, 380, 100, 1124, 116, 148, 1500, 92, 1500, 1500, 1500, 1404, 340, 1500, 1500, 292, 148, 1380, 100, 1108, 484, 148, 84, 100, 1460, 84, 148, 1500, 908, 532, 1500, 524, 836, 100, 580, 84, 148, 100, 148, 100, 148, 100, 1500, 1180, 148, 116, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 
1500, 132, 1500, 1500, 1500, 1500, 1500, 908, 1500, 1500, 100, 1500, 700, 1500, 1500, 212, 1500, 108, 148, 1500, 108, 1500, 1500, 1500, 220, 1500, 1500, 164, 596, 596, 148, 100, 1500, 156, 148, 196, 612, 148, 788, 132, 1500, 1500, 468, 84, 148, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1388, 452, 1460, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 956, 100, 1500, 92, 980, 516, 1076, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 892, 516, 484, 1500, 1500, 676, 644, 116, 580, 84, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 92, 1500, 1180, 948, 468, 500, 1236, 500, 1284, 1500, 1084, 500, 1500, 500, 92, 1500, 1388, 964, 1500, 92, 1500, 1500, 1500, 796, 964, 1500, 92, 1500, 1500, 1500, 1500, 1500, 92, 1124, 1500, 92, 1500, 1148, 100, 1500, 1028, 92, 1500, 1500, 1108, 500, 772, 500, 980, 1500, 92, 1500, 1500, 1500, 172, 516, 660, 484, 388, 500, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 388, 1500, 1500, 132, 500, 420, 500, 756, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 140, 100, 1500, 1500, 1500, 172, 948, 772, 532, 772, 1500, 1500, 276, 500, 1500, 1500, 100, 628, 500, 1500, 500, 92, 1500, 828, 532, 1500, 92, 1500, 1500, 84, 484, 756, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 836, 1500, 268, 980, 388, 1500, 364, 500, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1436, 1500, 492, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1500, 452, 148, 100, 532, 820, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 580, 1500, 92, 1500, 1500, 1476, 148, 100, 596, 1028, 84, 84, 148, 84, 84, 100, 580, 804, 148, 84, 100, 148, 100, 1108, 84, 1396, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1308, 1500, 92, 1500, 940, 596, 1500, 596, 92, 1500, 
1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 84, 148, 100, 148, 100, 1092, 340, 1500, 1500, 740, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 1500, 1364, 148, 100, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 420, 484, 1500, 1500, 1500, 1500, 1412, 1500, 1500, 1500, 140, 100, 1268, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1364, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 692, 500, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 844, 1500, 100, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1404, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 172, 1500, 652, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 148, 100, 1476, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 100, 1500, 1500, 1500, 180, 1500, 1500, 1500, 876, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 180, 1500, 700, 964, 1332, 532, 1316, 596, 84, 148, 1492, 100, 148, 100, 1500, 988, 148, 1500, 92, 1500, 1500, 1500, 1356, 1500, 1500, 180, 1252, 148, 100, 996, 484, 148, 84, 1500, 1500, 148, 1500, 380, 1380, 1172, 100, 148, 100, 1428, 1500, 148, 92, 1500, 1500, 596, 1500, 556, 148, 100, 1500, 716, 148, 180, 1500, 1388, 1500, 1500, 564, 1500, 108, 1500, 60, 100, 1500, 172, 148, 116, 1500, 1500, 132, 1500, 1500, 1348, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 132, 1500, 1420, 100, 148, 
516, 452, 148, 100, 1500, 156, 148, 196, 564, 100, 148, 788, 132, 1500, 1436, 148, 180, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1004, 100, 1444, 1500, 92, 1500, 1500, 1500, 1100, 1500, 796, 100, 484, 516, 84, 116, 452, 1500, 1500, 1500, 1500, 1500, 1500, 916, 516, 1500, 92, 1500, 1260, 516, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1012, 148, 100, 580, 1108, 148, 84, 1500, 1500, 164, 1500, 108, 1500, 892, 1500, 172, 116, 1500, 1500, 996, 1500, 108, 948, 1500, 940, 1236, 100, 980, 948, 1500, 92, 1500, 860, 964, 1500, 92, 1500, 1500, 1396, 1236, 948, 1500, 92, 1500, 1500, 884, 1500, 204, 980, 1500, 92, 1500, 1500, 1500, 1500, 1500, 684, 964, 772, 148, 1060, 100, 1500, 60, 980, 1500, 1500, 132, 1500, 1500, 612, 932, 388, 500, 388, 500, 420, 500, 756, 500, 772, 532, 772, 500, 1500, 92, 1500, 236, 1500, 1500, 660, 516, 1500, 500, 92, 1500, 924, 484, 1500, 92, 1500, 1500, 84, 548, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 388, 148, 100, 1500, 60, 884, 500, 388, 500, 1500, 364, 532, 148, 868, 1076, 420, 148, 100, 148, 100, 1500, 540, 1500, 492, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 92, 1500, 1436, 1500, 1500, 1500, 316, 1500, 1500, 1500, 172, 1500, 1500, 932, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1220, 116, 84, 660, 1500, 92, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 100, 148, 100, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1172, 1500, 668, 596, 1028, 84, 84, 84, 84, 148, 100, 1060, 756, 836, 116, 84, 548, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 1500, 452, 148, 100, 148, 100, 1044, 724, 724, 596, 1500, 92, 1500, 1500, 692, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1500, 1500, 1500, 788, 1140, 1500, 92, 1500, 1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 188, 1500, 92, 
1500, 1500, 1500, 1292, 148, 100, 148, 100, 1092, 628, 1500, 1500, 420, 84, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 500, 1476, 148, 1476, 1476, 1476, 1500, 1420, 100, 484, 1500, 252, 484, 1500, 268, 148, 100, 468, 1500, 156, 468, 692, 468, 1500, 108, 1500, 1500, 1492, 148, 100, 148, 100, 916, 564, 1500, 764, 468, 1500, 484, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1100, 1500, 652, 484, 1500, 1196, 484, 1500, 484, 620, 1500, 732, 484, 1500, 484, 732, 1500, 484, 860, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1332, 484, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1492, 500, 740, 500, 1500, 92, 1500, 1500, 1428, 500, 148, 100, 500, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 644, 756, 84, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 204, 100, 1284, 84, 500, 1092, 84, 84, 84, 84, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 604, 100, 1300, 772, 84, 84, 84, 84, 1500, 92, 1500, 92, 1500, 1500, 132, 916, 500, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 876, 484, 1172, 1500, 1500, 132, 1500, 1500, 1300, 996, 1476, 1500, 556, 1500, 316, 948, 868, 1500, 332]
,[148, 100, 500, 1500, 1404, 1500, 1404, 148, 1500, 1276, 100, 484, 1500, 204, 484, 1500, 220, 148, 100, 468, 1500, 156, 468, 644, 468, 1500, 1500, 1500, 1452, 148, 100, 148, 100, 916, 500, 1500, 764, 468, 1500, 484, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 964, 1500, 652, 916, 1500, 92, 1124, 1500, 620, 916, 1500, 700, 1500, 732, 484, 1500, 812, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1364, 148, 100, 1500, 172, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1500, 108, 804, 964, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 332, 1500, 1500, 564, 1500, 1500, 212, 1500, 1500, 1500, 588, 1500, 92, 1500, 1500, 1500, 1292, 500, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 916, 500, 1500, 500, 92, 1500, 1500, 1500, 716, 1500, 1500, 132, 1500, 1500, 1364, 484, 1172, 1124, 116, 84, 148, 100, 500, 756, 148, 84, 100, 916, 84, 772, 84, 1060, 84, 1500, 1500, 452, 1500, 300, 948, 1500, 332, 772]]
#print Xtrain
Xtest = [ [148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 
1500, 1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 
1500, 1500, 1500, 1300]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 132, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 188, 1500, 92, 1500, 172, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 212, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 348, 100, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 300, 148, 100, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 932, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 932, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 1500, 220, 1316, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 788, 468, 1500, 92, 1268, 516, 1012, 148, 100, 516, 1500, 516, 92, 1500, 1500, 1500, 140, 1500, 1500, 356, 1500, 92, 1500, 1500, 1500, 172, 548, 1412, 1500, 500, 1500, 692, 500, 1500, 92, 1500, 92, 1500, 684, 516, 676, 516, 1500, 92, 1500, 1500, 468, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 204, 1332, 548, 1140, 532, 1140, 148, 100, 148, 100, 1500, 572, 388, 1060, 180, 116, 628, 1044, 1500, 92, 1500, 1484, 1500, 1500, 132, 100, 1500, 1500, 132, 500, 1428, 148, 1380, 1500, 428, 1396, 1500, 148, 92, 1500, 1500, 132, 1500, 1500, 356, 100, 1460, 340, 1500, 204, 116, 1500, 908, 1500, 92, 1500, 1500, 132, 532, 1500, 1500, 1500, 1500, 1500, 1500, 1316, 532, 1500, 1500, 276, 500, 1500, 580, 92, 1500, 620, 500, 84, 1500, 1500, 132, 1500, 1500, 1500, 988, 484, 1500, 548, 92, 1500, 1500, 
1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 100, 148, 884, 148, 100, 1492, 1500, 92, 1500, 1500, 1348, 1500, 1500, 1500, 1500, 772, 964, 1500, 108, 1500, 1500, 164, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1172, 1012, 980, 1500, 148, 92, 1500, 780, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 988, 100, 1492, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1276, 1500, 1500, 1500, 1260, 100, 196, 132, 532, 484, 148, 84, 100, 148, 100, 1500, 540, 148, 84, 1284, 148, 516, 132, 148, 100, 1500, 300, 148, 84, 1460, 148, 788, 132, 1500, 636, 148, 116, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1420, 100, 516, 1220, 1500, 796, 180, 1500, 1276, 1500, 1500, 1396, 948, 1500, 92, 1076, 1500, 1500, 1500, 924, 948, 324, 1500, 1500, 452, 1284, 964, 1500, 940, 1500, 92, 1500, 668, 980, 1060, 1500, 92, 1500, 1340, 948, 772, 516, 980, 484, 756, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 532, 772, 500, 388, 1500, 500, 364, 532, 500, 420, 500, 1500, 92, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 284, 1500, 492, 660, 1500, 92, 1500, 92, 148, 100, 564, 1500, 92, 1500, 236, 1500, 1500, 132, 1500, 92, 1500, 92, 148, 100, 580, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 1500, 1500, 1452, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 132, 100, 1500, 140, 532, 404, 676, 100, 532, 1500, 1180, 148, 100, 628, 820, 852, 116, 84, 532, 756, 148, 100, 596, 1028, 548, 84, 116, 84, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 404, 100, 580, 804, 84, 84, 84, 84, 436, 1332, 148, 100, 148, 100, 1044, 724, 724, 596, 1500, 92, 1500, 1500, 868, 148, 100, 148, 100, 1140, 1500, 92, 1500, 716, 1500, 1500, 1460, 596, 1500, 92, 1500, 1356, 596, 1500, 92, 1460, 596, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 
1500, 1228]
,[148, 100, 500, 1476, 1500, 1500, 1428, 148, 1476, 1396, 484, 1500, 204, 484, 1500, 220, 148, 100, 468, 1500, 124, 468, 644, 468, 1500, 108, 1500, 1500, 1412, 148, 100, 148, 100, 916, 1500, 1212, 900, 1500, 92, 1500, 1500, 1500, 1500, 724, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 924, 916, 1500, 620, 1500, 1132, 916, 1500, 92, 692, 1500, 732, 916, 1500, 812, 852, 916, 1156, 484, 1252, 484, 1380, 484, 1236, 484, 1364, 484, 1284, 484, 980, 484, 1252, 484, 1316, 1500, 108, 932, 1500, 92, 1500, 1500, 1500, 844, 980, 1500, 92, 1500, 1500, 1500, 1292, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1004, 1500, 1500, 1500, 1500, 1500, 220, 1500, 100, 1500, 1500, 748, 148, 100, 1500, 204, 148, 1500, 92, 1500, 1500, 132, 1500, 636, 1500, 1500, 1500, 1500, 1476, 100, 932, 1172, 1124, 116, 84, 148, 756, 84, 100, 916, 84, 772, 84, 1060, 84, 1500, 108, 1500, 1500, 1500, 1500, 1284, 948, 1500, 332, 772, 100]
,[148, 100, 500, 1476, 1500, 1500, 1380, 148, 1476, 1396, 484, 1500, 236, 484, 1500, 220, 148, 100, 468, 1500, 124, 468, 644, 468, 1500, 108, 1500, 1500, 1396, 148, 100, 148, 100, 916, 500, 1500, 764, 468, 1500, 484, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 964, 1500, 684, 916, 1500, 92, 1500, 1500, 244, 916, 1500, 732, 1500, 700, 916, 852, 1500, 860, 916, 1156, 484, 1252, 484, 1380, 484, 1236, 484, 1364, 484, 1284, 484, 980, 484, 1252, 484, 1332, 1500, 108, 932, 804, 516, 1500, 92, 1500, 1500, 1500, 108, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 276, 516, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1052, 100, 500, 148, 100, 500, 756, 1500, 1500, 164, 1500, 1500, 1268, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 500, 708, 500, 1500, 92, 1500, 1500, 1500, 668, 484, 1500, 92, 1500, 1500, 1500, 1404, 1172, 148, 100, 756, 1092, 84, 84, 148, 84, 84, 100, 852, 772, 84, 84, 84, 84, 996, 1500, 1500, 1500, 1500, 1076, 1500, 268, 948, 1500, 332, 772]]
Ytrain = ['webpage0', 'webpage0', 'webpage1', 'webpage1']
Ytest = ['webpage0', 'webpage0', 'webpage1', 'webpage1']
knn_lcs_obj = KNN_LCS(Xtrain, Xtest, Ytrain, Ytest, neighbors=config.NUM_NEIGHBORS)
#knn_lcs_obj.calcHP_KNN_LCS()
print(knn_lcs_obj.getAccuracyDebugInfo())
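# The KNN_LCS classifier above is assumed to compare packet-size traces by
# their longest common subsequence (LCS). A minimal, self-contained sketch of
# that idea follows -- a hypothetical 1-NN vote over an LCS-derived distance,
# not the actual KNN_LCS implementation; the distance normalization is an
# assumption for illustration.

```python
def lcs_length(a, b):
    # Classic O(len(a)*len(b)) dynamic program for LCS length,
    # keeping only one previous row to save memory.
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[-1]))
        prev = cur
    return prev[-1]

def lcs_distance(a, b):
    # Normalized: 0.0 for identical sequences, 1.0 when nothing is shared.
    return 1.0 - float(lcs_length(a, b)) / max(len(a), len(b))

def knn_lcs_predict(Xtrain, Ytrain, x, k=1):
    # Majority vote among the k training traces closest to x under lcs_distance.
    ranked = sorted(zip(Xtrain, Ytrain), key=lambda ty: lcs_distance(ty[0], x))
    votes = [y for _, y in ranked[:k]]
    return max(set(votes), key=votes.count)
```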
Xtrain = [ [148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 788, 100, 100, 1500, 484, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 580, 1300, 500, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 100, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 252, 884, 532, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 100, 1500, 1500, 1500, 884, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1500, 252, 1500, 92, 1500, 1500, 1364, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 100, 1076, 676, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 140, 1500, 1500, 1500, 876, 1500, 1500, 1500, 1500, 212, 1500, 252, 100, 564, 196, 148, 84, 100, 148, 100, 1500, 508, 84, 1500, 972, 1012, 1492, 548, 388, 628, 580, 1028, 84, 1500, 1500, 692, 1500, 572, 1500, 1500, 180, 1156, 500, 1500, 148, 380, 100, 1124, 116, 148, 1500, 92, 1500, 1500, 1500, 1404, 340, 1500, 1500, 292, 148, 1380, 100, 1108, 484, 148, 84, 100, 1460, 84, 148, 1500, 908, 532, 1500, 524, 836, 100, 580, 84, 148, 100, 148, 100, 148, 100, 1500, 1180, 148, 116, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 
1500, 132, 1500, 1500, 1500, 1500, 1500, 908, 1500, 1500, 100, 1500, 700, 1500, 1500, 212, 1500, 108, 148, 1500, 108, 1500, 1500, 1500, 220, 1500, 1500, 164, 596, 596, 148, 100, 1500, 156, 148, 196, 612, 148, 788, 132, 1500, 1500, 468, 84, 148, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1388, 452, 1460, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 956, 100, 1500, 92, 980, 516, 1076, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 892, 516, 484, 1500, 1500, 676, 644, 116, 580, 84, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 92, 1500, 1180, 948, 468, 500, 1236, 500, 1284, 1500, 1084, 500, 1500, 500, 92, 1500, 1388, 964, 1500, 92, 1500, 1500, 1500, 796, 964, 1500, 92, 1500, 1500, 1500, 1500, 1500, 92, 1124, 1500, 92, 1500, 1148, 100, 1500, 1028, 92, 1500, 1500, 1108, 500, 772, 500, 980, 1500, 92, 1500, 1500, 1500, 172, 516, 660, 484, 388, 500, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 388, 1500, 1500, 132, 500, 420, 500, 756, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 140, 100, 1500, 1500, 1500, 172, 948, 772, 532, 772, 1500, 1500, 276, 500, 1500, 1500, 100, 628, 500, 1500, 500, 92, 1500, 828, 532, 1500, 92, 1500, 1500, 84, 484, 756, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 836, 1500, 268, 980, 388, 1500, 364, 500, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1436, 1500, 492, 548, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 140, 1500, 1500, 452, 148, 100, 532, 820, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 580, 1500, 92, 1500, 1500, 1476, 148, 100, 596, 1028, 84, 84, 148, 84, 84, 100, 580, 804, 148, 84, 100, 148, 100, 1108, 84, 1396, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1308, 1500, 92, 1500, 940, 596, 1500, 596, 92, 1500, 
1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 84, 148, 100, 148, 100, 1092, 340, 1500, 1500, 740, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 1500, 1364, 148, 100, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 420, 484, 1500, 1500, 1500, 1500, 1412, 1500, 1500, 1500, 140, 100, 1268, 1500, 1500, 132, 1500, 92, 1500, 1500, 132, 1500, 1500, 1364, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 692, 500, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 844, 1500, 100, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1404, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 172, 1500, 652, 468, 1500, 92, 1268, 516, 1012, 148, 100, 948, 1500, 92, 1500, 1500, 692, 500, 1500, 92, 1500, 636, 148, 100, 1476, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 100, 1500, 1500, 1500, 180, 1500, 1500, 1500, 876, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 180, 1500, 700, 964, 1332, 532, 1316, 596, 84, 148, 1492, 100, 148, 100, 1500, 988, 148, 1500, 92, 1500, 1500, 1500, 1356, 1500, 1500, 180, 1252, 148, 100, 996, 484, 148, 84, 1500, 1500, 148, 1500, 380, 1380, 1172, 100, 148, 100, 1428, 1500, 148, 92, 1500, 1500, 596, 1500, 556, 148, 100, 1500, 716, 148, 180, 1500, 1388, 1500, 1500, 564, 1500, 108, 1500, 60, 100, 1500, 172, 148, 116, 1500, 1500, 132, 1500, 1500, 1348, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 132, 1500, 1420, 100, 148, 
516, 452, 148, 100, 1500, 156, 148, 196, 564, 100, 148, 788, 132, 1500, 1436, 148, 180, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1004, 100, 1444, 1500, 92, 1500, 1500, 1500, 1100, 1500, 796, 100, 484, 516, 84, 116, 452, 1500, 1500, 1500, 1500, 1500, 1500, 916, 516, 1500, 92, 1500, 1260, 516, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1012, 148, 100, 580, 1108, 148, 84, 1500, 1500, 164, 1500, 108, 1500, 892, 1500, 172, 116, 1500, 1500, 996, 1500, 108, 948, 1500, 940, 1236, 100, 980, 948, 1500, 92, 1500, 860, 964, 1500, 92, 1500, 1500, 1396, 1236, 948, 1500, 92, 1500, 1500, 884, 1500, 204, 980, 1500, 92, 1500, 1500, 1500, 1500, 1500, 684, 964, 772, 148, 1060, 100, 1500, 60, 980, 1500, 1500, 132, 1500, 1500, 612, 932, 388, 500, 388, 500, 420, 500, 756, 500, 772, 532, 772, 500, 1500, 92, 1500, 236, 1500, 1500, 660, 516, 1500, 500, 92, 1500, 924, 484, 1500, 92, 1500, 1500, 84, 548, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 388, 148, 100, 1500, 60, 884, 500, 388, 500, 1500, 364, 532, 148, 868, 1076, 420, 148, 100, 148, 100, 1500, 540, 1500, 492, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 92, 1500, 92, 1500, 1436, 1500, 1500, 1500, 316, 1500, 1500, 1500, 172, 1500, 1500, 932, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1220, 116, 84, 660, 1500, 92, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 92, 100, 148, 100, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 1500, 1172, 1500, 668, 596, 1028, 84, 84, 84, 84, 148, 100, 1060, 756, 836, 116, 84, 548, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 180, 1500, 1500, 452, 148, 100, 148, 100, 1044, 724, 724, 596, 1500, 92, 1500, 1500, 692, 148, 100, 148, 100, 1140, 1500, 92, 1500, 1500, 1500, 1500, 788, 1140, 1500, 92, 1500, 1500, 1500, 156, 596, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 1500, 1500, 188, 1500, 92, 
1500, 1500, 1500, 1292, 148, 100, 148, 100, 1092, 628, 1500, 1500, 420, 84, 180, 116, 148, 100, 564, 340, 164, 500, 84, 1332]
,[148, 100, 500, 1476, 148, 1476, 1476, 1476, 1500, 1420, 100, 484, 1500, 252, 484, 1500, 268, 148, 100, 468, 1500, 156, 468, 692, 468, 1500, 108, 1500, 1500, 1492, 148, 100, 148, 100, 916, 564, 1500, 764, 468, 1500, 484, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1100, 1500, 652, 484, 1500, 1196, 484, 1500, 484, 620, 1500, 732, 484, 1500, 484, 732, 1500, 484, 860, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1332, 484, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1492, 500, 740, 500, 1500, 92, 1500, 1500, 1428, 500, 148, 100, 500, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 644, 756, 84, 1500, 92, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 204, 100, 1284, 84, 500, 1092, 84, 84, 84, 84, 148, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 172, 1500, 604, 100, 1300, 772, 84, 84, 84, 84, 1500, 92, 1500, 92, 1500, 1500, 132, 916, 500, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 876, 484, 1172, 1500, 1500, 132, 1500, 1500, 1300, 996, 1476, 1500, 556, 1500, 316, 948, 868, 1500, 332]
,[148, 100, 500, 1500, 1404, 1500, 1404, 148, 1500, 1276, 100, 484, 1500, 204, 484, 1500, 220, 148, 100, 468, 1500, 156, 468, 644, 468, 1500, 1500, 1500, 1452, 148, 100, 148, 100, 916, 500, 1500, 764, 468, 1500, 484, 1500, 100, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 964, 1500, 652, 916, 1500, 92, 1124, 1500, 620, 916, 1500, 700, 1500, 732, 484, 1500, 812, 484, 852, 484, 1252, 484, 1156, 484, 1380, 484, 1236, 484, 1364, 148, 100, 1500, 172, 1284, 484, 980, 484, 1252, 484, 1332, 484, 1500, 108, 804, 964, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 260, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 332, 1500, 1500, 564, 1500, 1500, 212, 1500, 1500, 1500, 588, 1500, 92, 1500, 1500, 1500, 1292, 500, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 916, 500, 1500, 500, 92, 1500, 1500, 1500, 716, 1500, 1500, 132, 1500, 1500, 1364, 484, 1172, 1124, 116, 84, 148, 100, 500, 756, 148, 84, 100, 916, 84, 772, 84, 1060, 84, 1500, 1500, 452, 1500, 300, 948, 1500, 332, 772]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 1500, 
1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1300]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 1500, 
1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 1500, 
1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 1500, 
1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500]]
#print(Xtrain)
Xtest = [ [148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 148, 1500, 1500, 1500, 1500, 180, 132, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1500, 172, 1500, 1500, 1500, 1500, 132, 1500, 1500, 1316, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 540, 100, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 764, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 924, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 172, 1500, 1500, 1500, 876, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1356, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 772, 468, 1500, 1276, 516, 1012, 148, 100, 516, 1500, 148, 92, 1500, 1500, 1500, 140, 772, 1412, 1500, 148, 92, 1500, 1500, 1500, 140, 1332, 580, 1500, 76, 84, 1500, 92, 1500, 1500, 692, 964, 820, 1500, 1500, 708, 1060, 84, 1500, 1500, 132, 1500, 1500, 1500, 1228, 1500, 92, 1500, 1500, 1500, 1500, 100, 1500, 220, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 396, 1012, 1492, 548, 388, 628, 1060, 1500, 92, 1500, 1500, 1500, 668, 1500, 1500, 132, 100, 500, 1500, 148, 92, 1500, 172, 1500, 1500, 1156, 100, 148, 100, 1500, 60, 148, 1500, 92, 1500, 1500, 1500, 1372, 1500, 492, 100, 148, 100, 1492, 804, 148, 84, 1500, 1500, 1500, 140, 1500, 652, 1500, 1500, 164, 1500, 108, 1060, 1500, 124, 116, 1500, 1388, 980, 484, 1500, 1500, 308, 1500, 1500, 660, 1028, 84, 1500, 1500, 1460, 1500, 1500, 1500, 92, 1044, 84, 1500, 1500, 132, 1500, 
1500, 1500, 1420, 1500, 1500, 1500, 1500, 916, 100, 1060, 1500, 972, 1500, 1500, 1500, 1500, 884, 100, 564, 196, 148, 84, 100, 1476, 1500, 148, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 932, 1500, 1308, 452, 676, 84, 148, 100, 148, 1108, 84, 100, 148, 100, 1500, 1148, 148, 84, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1332, 100, 1500, 60, 532, 1500, 212, 1036, 164, 580, 1500, 148, 92, 1500, 1500, 1500, 172, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 172, 1500, 1500, 1316, 1500, 172, 116, 1500, 1500, 1316, 1500, 1500, 1500, 556, 948, 1500, 116, 940, 1500, 860, 948, 84, 1500, 1276, 1500, 1084, 964, 1500, 92, 1500, 1500, 1500, 1500, 1396, 836, 980, 1500, 92, 1500, 1388, 1500, 1500, 708, 100, 964, 884, 500, 772, 516, 980, 484, 660, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 484, 772, 1500, 1500, 132, 1500, 1500, 548, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 276, 100, 884, 532, 388, 500, 1500, 500, 364, 532, 500, 420, 500, 1500, 500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1484, 1500, 492, 660, 1500, 92, 1500, 92, 1500, 92, 1500, 92, 1500, 1500, 132, 516, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 132, 1500, 92, 1500, 1500, 1500, 1500, 1500, 492, 1500, 1500, 164, 1500, 1500, 1364, 1500, 1500, 1500, 1500, 484, 148, 100, 628, 820, 84, 84, 84, 84, 148, 100, 1044, 1500, 92, 1500, 236, 820, 148, 100, 1076, 756, 884, 116, 84, 548, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 412, 100, 148, 100, 596, 148, 1012, 132, 148, 100, 1108, 84, 148, 724, 772, 580, 804, 84, 84, 84, 84, 596, 1500, 92, 1500, 1500, 740, 148, 100, 148, 100, 1076, 340, 1332, 180, 116, 148, 100, 148, 100, 1092, 628, 148, 1500, 172, 1092, 148, 100, 1268, 116, 436, 1500, 92, 1500, 1500, 1500, 1500, 756, 1140, 1500, 92, 1500, 1500, 1500, 108, 596, 1500, 92, 1500, 1500, 1500, 1420, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 
1500, 1500, 1500, 1300]
,[148, 100, 516, 1500, 92, 1500, 92, 1500, 1500, 132, 1500, 1500, 100, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 188, 1500, 92, 1500, 172, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 212, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 348, 100, 1500, 92, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 300, 148, 100, 484, 1268, 500, 1500, 92, 1500, 1500, 1500, 172, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 236, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 932, 532, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 1500, 1500, 932, 1500, 1500, 100, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 1500, 1500, 1500, 220, 1316, 532, 1500, 92, 1500, 1500, 996, 484, 1500, 972, 484, 1500, 380, 468, 1500, 92, 1500, 1500, 1500, 1500, 788, 468, 1500, 92, 1268, 516, 1012, 148, 100, 516, 1500, 516, 92, 1500, 1500, 1500, 140, 1500, 1500, 356, 1500, 92, 1500, 1500, 1500, 172, 548, 1412, 1500, 500, 1500, 692, 500, 1500, 92, 1500, 92, 1500, 684, 516, 676, 516, 1500, 92, 1500, 1500, 468, 1500, 1500, 180, 1500, 92, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 204, 1332, 548, 1140, 532, 1140, 148, 100, 148, 100, 1500, 572, 388, 1060, 180, 116, 628, 1044, 1500, 92, 1500, 1484, 1500, 1500, 132, 100, 1500, 1500, 132, 500, 1428, 148, 1380, 1500, 428, 1396, 1500, 148, 92, 1500, 1500, 132, 1500, 1500, 356, 100, 1460, 340, 1500, 204, 116, 1500, 908, 1500, 92, 1500, 1500, 132, 532, 1500, 1500, 1500, 1500, 1500, 1500, 1316, 532, 1500, 1500, 276, 500, 1500, 580, 92, 1500, 620, 500, 84, 1500, 1500, 132, 1500, 1500, 1500, 988, 484, 1500, 548, 92, 1500, 1500, 
1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 188, 1500, 1500, 100, 148, 884, 148, 100, 1492, 1500, 92, 1500, 1500, 1348, 1500, 1500, 1500, 1500, 772, 964, 1500, 108, 1500, 1500, 164, 1500, 1500, 132, 1500, 1500, 1500, 1500, 212, 1500, 1500, 1500, 1500, 1500, 252, 1500, 1500, 1500, 1500, 1172, 1012, 980, 1500, 148, 92, 1500, 780, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 988, 100, 1492, 1500, 148, 92, 1500, 1500, 1500, 1500, 1500, 1276, 1500, 1500, 1500, 1260, 100, 196, 132, 532, 484, 148, 84, 100, 148, 100, 1500, 540, 148, 84, 1284, 148, 516, 132, 148, 100, 1500, 300, 148, 84, 1460, 148, 788, 132, 1500, 636, 148, 116, 1500, 1500, 100, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1420, 100, 516, 1220, 1500, 796, 180, 1500, 1276, 1500, 1500, 1396, 948, 1500, 92, 1076, 1500, 1500, 1500, 924, 948, 324, 1500, 1500, 452, 1284, 964, 1500, 940, 1500, 92, 1500, 668, 980, 1060, 1500, 92, 1500, 1340, 948, 772, 516, 980, 484, 756, 500, 388, 500, 388, 500, 420, 500, 756, 500, 772, 532, 772, 500, 388, 1500, 500, 364, 532, 500, 420, 500, 1500, 92, 500, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 284, 1500, 492, 660, 1500, 92, 1500, 92, 148, 100, 564, 1500, 92, 1500, 236, 1500, 1500, 132, 1500, 92, 1500, 92, 148, 100, 580, 1500, 92, 1500, 1500, 1500, 1500, 212, 1500, 1500, 132, 1500, 1500, 1500, 1452, 1500, 92, 1500, 1500, 132, 1500, 1500, 1500, 140, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 1500, 132, 100, 1500, 140, 532, 404, 676, 100, 532, 1500, 1180, 148, 100, 628, 820, 852, 116, 84, 532, 756, 148, 100, 596, 1028, 548, 84, 116, 84, 148, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 404, 100, 580, 804, 84, 84, 84, 84, 436, 1332, 148, 100, 148, 100, 1044, 724, 724, 596, 1500, 92, 1500, 1500, 868, 148, 100, 148, 100, 1140, 1500, 92, 1500, 716, 1500, 1500, 1460, 596, 1500, 92, 1500, 1356, 596, 1500, 92, 1460, 596, 1500, 92, 1500, 1500, 1500, 140, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 1500, 1500, 
1500, 1228]
,[148, 100, 500, 1476, 1500, 1500, 1428, 148, 1476, 1396, 484, 1500, 204, 484, 1500, 220, 148, 100, 468, 1500, 124, 468, 644, 468, 1500, 108, 1500, 1500, 1412, 148, 100, 148, 100, 916, 1500, 1212, 900, 1500, 92, 1500, 1500, 1500, 1500, 724, 1500, 1500, 132, 1500, 1500, 1500, 1500, 180, 1500, 92, 1500, 924, 916, 1500, 620, 1500, 1132, 916, 1500, 92, 692, 1500, 732, 916, 1500, 812, 852, 916, 1156, 484, 1252, 484, 1380, 484, 1236, 484, 1364, 484, 1284, 484, 980, 484, 1252, 484, 1316, 1500, 108, 932, 1500, 92, 1500, 1500, 1500, 844, 980, 1500, 92, 1500, 1500, 1500, 1292, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1004, 1500, 1500, 1500, 1500, 1500, 220, 1500, 100, 1500, 1500, 748, 148, 100, 1500, 204, 148, 1500, 92, 1500, 1500, 132, 1500, 636, 1500, 1500, 1500, 1500, 1476, 100, 932, 1172, 1124, 116, 84, 148, 756, 84, 100, 916, 84, 772, 84, 1060, 84, 1500, 108, 1500, 1500, 1500, 1500, 1284, 948, 1500, 332, 772, 100]
,[148, 100, 500, 1476, 1500, 1500, 1380, 148, 1476, 1396, 484, 1500, 236, 484, 1500, 220, 148, 100, 468, 1500, 124, 468, 644, 468, 1500, 108, 1500, 1500, 1396, 148, 100, 148, 100, 916, 500, 1500, 764, 468, 1500, 484, 92, 1500, 1500, 1500, 140, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 964, 1500, 684, 916, 1500, 92, 1500, 1500, 244, 916, 1500, 732, 1500, 700, 916, 852, 1500, 860, 916, 1156, 484, 1252, 484, 1380, 484, 1236, 484, 1364, 484, 1284, 484, 980, 484, 1252, 484, 1332, 1500, 108, 932, 804, 516, 1500, 92, 1500, 1500, 1500, 108, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 276, 516, 1500, 1500, 132, 1500, 1500, 1500, 1500, 1500, 1500, 1500, 268, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1500, 180, 1500, 1500, 1500, 1500, 1500, 1500, 228, 1500, 1500, 1500, 1052, 100, 500, 148, 100, 500, 756, 1500, 1500, 164, 1500, 1500, 1268, 1500, 92, 1500, 1500, 1500, 1500, 1500, 220, 1500, 1500, 500, 708, 500, 1500, 92, 1500, 1500, 1500, 668, 484, 1500, 92, 1500, 1500, 1500, 1404, 1172, 148, 100, 756, 1092, 84, 84, 148, 84, 84, 100, 852, 772, 84, 84, 84, 84, 996, 1500, 1500, 1500, 1500, 1076, 1500, 268, 948, 1500, 332, 772]
]
Ytrain = ['webpage0', 'webpage0', 'webpage1', 'webpage1', 'webpage0', 'webpage1', 'webpage1', 'webpage0']
Ytest = ['webpage0', 'webpage0', 'webpage1', 'webpage1']
knn_lcs_obj = KNN_LCS(Xtrain, Xtest, Ytrain, Ytest, neighbors=config.NUM_NEIGHBORS)
#knn_lcs_obj.calcHP_KNN_LCS()
print(knn_lcs_obj.getAccuracyDebugInfo())
''' | 598.123188 | 4,232 | 0.634279 | 14,830 | 82,541 | 3.527647 | 0.016521 | 0.639664 | 0.574826 | 0.473134 | 0.980961 | 0.980847 | 0.980732 | 0.980273 | 0.979356 | 0.979356 | 0 | 0.750757 | 0.184308 | 82,541 | 138 | 4,233 | 598.123188 | 0.02626 | 0.003065 | 0 | 0.25 | 0 | 0 | 0.013199 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.125 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
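The commented-out driver above exercises a `KNN_LCS` classifier over packet-length traces (the large integer arrays are the `Xtest` rows). A minimal sketch of what such a classifier plausibly computes: k-nearest-neighbour voting with longest-common-subsequence length as the similarity between traces. `lcs_len` and `knn_lcs_predict` are illustrative names, not the actual `KNN_LCS` API.

```python
def lcs_len(a, b):
    # Classic O(len(a) * len(b)) dynamic programme for LCS length,
    # keeping only the previous row of the DP table.
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[-1]))
        prev = cur
    return prev[-1]

def knn_lcs_predict(Xtrain, Ytrain, trace, k=3):
    # Rank training traces by LCS similarity, majority-vote over the top k.
    sims = sorted(zip((lcs_len(trace, t) for t in Xtrain), Ytrain), reverse=True)
    top = [label for _, label in sims[:k]]
    return max(set(top), key=top.count)
```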
82b5f81174ee1cffebbaf0900fb50d2efdf027c6 | 109 | py | Python | tests/parser/choice.2a.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/choice.2a.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/choice.2a.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
f(a) :- not f(b).
f(b) :- not f(a).
"""
output = """
f(a) :- not f(b).
f(b) :- not f(a).
"""
| 12.111111 | 18 | 0.357798 | 22 | 109 | 1.772727 | 0.272727 | 0.205128 | 0.25641 | 0.307692 | 0.717949 | 0.717949 | 0.717949 | 0.717949 | 0.717949 | 0.717949 | 0 | 0 | 0.256881 | 109 | 8 | 19 | 13.625 | 0.481481 | 0 | 0 | 0.75 | 0 | 0 | 0.704762 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
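The two rules in this DLV2 parser test (`f(a) :- not f(b).` / `f(b) :- not f(a).`) form the classic even cycle through negation: under answer-set (stable-model) semantics the program has exactly two answer sets, {f(a)} and {f(b)}. A brute-force stable-model check over this two-atom program — a sketch of the semantics, not the DLV2 algorithm:

```python
from itertools import chain, combinations

atoms = ['f(a)', 'f(b)']
rules = [('f(a)', 'f(b)'), ('f(b)', 'f(a)')]  # (head, negated body atom)

def is_stable(candidate):
    # Gelfond-Lifschitz reduct: a rule survives (its head becomes a fact)
    # iff its negated atom is absent from the candidate; the candidate is
    # stable iff it equals exactly the facts derived from its own reduct.
    derived = {head for head, neg in rules if neg not in candidate}
    return derived == candidate

# Enumerate all subsets of the Herbrand base and keep the stable ones.
powerset = chain.from_iterable(combinations(atoms, r) for r in range(len(atoms) + 1))
stable = [set(c) for c in powerset if is_stable(set(c))]
```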
82d39b74e401e23334892624512705fec4ed0b79 | 129 | py | Python | The Core/Intro Gates/circleOfNumbers.py | shanemichaelarcaro/codesignal | 69b0460dbc163091dc115634bbb730da5caf65a9 | [
"MIT"
] | null | null | null | The Core/Intro Gates/circleOfNumbers.py | shanemichaelarcaro/codesignal | 69b0460dbc163091dc115634bbb730da5caf65a9 | [
"MIT"
] | null | null | null | The Core/Intro Gates/circleOfNumbers.py | shanemichaelarcaro/codesignal | 69b0460dbc163091dc115634bbb730da5caf65a9 | [
"MIT"
] | null | null | null | def circleOfNumbers(n, firstNumber):
return firstNumber + (n + 1) // 2 if firstNumber < n / 2 else firstNumber - (n + 1) // 2 | 64.5 | 92 | 0.643411 | 18 | 129 | 4.611111 | 0.5 | 0.433735 | 0.313253 | 0.337349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049505 | 0.217054 | 129 | 2 | 92 | 64.5 | 0.772277 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
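The one-liner above solves CodeSignal's circleOfNumbers: given `n` evenly spaced numbers on a circle (`n` is even per the problem statement), return the number positioned opposite `firstNumber`. Because the opposite number sits `n/2` positions away and `(n + 1) // 2 == n // 2` for even `n`, the expression is equivalent to `(firstNumber + n // 2) % n`. A usage sketch:

```python
def circleOfNumbers(n, firstNumber):
    # Opposite number is n/2 steps away; branch avoids an explicit modulo.
    return firstNumber + (n + 1) // 2 if firstNumber < n / 2 else firstNumber - (n + 1) // 2

# On a circle of 10, the number opposite 2 is 7, and vice versa.
```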
82e179bb3af6ef0e9eafc812b8e263613d1e6a1a | 145,270 | py | Python | isi_sdk_8_2_2/isi_sdk_8_2_2/api/cluster_nodes_api.py | mohitjain97/isilon_sdk_python | a371f438f542568edb8cda35e929e6b300b1177c | [
"Unlicense"
] | 24 | 2018-06-22T14:13:23.000Z | 2022-03-23T01:21:26.000Z | isi_sdk_8_2_2/isi_sdk_8_2_2/api/cluster_nodes_api.py | mohitjain97/isilon_sdk_python | a371f438f542568edb8cda35e929e6b300b1177c | [
"Unlicense"
] | 46 | 2018-04-30T13:28:22.000Z | 2022-03-21T21:11:07.000Z | isi_sdk_8_2_2/isi_sdk_8_2_2/api/cluster_nodes_api.py | mohitjain97/isilon_sdk_python | a371f438f542568edb8cda35e929e6b300b1177c | [
"Unlicense"
] | 29 | 2018-06-19T00:14:04.000Z | 2022-02-08T17:51:19.000Z | # coding: utf-8
"""
Isilon SDK
Isilon SDK - Language bindings for the OneFS API # noqa: E501
OpenAPI spec version: 9
Contact: sdk@isilon.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from isi_sdk_8_2_2.api_client import ApiClient
class ClusterNodesApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_drives_drive_add_item(self, drives_drive_add_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_add_item # noqa: E501
Add a drive to a node. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_add_item(drives_drive_add_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_add_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_add_item_with_http_info(drives_drive_add_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_add_item_with_http_info(drives_drive_add_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_add_item_with_http_info(self, drives_drive_add_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_add_item # noqa: E501
Add a drive to a node. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_add_item_with_http_info(drives_drive_add_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_add_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_add_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_add_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_add_item' is set
if ('drives_drive_add_item' not in params or
params['drives_drive_add_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_add_item` when calling `create_drives_drive_add_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_add_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_add_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_add_item' in params:
body_params = params['drives_drive_add_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/add', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
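Every generated method in this class follows the same dispatch pattern: the public method forwards to its `*_with_http_info` twin, which returns either the response directly (synchronous, the default) or a thread-like handle when `async_req=True` is passed. A self-contained sketch of that pattern using standard-library futures — `call_api_sketch` and `create_item_sketch` are hypothetical stand-ins, and `Future.result()` plays the role of the generated client's `thread.get()`:

```python
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor(max_workers=2)

def call_api_sketch(path, async_req=False):
    """Stand-in for api_client.call_api: pretend the server echoes the path."""
    def do_request():
        return {'path': path, 'status': 200}
    if async_req:
        return _pool.submit(do_request)  # caller retrieves via .result()
    return do_request()

def create_item_sketch(body, lnn, driveid, **kwargs):
    # Mirrors the generated method: validate, build the path, dispatch.
    if lnn is None or driveid is None:
        raise ValueError("Missing a required parameter")
    path = '/platform/3/cluster/nodes/%s/drives/%s/add' % (lnn, driveid)
    return call_api_sketch(path, async_req=kwargs.get('async_req', False))
```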
def create_drives_drive_firmware_update_item(self, drives_drive_firmware_update_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_firmware_update_item # noqa: E501
Start a drive firmware update. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_firmware_update_item(drives_drive_firmware_update_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDriveFirmwareUpdateItem drives_drive_firmware_update_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_firmware_update_item_with_http_info(drives_drive_firmware_update_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_firmware_update_item_with_http_info(drives_drive_firmware_update_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_firmware_update_item_with_http_info(self, drives_drive_firmware_update_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_firmware_update_item # noqa: E501
Start a drive firmware update. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_firmware_update_item_with_http_info(drives_drive_firmware_update_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDriveFirmwareUpdateItem drives_drive_firmware_update_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_firmware_update_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_firmware_update_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_firmware_update_item' is set
if ('drives_drive_firmware_update_item' not in params or
params['drives_drive_firmware_update_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_firmware_update_item` when calling `create_drives_drive_firmware_update_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_firmware_update_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_firmware_update_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_firmware_update_item' in params:
body_params = params['drives_drive_firmware_update_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/firmware/update', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
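Each `*_with_http_info` method also repeats the same validation boilerplate: unknown keyword arguments raise `TypeError`, and required parameters that are missing or `None` raise `ValueError`. A hypothetical distillation of that idiom (`check_call` is an illustrative helper, not part of the SDK):

```python
def check_call(required, **kwargs):
    # The generated code allows the required names plus four control kwargs.
    allowed = set(required) | {'async_req', '_return_http_data_only',
                               '_preload_content', '_request_timeout'}
    for key in kwargs:
        if key not in allowed:
            raise TypeError("Got an unexpected keyword argument '%s'" % key)
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError("Missing the required parameter `%s`" % name)
```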
def create_drives_drive_format_item(self, drives_drive_format_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_format_item # noqa: E501
Format a drive for use by OneFS. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_format_item(drives_drive_format_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDriveFormatItem drives_drive_format_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_format_item_with_http_info(drives_drive_format_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_format_item_with_http_info(drives_drive_format_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_format_item_with_http_info(self, drives_drive_format_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_format_item # noqa: E501
Format a drive for use by OneFS. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_format_item_with_http_info(drives_drive_format_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDriveFormatItem drives_drive_format_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_format_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_format_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_format_item' is set
if ('drives_drive_format_item' not in params or
params['drives_drive_format_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_format_item` when calling `create_drives_drive_format_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_format_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_format_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_format_item' in params:
body_params = params['drives_drive_format_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/format', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_drives_drive_purpose_item(self, drives_drive_purpose_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_purpose_item # noqa: E501
Assign a drive to a specific use case. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_purpose_item(drives_drive_purpose_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDrivePurposeItem drives_drive_purpose_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_purpose_item_with_http_info(drives_drive_purpose_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_purpose_item_with_http_info(drives_drive_purpose_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_purpose_item_with_http_info(self, drives_drive_purpose_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_purpose_item # noqa: E501
Assign a drive to a specific use case. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_purpose_item_with_http_info(drives_drive_purpose_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DrivesDrivePurposeItem drives_drive_purpose_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_purpose_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_purpose_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_purpose_item' is set
if ('drives_drive_purpose_item' not in params or
params['drives_drive_purpose_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_purpose_item` when calling `create_drives_drive_purpose_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_purpose_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_purpose_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_purpose_item' in params:
body_params = params['drives_drive_purpose_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/purpose', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_drives_drive_smartfail_item(self, drives_drive_smartfail_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_smartfail_item # noqa: E501
Remove a drive from use by OneFS. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_smartfail_item(drives_drive_smartfail_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_smartfail_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_smartfail_item_with_http_info(drives_drive_smartfail_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_smartfail_item_with_http_info(drives_drive_smartfail_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_smartfail_item_with_http_info(self, drives_drive_smartfail_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_smartfail_item # noqa: E501
Remove a drive from use by OneFS. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_smartfail_item_with_http_info(drives_drive_smartfail_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_smartfail_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_smartfail_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_smartfail_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_smartfail_item' is set
if ('drives_drive_smartfail_item' not in params or
params['drives_drive_smartfail_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_smartfail_item` when calling `create_drives_drive_smartfail_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_smartfail_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_smartfail_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_smartfail_item' in params:
body_params = params['drives_drive_smartfail_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/smartfail', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_drives_drive_stopfail_item(self, drives_drive_stopfail_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_stopfail_item # noqa: E501
Stop restriping from a smartfailing drive. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_stopfail_item(drives_drive_stopfail_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_stopfail_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_stopfail_item_with_http_info(drives_drive_stopfail_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_stopfail_item_with_http_info(drives_drive_stopfail_item, lnn, driveid, **kwargs) # noqa: E501
return data
def create_drives_drive_stopfail_item_with_http_info(self, drives_drive_stopfail_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_stopfail_item # noqa: E501
Stop restriping from a smartfailing drive. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_stopfail_item_with_http_info(drives_drive_stopfail_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_stopfail_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['drives_drive_stopfail_item', 'lnn', 'driveid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_drives_drive_stopfail_item" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'drives_drive_stopfail_item' is set
if ('drives_drive_stopfail_item' not in params or
params['drives_drive_stopfail_item'] is None):
raise ValueError("Missing the required parameter `drives_drive_stopfail_item` when calling `create_drives_drive_stopfail_item`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_stopfail_item`") # noqa: E501
# verify the required parameter 'driveid' is set
if ('driveid' not in params or
params['driveid'] is None):
raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_stopfail_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
if 'driveid' in params:
path_params['Driveid'] = params['driveid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'drives_drive_stopfail_item' in params:
body_params = params['drives_drive_stopfail_item']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/stopfail', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Empty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_drives_drive_suspend_item(self, drives_drive_suspend_item, lnn, driveid, **kwargs): # noqa: E501
"""create_drives_drive_suspend_item # noqa: E501
Temporarily remove a drive from use by OneFS. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_drives_drive_suspend_item(drives_drive_suspend_item, lnn, driveid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Empty drives_drive_suspend_item: (required)
:param int lnn: (required)
:param str driveid: (required)
:return: Empty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_drives_drive_suspend_item_with_http_info(drives_drive_suspend_item, lnn, driveid, **kwargs) # noqa: E501
else:
(data) = self.create_drives_drive_suspend_item_with_http_info(drives_drive_suspend_item, lnn, driveid, **kwargs) # noqa: E501
return data

    def create_drives_drive_suspend_item_with_http_info(self, drives_drive_suspend_item, lnn, driveid, **kwargs):  # noqa: E501
        """create_drives_drive_suspend_item  # noqa: E501

        Temporarily remove a drive from use by OneFS.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_drives_drive_suspend_item_with_http_info(drives_drive_suspend_item, lnn, driveid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Empty drives_drive_suspend_item: (required)
        :param int lnn: (required)
        :param str driveid: (required)
        :return: Empty
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['drives_drive_suspend_item', 'lnn', 'driveid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_drives_drive_suspend_item" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'drives_drive_suspend_item' is set
        if ('drives_drive_suspend_item' not in params or
                params['drives_drive_suspend_item'] is None):
            raise ValueError("Missing the required parameter `drives_drive_suspend_item` when calling `create_drives_drive_suspend_item`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `create_drives_drive_suspend_item`")  # noqa: E501
        # verify the required parameter 'driveid' is set
        if ('driveid' not in params or
                params['driveid'] is None):
            raise ValueError("Missing the required parameter `driveid` when calling `create_drives_drive_suspend_item`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501
        if 'driveid' in params:
            path_params['Driveid'] = params['driveid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'drives_drive_suspend_item' in params:
            body_params = params['drives_drive_suspend_item']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/suspend', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Empty',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def create_node_reboot_item(self, node_reboot_item, lnn, **kwargs):  # noqa: E501
        """create_node_reboot_item  # noqa: E501

        Reboot the node specified by <LNN>.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_node_reboot_item(node_reboot_item, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Empty node_reboot_item: (required)
        :param int lnn: (required)
        :param bool force: Force reboot on Infinity platform even if a drive sled is not present.
        :return: Empty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_node_reboot_item_with_http_info(node_reboot_item, lnn, **kwargs)  # noqa: E501
        else:
            data = self.create_node_reboot_item_with_http_info(node_reboot_item, lnn, **kwargs)  # noqa: E501
            return data

    def create_node_reboot_item_with_http_info(self, node_reboot_item, lnn, **kwargs):  # noqa: E501
        """create_node_reboot_item  # noqa: E501

        Reboot the node specified by <LNN>.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_node_reboot_item_with_http_info(node_reboot_item, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Empty node_reboot_item: (required)
        :param int lnn: (required)
        :param bool force: Force reboot on Infinity platform even if a drive sled is not present.
        :return: Empty
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['node_reboot_item', 'lnn', 'force']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_node_reboot_item" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_reboot_item' is set
        if ('node_reboot_item' not in params or
                params['node_reboot_item'] is None):
            raise ValueError("Missing the required parameter `node_reboot_item` when calling `create_node_reboot_item`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `create_node_reboot_item`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'force' in params:
            query_params.append(('force', params['force']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_reboot_item' in params:
            body_params = params['node_reboot_item']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/5/cluster/nodes/{Lnn}/reboot', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Empty',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def create_node_shutdown_item(self, node_shutdown_item, lnn, **kwargs):  # noqa: E501
        """create_node_shutdown_item  # noqa: E501

        Shutdown the node specified by <LNN>.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_node_shutdown_item(node_shutdown_item, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Empty node_shutdown_item: (required)
        :param int lnn: (required)
        :param bool force: Force shutdown on Infinity platform even if a drive sled is not present.
        :return: Empty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_node_shutdown_item_with_http_info(node_shutdown_item, lnn, **kwargs)  # noqa: E501
        else:
            data = self.create_node_shutdown_item_with_http_info(node_shutdown_item, lnn, **kwargs)  # noqa: E501
            return data

    def create_node_shutdown_item_with_http_info(self, node_shutdown_item, lnn, **kwargs):  # noqa: E501
        """create_node_shutdown_item  # noqa: E501

        Shutdown the node specified by <LNN>.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_node_shutdown_item_with_http_info(node_shutdown_item, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Empty node_shutdown_item: (required)
        :param int lnn: (required)
        :param bool force: Force shutdown on Infinity platform even if a drive sled is not present.
        :return: Empty
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['node_shutdown_item', 'lnn', 'force']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_node_shutdown_item" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_shutdown_item' is set
        if ('node_shutdown_item' not in params or
                params['node_shutdown_item'] is None):
            raise ValueError("Missing the required parameter `node_shutdown_item` when calling `create_node_shutdown_item`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `create_node_shutdown_item`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'force' in params:
            query_params.append(('force', params['force']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_shutdown_item' in params:
            body_params = params['node_shutdown_item']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/5/cluster/nodes/{Lnn}/shutdown', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Empty',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_drives_drive_firmware(self, lnn, driveid, **kwargs):  # noqa: E501
        """get_drives_drive_firmware  # noqa: E501

        Retrieve drive firmware information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_drives_drive_firmware(lnn, driveid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param str driveid: (required)
        :return: DrivesDriveFirmware
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_drives_drive_firmware_with_http_info(lnn, driveid, **kwargs)  # noqa: E501
        else:
            data = self.get_drives_drive_firmware_with_http_info(lnn, driveid, **kwargs)  # noqa: E501
            return data

    def get_drives_drive_firmware_with_http_info(self, lnn, driveid, **kwargs):  # noqa: E501
        """get_drives_drive_firmware  # noqa: E501

        Retrieve drive firmware information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_drives_drive_firmware_with_http_info(lnn, driveid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param str driveid: (required)
        :return: DrivesDriveFirmware
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn', 'driveid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_drives_drive_firmware" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_drives_drive_firmware`")  # noqa: E501
        # verify the required parameter 'driveid' is set
        if ('driveid' not in params or
                params['driveid'] is None):
            raise ValueError("Missing the required parameter `driveid` when calling `get_drives_drive_firmware`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501
        if 'driveid' in params:
            path_params['Driveid'] = params['driveid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/7/cluster/nodes/{Lnn}/drives/{Driveid}/firmware', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DrivesDriveFirmware',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_drive(self, node_drive_id, lnn, **kwargs):  # noqa: E501
        """get_node_drive  # noqa: E501

        Retrieve drive information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drive(node_drive_id, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str node_drive_id: Retrieve drive information. (required)
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDrives
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_drive_with_http_info(node_drive_id, lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_drive_with_http_info(node_drive_id, lnn, **kwargs)  # noqa: E501
            return data

    def get_node_drive_with_http_info(self, node_drive_id, lnn, **kwargs):  # noqa: E501
        """get_node_drive  # noqa: E501

        Retrieve drive information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drive_with_http_info(node_drive_id, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str node_drive_id: Retrieve drive information. (required)
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDrives
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['node_drive_id', 'lnn', 'timeout']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_drive" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_drive_id' is set
        if ('node_drive_id' not in params or
                params['node_drive_id'] is None):
            raise ValueError("Missing the required parameter `node_drive_id` when calling `get_node_drive`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_drive`")  # noqa: E501

        if 'timeout' in params and params['timeout'] > 4294967295:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_drive`, must be a value less than or equal to `4294967295`")  # noqa: E501
        if 'timeout' in params and params['timeout'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_drive`, must be a value greater than or equal to `0`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'node_drive_id' in params:
            path_params['NodeDriveId'] = params['node_drive_id']  # noqa: E501
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'timeout' in params:
            query_params.append(('timeout', params['timeout']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/7/cluster/nodes/{Lnn}/drives/{NodeDriveId}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeDrives',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_driveconfig(self, lnn, **kwargs):  # noqa: E501
        """get_node_driveconfig  # noqa: E501

        View a node's drive subsystem XML configuration file.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_driveconfig(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDriveconfig
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_driveconfig_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_driveconfig_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_driveconfig_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_driveconfig  # noqa: E501

        View a node's drive subsystem XML configuration file.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_driveconfig_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDriveconfig
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn', 'timeout']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_driveconfig" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_driveconfig`")  # noqa: E501

        if 'timeout' in params and params['timeout'] > 4294967295:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_driveconfig`, must be a value less than or equal to `4294967295`")  # noqa: E501
        if 'timeout' in params and params['timeout'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_driveconfig`, must be a value greater than or equal to `0`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'timeout' in params:
            query_params.append(('timeout', params['timeout']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/7/cluster/nodes/{Lnn}/driveconfig', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeDriveconfig',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_drives(self, lnn, **kwargs):  # noqa: E501
        """get_node_drives  # noqa: E501

        List the drives on this node.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drives(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDrives
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_drives_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_drives_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_drives_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_drives  # noqa: E501

        List the drives on this node.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drives_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeDrives
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn', 'timeout']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_drives" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_drives`")  # noqa: E501

        if 'timeout' in params and params['timeout'] > 4294967295:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_drives`, must be a value less than or equal to `4294967295`")  # noqa: E501
        if 'timeout' in params and params['timeout'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_drives`, must be a value greater than or equal to `0`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'timeout' in params:
            query_params.append(('timeout', params['timeout']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/7/cluster/nodes/{Lnn}/drives', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeDrives',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_drives_purposelist(self, lnn, **kwargs):  # noqa: E501
        """get_node_drives_purposelist  # noqa: E501

        Lists the available purposes for drives in this node.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drives_purposelist(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeDrivesPurposelist
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_drives_purposelist_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_drives_purposelist_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_drives_purposelist_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_drives_purposelist  # noqa: E501

        Lists the available purposes for drives in this node.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_drives_purposelist_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeDrivesPurposelist
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_drives_purposelist" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_drives_purposelist`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/drives-purposelist', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeDrivesPurposelist',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_hardware(self, lnn, **kwargs):  # noqa: E501
        """get_node_hardware  # noqa: E501

        Retrieve node hardware identity information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_hardware(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeHardware
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_hardware_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_hardware_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_hardware_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_hardware  # noqa: E501

        Retrieve node hardware identity information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_hardware_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param float timeout: Request timeout
        :return: NodeHardware
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn', 'timeout']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_hardware" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_hardware`")  # noqa: E501

        if 'timeout' in params and params['timeout'] > 4294967295:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_hardware`, must be a value less than or equal to `4294967295`")  # noqa: E501
        if 'timeout' in params and params['timeout'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `timeout` when calling `get_node_hardware`, must be a value greater than or equal to `0`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []
        if 'timeout' in params:
            query_params.append(('timeout', params['timeout']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/5/cluster/nodes/{Lnn}/hardware', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeHardware',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_node_hardware_fast(self, lnn, **kwargs):  # noqa: E501
        """get_node_hardware_fast  # noqa: E501

        Quickly retrieve a subset of node hardware identity information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_hardware_fast(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeHardwareFast
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_hardware_fast_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            data = self.get_node_hardware_fast_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_hardware_fast_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_hardware_fast  # noqa: E501

        Quickly retrieve a subset of node hardware identity information.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_hardware_fast_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeHardwareFast
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_hardware_fast" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_hardware_fast`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/hardware-fast', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeHardwareFast',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def get_node_internal_ip_address(self, lnn, **kwargs): # noqa: E501
"""get_node_internal_ip_address # noqa: E501
View the internal IP address of the node.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_internal_ip_address(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeInternalIpAddress
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_internal_ip_address_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_internal_ip_address_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_internal_ip_address_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_internal_ip_address # noqa: E501
View the internal IP address of the node.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_internal_ip_address_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeInternalIpAddress
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_internal_ip_address" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_internal_ip_address`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/7/cluster/nodes/{Lnn}/internal-ip-address', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeInternalIpAddress', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_partitions(self, lnn, **kwargs): # noqa: E501
"""get_node_partitions # noqa: E501
Retrieve node partition information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_partitions(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodePartitions
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_partitions_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_partitions_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_partitions_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_partitions # noqa: E501
Retrieve node partition information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_partitions_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodePartitions
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_partitions" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_partitions`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/partitions', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodePartitions', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_sensors(self, lnn, **kwargs): # noqa: E501
"""get_node_sensors # noqa: E501
Retrieve node sensor information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sensors(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeSensors
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_sensors_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_sensors_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_sensors_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_sensors # noqa: E501
Retrieve node sensor information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sensors_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeSensors
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_sensors" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_sensors`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/sensors', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeSensors', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_sled(self, node_sled_id, lnn, **kwargs): # noqa: E501
"""get_node_sled # noqa: E501
Get detailed information for the sled specified by <SLEDID>, or all sleds in the case where <SLEDID> is 'all', in the node specified by <LNN>. Accepts <sledid> in either 'sled' or 'all' formats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sled(node_sled_id, lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str node_sled_id: Get detailed information for the sled specified by <SLEDID>, or all sleds in the case where <SLEDID> is 'all', in the node specified by <LNN>. Accepts <sledid> in either 'sled' or 'all' formats. (required)
:param int lnn: (required)
:param float timeout: Request timeout
:return: NodeSleds
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_sled_with_http_info(node_sled_id, lnn, **kwargs) # noqa: E501
else:
data = self.get_node_sled_with_http_info(node_sled_id, lnn, **kwargs)  # noqa: E501
return data
def get_node_sled_with_http_info(self, node_sled_id, lnn, **kwargs): # noqa: E501
"""get_node_sled # noqa: E501
Get detailed information for the sled specified by <SLEDID>, or all sleds in the case where <SLEDID> is 'all', in the node specified by <LNN>. Accepts <sledid> in either 'sled' or 'all' formats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sled_with_http_info(node_sled_id, lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str node_sled_id: Get detailed information for the sled specified by <SLEDID>, or all sleds in the case where <SLEDID> is 'all', in the node specified by <LNN>. Accepts <sledid> in either 'sled' or 'all' formats. (required)
:param int lnn: (required)
:param float timeout: Request timeout
:return: NodeSleds
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['node_sled_id', 'lnn', 'timeout'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_sled" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'node_sled_id' is set
if ('node_sled_id' not in params or
params['node_sled_id'] is None):
raise ValueError("Missing the required parameter `node_sled_id` when calling `get_node_sled`") # noqa: E501
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_sled`") # noqa: E501
if 'timeout' in params and params['timeout'] > 4294967295: # noqa: E501
raise ValueError("Invalid value for parameter `timeout` when calling `get_node_sled`, must be a value less than or equal to `4294967295`") # noqa: E501
if 'timeout' in params and params['timeout'] < 0: # noqa: E501
raise ValueError("Invalid value for parameter `timeout` when calling `get_node_sled`, must be a value greater than or equal to `0`") # noqa: E501
collection_formats = {}
path_params = {}
if 'node_sled_id' in params:
path_params['NodeSledId'] = params['node_sled_id'] # noqa: E501
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
if 'timeout' in params:
query_params.append(('timeout', params['timeout'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/5/cluster/nodes/{Lnn}/sleds/{NodeSledId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeSleds', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_sleds(self, lnn, **kwargs): # noqa: E501
"""get_node_sleds # noqa: E501
Get detailed information for all sleds in this node. Equivalent to /5/cluster/nodes/<lnn>/sleds/all. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sleds(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:param float timeout: Request timeout
:return: NodeSleds
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_sleds_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_sleds_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_sleds_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_sleds # noqa: E501
Get detailed information for all sleds in this node. Equivalent to /5/cluster/nodes/<lnn>/sleds/all. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_sleds_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:param float timeout: Request timeout
:return: NodeSleds
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn', 'timeout'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_sleds" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_sleds`") # noqa: E501
if 'timeout' in params and params['timeout'] > 4294967295: # noqa: E501
raise ValueError("Invalid value for parameter `timeout` when calling `get_node_sleds`, must be a value less than or equal to `4294967295`") # noqa: E501
if 'timeout' in params and params['timeout'] < 0: # noqa: E501
raise ValueError("Invalid value for parameter `timeout` when calling `get_node_sleds`, must be a value greater than or equal to `0`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
if 'timeout' in params:
query_params.append(('timeout', params['timeout'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/5/cluster/nodes/{Lnn}/sleds', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeSleds', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_state(self, lnn, **kwargs): # noqa: E501
"""get_node_state # noqa: E501
Retrieve node state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeState
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_state_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_state_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_state_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_state # noqa: E501
Retrieve node state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeState
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_state" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_state`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/state', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeState', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_state_readonly(self, lnn, **kwargs): # noqa: E501
"""get_node_state_readonly # noqa: E501
Retrieve node read-only state information.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_readonly(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateReadonly
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_state_readonly_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_state_readonly_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_state_readonly_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_state_readonly # noqa: E501
Retrieve node read-only state information.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_readonly_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateReadonly
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_state_readonly" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_state_readonly`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/state/readonly', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeStateReadonly', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_state_servicelight(self, lnn, **kwargs): # noqa: E501
"""get_node_state_servicelight # noqa: E501
Retrieve node service light state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_servicelight(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateServicelight
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_state_servicelight_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_state_servicelight_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_state_servicelight_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_state_servicelight # noqa: E501
Retrieve node service light state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_servicelight_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateServicelight
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_state_servicelight" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_state_servicelight`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/state/servicelight', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeStateServicelight', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_state_smartfail(self, lnn, **kwargs): # noqa: E501
"""get_node_state_smartfail # noqa: E501
Retrieve node smartfail state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_smartfail(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateSmartfail
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_state_smartfail_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_state_smartfail_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_state_smartfail_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_state_smartfail # noqa: E501
Retrieve node smartfail state information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_state_smartfail_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStateSmartfail
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_state_smartfail" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_state_smartfail`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/state/smartfail', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeStateSmartfail', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_node_status(self, lnn, **kwargs): # noqa: E501
"""get_node_status # noqa: E501
Retrieve node status information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_status(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_node_status_with_http_info(lnn, **kwargs) # noqa: E501
else:
data = self.get_node_status_with_http_info(lnn, **kwargs)  # noqa: E501
return data
def get_node_status_with_http_info(self, lnn, **kwargs): # noqa: E501
"""get_node_status # noqa: E501
Retrieve node status information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_node_status_with_http_info(lnn, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int lnn: (required)
:return: NodeStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['lnn'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_node_status" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'lnn' is set
if ('lnn' not in params or
params['lnn'] is None):
raise ValueError("Missing the required parameter `lnn` when calling `get_node_status`") # noqa: E501
collection_formats = {}
path_params = {}
if 'lnn' in params:
path_params['Lnn'] = params['lnn'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/3/cluster/nodes/{Lnn}/status', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NodeStatus', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

    def get_node_status_batterystatus(self, lnn, **kwargs):  # noqa: E501
        """get_node_status_batterystatus  # noqa: E501

        Retrieve node battery status information.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_status_batterystatus(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeStatusBatterystatus
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_node_status_batterystatus_with_http_info(lnn, **kwargs)  # noqa: E501
        else:
            (data) = self.get_node_status_batterystatus_with_http_info(lnn, **kwargs)  # noqa: E501
            return data

    def get_node_status_batterystatus_with_http_info(self, lnn, **kwargs):  # noqa: E501
        """get_node_status_batterystatus  # noqa: E501

        Retrieve node battery status information.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_node_status_batterystatus_with_http_info(lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :return: NodeStatusBatterystatus
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_node_status_batterystatus" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `get_node_status_batterystatus`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/status/batterystatus', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='NodeStatusBatterystatus',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def list_drives_drive_firmware_update(self, lnn, driveid, **kwargs):  # noqa: E501
        """list_drives_drive_firmware_update  # noqa: E501

        Retrieve firmware update information.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.list_drives_drive_firmware_update(lnn, driveid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param str driveid: (required)
        :return: DrivesDriveFirmwareUpdate
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.list_drives_drive_firmware_update_with_http_info(lnn, driveid, **kwargs)  # noqa: E501
        else:
            (data) = self.list_drives_drive_firmware_update_with_http_info(lnn, driveid, **kwargs)  # noqa: E501
            return data

    def list_drives_drive_firmware_update_with_http_info(self, lnn, driveid, **kwargs):  # noqa: E501
        """list_drives_drive_firmware_update  # noqa: E501

        Retrieve firmware update information.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.list_drives_drive_firmware_update_with_http_info(lnn, driveid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int lnn: (required)
        :param str driveid: (required)
        :return: DrivesDriveFirmwareUpdate
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['lnn', 'driveid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list_drives_drive_firmware_update" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `list_drives_drive_firmware_update`")  # noqa: E501
        # verify the required parameter 'driveid' is set
        if ('driveid' not in params or
                params['driveid'] is None):
            raise ValueError("Missing the required parameter `driveid` when calling `list_drives_drive_firmware_update`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501
        if 'driveid' in params:
            path_params['Driveid'] = params['driveid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/drives/{Driveid}/firmware/update', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DrivesDriveFirmwareUpdate',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def update_node_driveconfig(self, node_driveconfig, lnn, **kwargs):  # noqa: E501
        """update_node_driveconfig  # noqa: E501

        Modify a node's drive subsystem XML configuration file.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_driveconfig(node_driveconfig, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeDriveconfigExtended node_driveconfig: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_node_driveconfig_with_http_info(node_driveconfig, lnn, **kwargs)  # noqa: E501
        else:
            (data) = self.update_node_driveconfig_with_http_info(node_driveconfig, lnn, **kwargs)  # noqa: E501
            return data

    def update_node_driveconfig_with_http_info(self, node_driveconfig, lnn, **kwargs):  # noqa: E501
        """update_node_driveconfig  # noqa: E501

        Modify a node's drive subsystem XML configuration file.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_driveconfig_with_http_info(node_driveconfig, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeDriveconfigExtended node_driveconfig: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['node_driveconfig', 'lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_node_driveconfig" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_driveconfig' is set
        if ('node_driveconfig' not in params or
                params['node_driveconfig'] is None):
            raise ValueError("Missing the required parameter `node_driveconfig` when calling `update_node_driveconfig`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `update_node_driveconfig`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_driveconfig' in params:
            body_params = params['node_driveconfig']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/7/cluster/nodes/{Lnn}/driveconfig', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def update_node_state_readonly(self, node_state_readonly, lnn, **kwargs):  # noqa: E501
        """update_node_state_readonly  # noqa: E501

        Modify one or more node readonly state settings.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_readonly(node_state_readonly, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateReadonlyExtended node_state_readonly: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_node_state_readonly_with_http_info(node_state_readonly, lnn, **kwargs)  # noqa: E501
        else:
            (data) = self.update_node_state_readonly_with_http_info(node_state_readonly, lnn, **kwargs)  # noqa: E501
            return data

    def update_node_state_readonly_with_http_info(self, node_state_readonly, lnn, **kwargs):  # noqa: E501
        """update_node_state_readonly  # noqa: E501

        Modify one or more node readonly state settings.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_readonly_with_http_info(node_state_readonly, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateReadonlyExtended node_state_readonly: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['node_state_readonly', 'lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_node_state_readonly" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_state_readonly' is set
        if ('node_state_readonly' not in params or
                params['node_state_readonly'] is None):
            raise ValueError("Missing the required parameter `node_state_readonly` when calling `update_node_state_readonly`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `update_node_state_readonly`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_state_readonly' in params:
            body_params = params['node_state_readonly']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/state/readonly', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def update_node_state_servicelight(self, node_state_servicelight, lnn, **kwargs):  # noqa: E501
        """update_node_state_servicelight  # noqa: E501

        Modify one or more node service light state settings.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_servicelight(node_state_servicelight, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateServicelightExtended node_state_servicelight: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_node_state_servicelight_with_http_info(node_state_servicelight, lnn, **kwargs)  # noqa: E501
        else:
            (data) = self.update_node_state_servicelight_with_http_info(node_state_servicelight, lnn, **kwargs)  # noqa: E501
            return data

    def update_node_state_servicelight_with_http_info(self, node_state_servicelight, lnn, **kwargs):  # noqa: E501
        """update_node_state_servicelight  # noqa: E501

        Modify one or more node service light state settings.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_servicelight_with_http_info(node_state_servicelight, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateServicelightExtended node_state_servicelight: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['node_state_servicelight', 'lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_node_state_servicelight" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_state_servicelight' is set
        if ('node_state_servicelight' not in params or
                params['node_state_servicelight'] is None):
            raise ValueError("Missing the required parameter `node_state_servicelight` when calling `update_node_state_servicelight`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `update_node_state_servicelight`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_state_servicelight' in params:
            body_params = params['node_state_servicelight']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/state/servicelight', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def update_node_state_smartfail(self, node_state_smartfail, lnn, **kwargs):  # noqa: E501
        """update_node_state_smartfail  # noqa: E501

        Modify smartfail state of the node. Only the 'smartfailed' body member has any effect on node smartfail state.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_smartfail(node_state_smartfail, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateSmartfailExtended node_state_smartfail: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_node_state_smartfail_with_http_info(node_state_smartfail, lnn, **kwargs)  # noqa: E501
        else:
            (data) = self.update_node_state_smartfail_with_http_info(node_state_smartfail, lnn, **kwargs)  # noqa: E501
            return data

    def update_node_state_smartfail_with_http_info(self, node_state_smartfail, lnn, **kwargs):  # noqa: E501
        """update_node_state_smartfail  # noqa: E501

        Modify smartfail state of the node. Only the 'smartfailed' body member has any effect on node smartfail state.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_node_state_smartfail_with_http_info(node_state_smartfail, lnn, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param NodeStateSmartfailExtended node_state_smartfail: (required)
        :param int lnn: (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['node_state_smartfail', 'lnn']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_node_state_smartfail" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'node_state_smartfail' is set
        if ('node_state_smartfail' not in params or
                params['node_state_smartfail'] is None):
            raise ValueError("Missing the required parameter `node_state_smartfail` when calling `update_node_state_smartfail`")  # noqa: E501
        # verify the required parameter 'lnn' is set
        if ('lnn' not in params or
                params['lnn'] is None):
            raise ValueError("Missing the required parameter `lnn` when calling `update_node_state_smartfail`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'lnn' in params:
            path_params['Lnn'] = params['lnn']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'node_state_smartfail' in params:
            body_params = params['node_state_smartfail']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicAuth']  # noqa: E501

        return self.api_client.call_api(
            '/platform/3/cluster/nodes/{Lnn}/state/smartfail', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
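
    # Note (illustrative, not part of the generated client): the plain wrapper
    # methods above force `_return_http_data_only=True`, so they return only
    # the deserialized response body. To also inspect the HTTP status code and
    # response headers, a caller can use the `_with_http_info` variant
    # directly, in which case `call_api` returns a
    # `(data, status_code, headers)` tuple, e.g.:
    #
    #     data, status, headers = api.get_node_status_with_http_info(
    #         5, _return_http_data_only=False)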
import unittest
import datetime
from flask import request, url_for
from labsys.app import create_app, db
from labsys.auth.models import Role, User
from labsys.inventory import models


class TestInventoryViews(unittest.TestCase):

    def setUp(self):
        self.app = create_app('testing')
        self.app_context = self.app.app_context()
        self.app_context.push()
        self.client = self.app.test_client(use_cookies=True)
        db.create_all()
        db.session.add(models.Stock(name='Test Stock'))
        db.session.commit()
        Role.insert_roles()
        staff = Role.query.filter_by(name='Staff').first()
        self.user = User(email='user@example.com',
                         password='example', role=staff, confirmed=True)
        self.user.role = Role.query.filter_by(name='Staff').first()

    def tearDown(self):
        db.session.remove()
        db.drop_all()
        self.app_context.pop()

    def test_purchase_same_product_two_catalogs_same_lot(self):
        spec1 = models.Specification('cat1', 'man1', units=1)
        db.session.add(spec1)
        spec2 = models.Specification('cat2', 'man1', units=10)
        db.session.add(spec2)
        product1 = models.Product('Product1', spec1)
        product1.specifications.append(spec2)
        db.session.add(product1)
        db.session.commit()
        with self.client as client:
            # login
            res = client.post(url_for('auth.login'), data={
                'email': 'user@example.com',
                'password': 'example',
            }, follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            # Go to purchase screen
            res = client.get(url_for('inventory.purchase_product'),
                             follow_redirects=False)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/add', request.path)
            # Add order_item to cart
            order_item_dict = {
                'item_id': spec1.id,
                'amount': 5,
                'lot_number': 'lot1',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Add another order_item to cart, same lot
            order_item_dict = {
                'item_id': spec2.id,
                'amount': 5,
                'lot_number': 'lot1',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Go to checkout
            res = client.post(url_for('inventory.purchase_product'),
                              data={'finish_order': True},
                              follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/checkout', request.path)
            # Finish order
            order_dict = {
                'invoice_type': 'Nota Fiscal',
                'invoice': 'test invoice',
                'financier': 'test financier',
                'notes': 'test notes',
            }
            client.post(url_for('inventory.checkout'),
                        data=order_dict,
                        follow_redirects=True)
            # Assert 2 transactions were created
            transactions = models.Transaction.query.all()
            self.assertEqual(len(transactions), 2)
            transaction1 = transactions[0]
            # Assert user was correctly assigned
            self.assertEqual(transaction1.user, self.user)
            order = models.Order.query.first()
            self.assertEqual(order.user, self.user)
            # Assert only 1 stock_product was created (same lot)
            stock_products = models.StockProduct.query.all()
            self.assertEqual(len(stock_products), 1)
            # Assert products were added to stock
            stock = models.Stock.get_reactive_stock()
            self.assertEqual(stock_products, stock.stock_products)
            # Assert its amount is 5 + 50
            stock_product = stock_products[0]
            self.assertEqual(stock_product.amount, 55)

    def test_purchase_same_product_two_catalogs_diff_lot(self):
        spec1 = models.Specification('cat1', 'man1', units=1)
        db.session.add(spec1)
        spec2 = models.Specification('cat2', 'man1', units=10)
        db.session.add(spec2)
        product1 = models.Product('Product1', spec1)
        product1.specifications.append(spec2)
        db.session.add(product1)
        db.session.commit()
        with self.client as client:
            # login
            res = client.post(url_for('auth.login'), data={
                'email': 'user@example.com',
                'password': 'example',
            }, follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            # Go to purchase screen
            res = client.get(url_for('inventory.purchase_product'),
                             follow_redirects=False)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/add', request.path)
            # Add order_item to cart
            order_item_dict = {
                'item_id': spec1.id,
                'amount': 5,
                'lot_number': 'lot1',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Add another order_item to cart, different lot
            order_item_dict = {
                'item_id': spec2.id,
                'amount': 5,
                'lot_number': 'lot2',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Go to checkout
            res = client.post(url_for('inventory.purchase_product'),
                              data={'finish_order': True},
                              follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/checkout', request.path)
            # Finish order
            order_dict = {
                'invoice_type': 'Nota Fiscal',
                'invoice': 'test invoice',
                'financier': 'test financier',
                'notes': 'test notes',
            }
            client.post(url_for('inventory.checkout'),
                        data=order_dict,
                        follow_redirects=True)
            # Assert 2 transactions were created
            transactions = models.Transaction.query.all()
            self.assertEqual(len(transactions), 2)
            transaction1 = transactions[0]
            # Assert user was correctly assigned
            self.assertEqual(transaction1.user, self.user)
            order = models.Order.query.first()
            self.assertEqual(order.user, self.user)
            # Assert 2 stock_products were created (different lots)
            stock_products = models.StockProduct.query.all()
            self.assertEqual(len(stock_products), 2)
            # Assert their amounts are 5 AND 50
            stock_product = stock_products[0]
            self.assertEqual(stock_product.amount, 5)
            stock_product = stock_products[1]
            self.assertEqual(stock_product.amount, 50)

    def test_purchase_2_diff_products(self):
        spec1 = models.Specification('cat1', 'man1', units=1)
        db.session.add(spec1)
        spec2 = models.Specification('cat2', 'man1', units=10)
        db.session.add(spec2)
        product1 = models.Product('Product1', spec1)
        product2 = models.Product('Product2', spec2)
        db.session.add(product1)
        db.session.add(product2)
        db.session.commit()
        with self.client as client:
            # login
            res = client.post(url_for('auth.login'), data={
                'email': 'user@example.com',
                'password': 'example',
            }, follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            # Go to purchase screen
            res = client.get(url_for('inventory.purchase_product'),
                             follow_redirects=False)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/add', request.path)
            # Add order_item to cart
            order_item_dict = {
                'item_id': spec1.id,
                'amount': 5,
                'lot_number': 'lot1',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Add another order_item to cart, different lot
            order_item_dict = {
                'item_id': spec2.id,
                'amount': 5,
                'lot_number': 'lot2',
                'expiration_date': '2018-03-21',
            }
            res = client.post(url_for('inventory.purchase_product'),
                              data=order_item_dict,
                              follow_redirects=True)
            self.assertIn('adicionado ao carrinho', res.get_data(as_text=True))
            # Go to checkout
            res = client.post(url_for('inventory.purchase_product'),
                              data={'finish_order': True},
                              follow_redirects=True)
            self.assertEqual(res.status_code, 200)
            self.assertEqual('/inventory/orders/checkout', request.path)
            # Finish order
            order_dict = {
                'invoice_type': 'Nota Fiscal',
                'invoice': 'test invoice',
                'financier': 'test financier',
                'notes': 'test notes',
            }
            client.post(url_for('inventory.checkout'),
                        data=order_dict,
                        follow_redirects=True)
            # Assert 2 transactions were created
            transactions = models.Transaction.query.all()
            self.assertEqual(len(transactions), 2)
            transaction1 = transactions[0]
            # Assert user was correctly assigned
            self.assertEqual(transaction1.user, self.user)
            order = models.Order.query.first()
            self.assertEqual(order.user, self.user)
            self.assertEqual(order.notes, 'test notes')
            # Assert 2 stock_products were created (different lots)
            stock_products = models.StockProduct.query.all()
            self.assertEqual(len(stock_products), 2)
            # Assert their amounts are 5 AND 50
            stock_product = stock_products[0]
            self.assertEqual(stock_product.amount, 5)
            stock_product = stock_products[1]
            self.assertEqual(stock_product.amount, 50)
            # Assert there are 2 products
            self.assertEqual(len(models.Product.query.all()), 2)
def test_consume_product(self):
spec1 = models.Specification('cat1', 'man1')
prod1 = models.Product('Prod1', spec1)
# Add product to stock
models.Stock.get_reactive_stock().add(prod1, 'lot1', datetime.date.today(), 10)
stock_products = models.StockProduct.query.all()
# Asserts they were added
self.assertEqual(len(stock_products), 1)
self.assertEqual(stock_products[0].amount, 10)
with self.client as client:
# login
res = client.post(url_for('auth.login'), data={
'email': 'user@example.com',
'password': 'example',
}, follow_redirects=True)
self.assertEqual(res.status_code, 200)
# Try to consume amount greater than available
greater_amount_data = {
'stock_product_id': stock_products[0].id,
'amount': 11,
}
res = client.post(url_for('inventory.consume_product'),
data=greater_amount_data,
follow_redirects=True)
self.assertIn('Não há o suficiente', res.get_data(as_text=True))
# Assert stock is intact
self.assertEqual(stock_products[0].amount, 10)
# Try to consume a sufficient amount
sufficient_amount_data = {
'stock_product_id': stock_products[0].id,
'amount': 9,
}
res = client.post(url_for('inventory.consume_product'),
data=sufficient_amount_data,
follow_redirects=True)
self.assertIn('removidas do estoque', res.get_data(as_text=True))
# Assert stock was subtracted
self.assertEqual(stock_products[0].amount, 1)
# Assert transaction was created
transactions = models.Transaction.query.all()
self.assertEqual(len(transactions), 1)
self.assertEqual(transactions[0].amount, 9)
'''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def associate_approval_rule_template_with_repository(approvalRuleTemplateName=None, repositoryName=None):
"""
Creates an association between an approval rule template and a specified repository. Then, the next time a pull request is created in the repository where the destination reference (if specified) matches the destination reference (branch) for the pull request, an approval rule that matches the template conditions is automatically created for that pull request. If no destination references are specified in the template, an approval rule that matches the template contents is created for all pull requests in that repository.
See also: AWS API Documentation
Exceptions
:example: response = client.associate_approval_rule_template_with_repository(
approvalRuleTemplateName='string',
repositoryName='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name for the approval rule template.\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that you want to associate with the template.\n
:returns:
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.MaximumRuleTemplatesAssociatedWithRepositoryException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def batch_associate_approval_rule_template_with_repositories(approvalRuleTemplateName=None, repositoryNames=None):
"""
Creates an association between an approval rule template and one or more specified repositories.
See also: AWS API Documentation
Exceptions
:example: response = client.batch_associate_approval_rule_template_with_repositories(
approvalRuleTemplateName='string',
repositoryNames=[
'string',
]
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the template you want to associate with one or more repositories.\n
:type repositoryNames: list
:param repositoryNames: [REQUIRED]\nThe names of the repositories you want to associate with the template.\n\nNote\nThe length constraint limit is for each string in the array. The array itself can be empty.\n\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'associatedRepositoryNames': [
'string',
],
'errors': [
{
'repositoryName': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
Response Structure
(dict) --
associatedRepositoryNames (list) --
A list of names of the repositories that have been associated with the template.
(string) --
errors (list) --
A list of any errors that might have occurred while attempting to create the association between the template and the repositories.
(dict) --
Returns information about errors in a BatchAssociateApprovalRuleTemplateWithRepositories operation.
repositoryName (string) --
The name of the repository where the association was not made.
errorCode (string) --
An error code that specifies whether the repository name was not valid or not found.
errorMessage (string) --
An error message that provides details about why the repository name was not found or not valid.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNamesRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryNamesExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'associatedRepositoryNames': [
'string',
],
'errors': [
{
'repositoryName': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
:returns:
(string) --
"""
pass
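# The batch association response documented above separates successful
# associations from per-repository errors. A minimal sketch of consuming that
# shape (the sample response dict is hypothetical, not output from a live call):

```python
# Hypothetical sample of a batch_associate_approval_rule_template_with_repositories
# response; a real one would come from the boto3 CodeCommit client.
response = {
    'associatedRepositoryNames': ['repo-a', 'repo-b'],
    'errors': [
        {'repositoryName': 'repo-c',
         'errorCode': 'RepositoryDoesNotExistException',
         'errorMessage': 'repo-c was not found'},
    ],
}


def summarize_association(response):
    """Split the response into (associated names, {failed name: error code})."""
    ok = list(response.get('associatedRepositoryNames', []))
    failed = {e['repositoryName']: e['errorCode']
              for e in response.get('errors', [])}
    return ok, failed


ok, failed = summarize_association(response)
```

# Because the operation succeeds partially, checking the errors list is required
# even when the call itself raises no exception.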
def batch_describe_merge_conflicts(repositoryName=None, destinationCommitSpecifier=None, sourceCommitSpecifier=None, mergeOption=None, maxMergeHunks=None, maxConflictFiles=None, filePaths=None, conflictDetailLevel=None, conflictResolutionStrategy=None, nextToken=None):
"""
Returns information about one or more merge conflicts in the attempted merge of two commit specifiers using the squash or three-way merge strategy.
See also: AWS API Documentation
Exceptions
:example: response = client.batch_describe_merge_conflicts(
repositoryName='string',
destinationCommitSpecifier='string',
sourceCommitSpecifier='string',
mergeOption='FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
maxMergeHunks=123,
maxConflictFiles=123,
filePaths=[
'string',
],
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
nextToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the merge conflicts you want to review.\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type mergeOption: string
:param mergeOption: [REQUIRED]\nThe merge option or strategy you want to use to merge the code.\n
:type maxMergeHunks: integer
:param maxMergeHunks: The maximum number of merge hunks to include in the output.
:type maxConflictFiles: integer
:param maxConflictFiles: The maximum number of files to include in the output.
:type filePaths: list
:param filePaths: The path of the target files used to describe the conflicts. If not specified, the default is all conflict files.\n\n(string) --\n\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:rtype: dict
ReturnsResponse Syntax
{
'conflicts': [
{
'conflictMetadata': {
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
'mergeHunks': [
{
'isConflict': True|False,
'source': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'destination': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'base': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
}
},
]
},
],
'nextToken': 'string',
'errors': [
{
'filePath': 'string',
'exceptionName': 'string',
'message': 'string'
},
],
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string'
}
Response Structure
(dict) --
conflicts (list) --
A list of conflicts for each file, including the conflict metadata and the hunks of the differences between the files.
(dict) --
Information about conflicts in a merge operation.
conflictMetadata (dict) --
Metadata about a conflict in a merge operation.
filePath (string) --
The path of the file that contains conflicts.
fileSizes (dict) --
The file sizes of the file in the source, destination, and base of the merge.
source (integer) --
The size of a file in the source of a merge or pull request.
destination (integer) --
The size of a file in the destination of a merge or pull request.
base (integer) --
The size of a file in the base of a merge or pull request.
fileModes (dict) --
The file modes of the file in the source, destination, and base of the merge.
source (string) --
The file mode of a file in the source of a merge or pull request.
destination (string) --
The file mode of a file in the destination of a merge or pull request.
base (string) --
The file mode of a file in the base of a merge or pull request.
objectTypes (dict) --
Information about any object type conflicts in a merge operation.
source (string) --
The type of the object in the source branch.
destination (string) --
The type of the object in the destination branch.
base (string) --
The type of the object in the base commit of the merge.
numberOfConflicts (integer) --
The number of conflicts, including both hunk conflicts and metadata conflicts.
isBinaryFile (dict) --
A boolean value (true or false) indicating whether the file is binary or textual in the source, destination, and base of the merge.
source (boolean) --
The binary or non-binary status of file in the source of a merge or pull request.
destination (boolean) --
The binary or non-binary status of a file in the destination of a merge or pull request.
base (boolean) --
The binary or non-binary status of a file in the base of a merge or pull request.
contentConflict (boolean) --
A boolean value indicating whether there are conflicts in the content of a file.
fileModeConflict (boolean) --
A boolean value indicating whether there are conflicts in the file mode of a file.
objectTypeConflict (boolean) --
A boolean value (true or false) indicating whether there are conflicts between the branches in the object type of a file, folder, or submodule.
mergeOperations (dict) --
Whether an add, modify, or delete operation caused the conflict between the source and destination of the merge.
source (string) --
The operation (add, modify, or delete) on a file in the source of a merge or pull request.
destination (string) --
The operation on a file in the destination of a merge or pull request.
mergeHunks (list) --
A list of hunks that contain the differences between files or lines causing the conflict.
(dict) --
Information about merge hunks in a merge or pull request operation.
isConflict (boolean) --
A Boolean value indicating whether a combination of hunks contains a conflict. Conflicts occur when the same file or the same lines in a file were modified in both the source and destination of a merge or pull request. Valid values include true, false, and null. True when the hunk represents a conflict and one or more files contain a line conflict. File mode conflicts in a merge do not set this to true.
source (dict) --
Information about the merge hunk in the source of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
destination (dict) --
Information about the merge hunk in the destination of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
base (dict) --
Information about the merge hunk in the base of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
errors (list) --
A list of any errors returned while describing the merge conflicts for each file.
(dict) --
Returns information about errors in a BatchDescribeMergeConflicts operation.
filePath (string) --
The path to the file.
exceptionName (string) --
The name of the exception.
message (string) --
The message provided by the exception.
destinationCommitId (string) --
The commit ID of the destination commit specifier that was used in the merge evaluation.
sourceCommitId (string) --
The commit ID of the source commit specifier that was used in the merge evaluation.
baseCommitId (string) --
The commit ID of the merge base.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.InvalidMaxConflictFilesException
CodeCommit.Client.exceptions.InvalidMaxMergeHunksException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'conflicts': [
{
'conflictMetadata': {
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
'mergeHunks': [
{
'isConflict': True|False,
'source': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'destination': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'base': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
}
},
]
},
],
'nextToken': 'string',
'errors': [
{
'filePath': 'string',
'exceptionName': 'string',
'message': 'string'
},
],
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string'
}
"""
pass
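# Since hunkContent is documented as base-64 encoded, a consumer has to decode
# it before inspecting the conflicting text. A sketch over a hypothetical
# response fragment (sample data, not a real merge evaluation):

```python
import base64

# Hypothetical fragment shaped like a batch_describe_merge_conflicts response.
response = {
    'conflicts': [
        {
            'conflictMetadata': {'filePath': 'src/app.py', 'numberOfConflicts': 1},
            'mergeHunks': [
                {
                    'isConflict': True,
                    'source': {'startLine': 3, 'endLine': 3,
                               'hunkContent': base64.b64encode(b'x = 1').decode()},
                    'destination': {'startLine': 3, 'endLine': 3,
                                    'hunkContent': base64.b64encode(b'x = 2').decode()},
                },
            ],
        },
    ],
}


def conflicting_hunks(response):
    """Yield (file path, decoded source text, decoded destination text)."""
    for conflict in response.get('conflicts', []):
        path = conflict['conflictMetadata']['filePath']
        for hunk in conflict.get('mergeHunks', []):
            if hunk.get('isConflict'):
                src = base64.b64decode(hunk['source']['hunkContent']).decode()
                dst = base64.b64decode(hunk['destination']['hunkContent']).decode()
                yield path, src, dst


hunks = list(conflicting_hunks(response))
```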
def batch_disassociate_approval_rule_template_from_repositories(approvalRuleTemplateName=None, repositoryNames=None):
"""
Removes the association between an approval rule template and one or more specified repositories.
See also: AWS API Documentation
Exceptions
:example: response = client.batch_disassociate_approval_rule_template_from_repositories(
approvalRuleTemplateName='string',
repositoryNames=[
'string',
]
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the template that you want to disassociate from one or more repositories.\n
:type repositoryNames: list
:param repositoryNames: [REQUIRED]\nThe repository names that you want to disassociate from the approval rule template.\n\nNote\nThe length constraint limit is for each string in the array. The array itself can be empty.\n\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'disassociatedRepositoryNames': [
'string',
],
'errors': [
{
'repositoryName': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
Response Structure
(dict) --
disassociatedRepositoryNames (list) --
A list of repository names that have had their association with the template removed.
(string) --
errors (list) --
A list of any errors that might have occurred while attempting to remove the association between the template and the repositories.
(dict) --
Returns information about errors in a BatchDisassociateApprovalRuleTemplateFromRepositories operation.
repositoryName (string) --
The name of the repository where the association with the template was not able to be removed.
errorCode (string) --
An error code that specifies whether the repository name was not valid or not found.
errorMessage (string) --
An error message that provides details about why the repository name was either not found or not valid.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNamesRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryNamesExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'disassociatedRepositoryNames': [
'string',
],
'errors': [
{
'repositoryName': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
:returns:
(string) --
"""
pass
def batch_get_commits(commitIds=None, repositoryName=None):
"""
Returns information about the contents of one or more commits in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.batch_get_commits(
commitIds=[
'string',
],
repositoryName='string'
)
:type commitIds: list
:param commitIds: [REQUIRED]\nThe full commit IDs of the commits to get information about.\n\nNote\nYou must supply the full SHA IDs of each commit. You cannot use shortened SHA IDs.\n\n\n(string) --\n\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the commits.\n
:rtype: dict
ReturnsResponse Syntax
{
'commits': [
{
'commitId': 'string',
'treeId': 'string',
'parents': [
'string',
],
'message': 'string',
'author': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'committer': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'additionalData': 'string'
},
],
'errors': [
{
'commitId': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
Response Structure
(dict) --
commits (list) --
An array of commit data type objects, each of which contains information about a specified commit.
(dict) --
Returns information about a specific commit.
commitId (string) --
The full SHA ID of the specified commit.
treeId (string) --
Tree information for the specified commit.
parents (list) --
A list of parent commits for the specified commit. Each parent commit ID is the full commit ID.
(string) --
message (string) --
The commit message associated with the specified commit.
author (dict) --
Information about the author of the specified commit. Information includes the date in timestamp format with GMT offset, the name of the author, and the email address for the author, as configured in Git.
name (string) --
The name of the user who made the specified commit.
email (string) --
The email address associated with the user who made the commit, if any.
date (string) --
The date when the specified commit was committed, in timestamp format with GMT offset.
committer (dict) --
Information about the person who committed the specified commit, also known as the committer. Information includes the date in timestamp format with GMT offset, the name of the committer, and the email address for the committer, as configured in Git.
For more information about the difference between an author and a committer in Git, see Viewing the Commit History in Pro Git by Scott Chacon and Ben Straub.
name (string) --
The name of the user who made the specified commit.
email (string) --
The email address associated with the user who made the commit, if any.
date (string) --
The date when the specified commit was committed, in timestamp format with GMT offset.
additionalData (string) --
Any other data associated with the specified commit.
errors (list) --
Returns any commit IDs for which information could not be found. For example, if one of the commit IDs was a shortened SHA ID or that commit was not found in the specified repository, the ID returns an error object with more information.
(dict) --
Returns information about errors in a BatchGetCommits operation.
commitId (string) --
A commit ID that either could not be found or was not in a valid format.
errorCode (string) --
An error code that specifies whether the commit ID was not valid or not found.
errorMessage (string) --
An error message that provides detail about why the commit ID either was not found or was not valid.
Exceptions
CodeCommit.Client.exceptions.CommitIdsListRequiredException
CodeCommit.Client.exceptions.CommitIdsLimitExceededException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commits': [
{
'commitId': 'string',
'treeId': 'string',
'parents': [
'string',
],
'message': 'string',
'author': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'committer': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'additionalData': 'string'
},
],
'errors': [
{
'commitId': 'string',
'errorCode': 'string',
'errorMessage': 'string'
},
]
}
:returns:
(string) --
"""
pass
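# batch_get_commits likewise returns a commits array alongside an errors array
# (for example, for shortened SHA IDs, which the operation rejects). A sketch
# over hypothetical sample data:

```python
# Hypothetical sample shaped like a batch_get_commits response; the error code
# string is illustrative only.
response = {
    'commits': [
        {'commitId': 'a' * 40, 'message': 'Initial commit',
         'author': {'name': 'Mary Major', 'email': 'mary@example.com',
                    'date': '1588097262 +0000'}},
    ],
    'errors': [
        {'commitId': 'abc123',  # shortened SHA IDs cannot be resolved
         'errorCode': 'CommitIdNotFound',
         'errorMessage': 'The specified commit ID was not found.'},
    ],
}

# Map resolved commit IDs to their messages, and collect the IDs that failed.
messages = {c['commitId']: c['message'] for c in response['commits']}
bad_ids = [e['commitId'] for e in response['errors']]
```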
def batch_get_repositories(repositoryNames=None):
"""
Returns information about one or more repositories.
See also: AWS API Documentation
Exceptions
:example: response = client.batch_get_repositories(
repositoryNames=[
'string',
]
)
:type repositoryNames: list
:param repositoryNames: [REQUIRED]\nThe names of the repositories to get information about.\n\nNote\nThe length constraint limit is for each string in the array. The array itself can be empty.\n\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'repositories': [
{
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
},
],
'repositoriesNotFound': [
'string',
]
}
Response Structure
(dict) --Represents the output of a batch get repositories operation.
repositories (list) --A list of repositories returned by the batch get repositories operation.
(dict) --Information about a repository.
accountId (string) --The ID of the AWS account associated with the repository.
repositoryId (string) --The ID of the repository.
repositoryName (string) --The repository\'s name.
repositoryDescription (string) --A comment or description about the repository.
defaultBranch (string) --The repository\'s default branch name.
lastModifiedDate (datetime) --The date and time the repository was last modified, in timestamp format.
creationDate (datetime) --The date and time the repository was created, in timestamp format.
cloneUrlHttp (string) --The URL to use for cloning the repository over HTTPS.
cloneUrlSsh (string) --The URL to use for cloning the repository over SSH.
Arn (string) --The Amazon Resource Name (ARN) of the repository.
repositoriesNotFound (list) --Returns a list of repository names for which information could not be found.
(string) --
Exceptions
CodeCommit.Client.exceptions.RepositoryNamesRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryNamesExceededException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'repositories': [
{
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
},
],
'repositoriesNotFound': [
'string',
]
}
:returns:
(string) --
"""
pass
def can_paginate(operation_name=None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
"""
pass
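A minimal sketch of how can_paginate pairs with get_paginator. The live boto3 calls are shown commented out so the sketch stays self-contained; the client setup and repository output are assumptions.

```python
# The operation name passed to can_paginate is the snake_case method name
# on the client, not the PascalCase API action name.
operation_name = 'list_repositories'

# With a configured client (hypothetical session), the pattern would be:
# client = boto3.client('codecommit')
# if client.can_paginate(operation_name):
#     paginator = client.get_paginator(operation_name)
#     for page in paginator.paginate():
#         for repo in page['repositories']:
#             print(repo['repositoryName'])
```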
def create_approval_rule_template(approvalRuleTemplateName=None, approvalRuleTemplateContent=None, approvalRuleTemplateDescription=None):
"""
Creates a template for approval rules that can then be associated with one or more repositories in your AWS account. When you associate a template with a repository, AWS CodeCommit creates an approval rule that matches the conditions of the template for all pull requests that meet the conditions of the template. For more information, see AssociateApprovalRuleTemplateWithRepository .
See also: AWS API Documentation
Exceptions
:example: response = client.create_approval_rule_template(
approvalRuleTemplateName='string',
approvalRuleTemplateContent='string',
approvalRuleTemplateDescription='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template. Provide descriptive names, because this name is applied to the approval rules created automatically in associated repositories.\n
:type approvalRuleTemplateContent: string
:param approvalRuleTemplateContent: [REQUIRED]\nThe content of the approval rule that is created on pull requests in associated repositories. If you specify one or more destination references (branches), approval rules are created in an associated repository only if their destination references (branches) match those specified in the template.\n\nNote\nWhen you create the content of the approval rule template, you can specify approvers in an approval pool in one of two ways:\n\nCodeCommitApprovers : This option only requires an AWS account and a resource. It can be used for both IAM users and federated access users whose name matches the provided resource name. This is a very powerful option that offers a great deal of flexibility. For example, if you specify the AWS account 123456789012 and Mary_Major , all of the following are counted as approvals coming from that user:\nAn IAM user in the account (arn:aws:iam::123456789012 :user/Mary_Major )\nA federated user identified in IAM as Mary_Major (arn:aws:sts::123456789012 :federated-user/Mary_Major )\n\n\n\nThis option does not recognize an active session of someone assuming the role of CodeCommitReview with a role session name of Mary_Major (arn:aws:sts::123456789012 :assumed-role/CodeCommitReview/Mary_Major ) unless you include a wildcard (*Mary_Major).\n\nFully qualified ARN : This option allows you to specify the fully qualified Amazon Resource Name (ARN) of the IAM user or role.\n\nFor more information about IAM ARNs, wildcards, and formats, see IAM Identifiers in the IAM User Guide .\n\n
:type approvalRuleTemplateDescription: string
:param approvalRuleTemplateDescription: The description of the approval rule template. Consider providing a description that explains what this template does and when it might be appropriate to associate it with repositories.
:rtype: dict
Returns
Response Syntax
{
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
Response Structure
(dict) --
approvalRuleTemplate (dict) --
The content and structure of the created approval rule template.
approvalRuleTemplateId (string) --
The system-generated ID of the approval rule template.
approvalRuleTemplateName (string) --
The name of the approval rule template.
approvalRuleTemplateDescription (string) --
The description of the approval rule template.
approvalRuleTemplateContent (string) --
The content of the approval rule template.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule template.
lastModifiedDate (datetime) --
The date the approval rule template was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule template was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule template.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameAlreadyExistsException
CodeCommit.Client.exceptions.ApprovalRuleTemplateContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateContentException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateDescriptionException
CodeCommit.Client.exceptions.NumberOfRuleTemplatesExceededException
:return: {
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameAlreadyExistsException
CodeCommit.Client.exceptions.ApprovalRuleTemplateContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateContentException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateDescriptionException
CodeCommit.Client.exceptions.NumberOfRuleTemplatesExceededException
"""
pass
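The template content is a JSON document. Below is a hedged sketch of building one locally before calling create_approval_rule_template; the statement shape follows the CodeCommit approval rule content language, and the account ID, branch, and role names are placeholders.

```python
import json

# Approval rule content requiring two approvals from an assumed
# CodeCommitReview role on the main branch; all identifiers below
# are placeholders, not real resources.
template_content = json.dumps({
    'Version': '2018-11-08',
    'DestinationReferences': ['refs/heads/main'],
    'Statements': [{
        'Type': 'Approvers',
        'NumberOfApprovalsNeeded': 2,
        'ApprovalPoolMembers': [
            'arn:aws:sts::123456789012:assumed-role/CodeCommitReview/*',
        ],
    }],
})

# With a configured boto3 client this would be:
# response = client.create_approval_rule_template(
#     approvalRuleTemplateName='2-approver-rule-for-main',
#     approvalRuleTemplateContent=template_content,
#     approvalRuleTemplateDescription='Requires two approvals on main',
# )
```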
def create_branch(repositoryName=None, branchName=None, commitId=None):
"""
Creates a branch in a repository and points the branch to a commit.
See also: AWS API Documentation
Exceptions
:example: response = client.create_branch(
repositoryName='string',
branchName='string',
commitId='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository in which you want to create the new branch.\n
:type branchName: string
:param branchName: [REQUIRED]\nThe name of the new branch to create.\n
:type commitId: string
:param commitId: [REQUIRED]\nThe ID of the commit to point the new branch to.\n
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.BranchNameExistsException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
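create_branch returns no data on success. A sketch of the call shape, with the repository name, branch name, and 40-character commit ID all placeholders:

```python
# All three arguments are required; the commit ID must be the full ID
# of an existing commit (a placeholder 40-hex value is shown here).
branch_kwargs = {
    'repositoryName': 'MyDemoRepo',       # placeholder repository
    'branchName': 'feature/add-docs',
    'commitId': '0123456789abcdef0123456789abcdef01234567',  # placeholder
}

# With a configured boto3 client this would be:
# client = boto3.client('codecommit')
# client.create_branch(**branch_kwargs)
```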
def create_commit(repositoryName=None, branchName=None, parentCommitId=None, authorName=None, email=None, commitMessage=None, keepEmptyFolders=None, putFiles=None, deleteFiles=None, setFileModes=None):
"""
Creates a commit for a repository on the tip of a specified branch.
See also: AWS API Documentation
Exceptions
:example: response = client.create_commit(
repositoryName='string',
branchName='string',
parentCommitId='string',
authorName='string',
email='string',
commitMessage='string',
keepEmptyFolders=True|False,
putFiles=[
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'fileContent': b'bytes',
'sourceFile': {
'filePath': 'string',
'isMove': True|False
}
},
],
deleteFiles=[
{
'filePath': 'string'
},
],
setFileModes=[
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you create the commit.\n
:type branchName: string
:param branchName: [REQUIRED]\nThe name of the branch where you create the commit.\n
:type parentCommitId: string
:param parentCommitId: The ID of the commit that is the parent of the commit you create. Not required if this is an empty repository.
:type authorName: string
:param authorName: The name of the author who created the commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address of the person who created the commit.
:type commitMessage: string
:param commitMessage: The commit message you want to include in the commit. Commit messages are limited to 256 KB. If no message is specified, a default message is used.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If true, a .gitkeep file is created for empty folders. The default is false.
:type putFiles: list
:param putFiles: The files to add or update in this commit.\n\n(dict) --Information about a file added or updated as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path to the file in the repository, including the name of the file.\n\nfileMode (string) --The extrapolated file mode permissions for the file. Valid values include EXECUTABLE and NORMAL.\n\nfileContent (bytes) --The content of the file, if a source file is not specified.\n\nsourceFile (dict) --The name and full path of the file that contains the changes you want to make as part of the commit, if you are not providing the file content directly.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nisMove (boolean) --Whether to remove the source file from the parent commit.\n\n\n\n\n\n\n
:type deleteFiles: list
:param deleteFiles: The files to delete in this commit. These files still exist in earlier commits.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n
:type setFileModes: list
:param setFileModes: The file modes to update for files in this commit.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'commitId': 'string',
'treeId': 'string',
'filesAdded': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'filesUpdated': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'filesDeleted': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
Response Structure
(dict) --
commitId (string) --
The full commit ID of the commit that contains your committed file changes.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains the committed file changes.
filesAdded (list) --
The files added as part of the committed file changes.
(dict) --
A file to be added, updated, or deleted as part of a commit.
absolutePath (string) --
The full path to the file to be added or updated, including the name of the file.
blobId (string) --
The blob ID that contains the file information.
fileMode (string) --
The extrapolated file mode permissions for the file. Valid values include EXECUTABLE and NORMAL.
filesUpdated (list) --
The files updated as part of the committed file changes.
(dict) --
A file to be added, updated, or deleted as part of a commit.
absolutePath (string) --
The full path to the file to be added or updated, including the name of the file.
blobId (string) --
The blob ID that contains the file information.
fileMode (string) --
The extrapolated file mode permissions for the file. Valid values include EXECUTABLE and NORMAL.
filesDeleted (list) --
The files deleted as part of the committed file changes.
(dict) --
A file to be added, updated, or deleted as part of a commit.
absolutePath (string) --
The full path to the file to be added or updated, including the name of the file.
blobId (string) --
The blob ID that contains the file information.
fileMode (string) --
The extrapolated file mode permissions for the file. Valid values include EXECUTABLE and NORMAL.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.FileEntryRequiredException
CodeCommit.Client.exceptions.MaximumFileEntriesExceededException
CodeCommit.Client.exceptions.PutFileEntryConflictException
CodeCommit.Client.exceptions.SourceFileOrContentRequiredException
CodeCommit.Client.exceptions.FileContentAndSourceFileSpecifiedException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.SamePathRequestException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidDeletionParameterException
CodeCommit.Client.exceptions.RestrictedSourceFileException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.NoChangeException
CodeCommit.Client.exceptions.FileNameConflictsWithDirectoryNameException
CodeCommit.Client.exceptions.DirectoryNameConflictsWithFileNameException
CodeCommit.Client.exceptions.FilePathConflictsWithSubmodulePathException
:return: {
'commitId': 'string',
'treeId': 'string',
'filesAdded': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'filesUpdated': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'filesDeleted': [
{
'absolutePath': 'string',
'blobId': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.FileEntryRequiredException
CodeCommit.Client.exceptions.MaximumFileEntriesExceededException
CodeCommit.Client.exceptions.PutFileEntryConflictException
CodeCommit.Client.exceptions.SourceFileOrContentRequiredException
CodeCommit.Client.exceptions.FileContentAndSourceFileSpecifiedException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.SamePathRequestException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidDeletionParameterException
CodeCommit.Client.exceptions.RestrictedSourceFileException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.NoChangeException
CodeCommit.Client.exceptions.FileNameConflictsWithDirectoryNameException
CodeCommit.Client.exceptions.DirectoryNameConflictsWithFileNameException
CodeCommit.Client.exceptions.FilePathConflictsWithSubmodulePathException
"""
pass
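A hedged sketch of assembling a create_commit call that adds one file and deletes another. The payload is built locally so it can be inspected without credentials; the repository, branch, file paths, and parent commit ID are placeholders.

```python
# Files to add or update: fileContent carries the raw bytes when no
# sourceFile is given; fileMode is one of EXECUTABLE, NORMAL, SYMLINK.
put_files = [
    {
        'filePath': 'docs/README.md',
        'fileMode': 'NORMAL',
        'fileContent': b'# Project docs\n',
    },
]
delete_files = [{'filePath': 'old/notes.txt'}]

commit_kwargs = {
    'repositoryName': 'MyDemoRepo',   # placeholder repository
    'branchName': 'main',
    # Required unless the repository is empty; must match the current
    # tip of the branch (placeholder shown).
    'parentCommitId': '0123456789abcdef0123456789abcdef01234567',
    'commitMessage': 'Add docs, remove stale notes',
    'putFiles': put_files,
    'deleteFiles': delete_files,
}

# With a configured boto3 client this would be:
# response = client.create_commit(**commit_kwargs)
# new_commit_id = response['commitId']
```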
def create_pull_request(title=None, description=None, targets=None, clientRequestToken=None):
"""
Creates a pull request in the specified repository.
See also: AWS API Documentation
Exceptions
:example: response = client.create_pull_request(
title='string',
description='string',
targets=[
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string'
},
],
clientRequestToken='string'
)
:type title: string
:param title: [REQUIRED]\nThe title of the pull request. This title is used to identify the pull request to other users in the repository.\n
:type description: string
:param description: A description of the pull request.
:type targets: list
:param targets: [REQUIRED]\nThe targets for the pull request, including the source of the code to be reviewed (the source branch) and the destination where the creator of the pull request intends the code to be merged after the pull request is closed (the destination branch).\n\n(dict) --Returns information about a target for a pull request.\n\nrepositoryName (string) -- [REQUIRED]The name of the repository that contains the pull request.\n\nsourceReference (string) -- [REQUIRED]The branch of the repository that contains the changes for the pull request. Also known as the source branch.\n\ndestinationReference (string) --The branch of the repository where the pull request changes are merged. Also known as the destination branch.\n\n\n\n\n
:type clientRequestToken: string
:param clientRequestToken: A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.\n\nNote\nThe AWS SDKs prepopulate client request tokens. If you are using an AWS SDK, an idempotency token is created for you.\n\nThis field is autopopulated if not provided.\n
:rtype: dict
Returns
Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the newly created pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED .
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.ReferenceNameRequiredException
CodeCommit.Client.exceptions.InvalidReferenceNameException
CodeCommit.Client.exceptions.ReferenceDoesNotExistException
CodeCommit.Client.exceptions.ReferenceTypeNotSupportedException
CodeCommit.Client.exceptions.TitleRequiredException
CodeCommit.Client.exceptions.InvalidTitleException
CodeCommit.Client.exceptions.InvalidDescriptionException
CodeCommit.Client.exceptions.TargetsRequiredException
CodeCommit.Client.exceptions.InvalidTargetsException
CodeCommit.Client.exceptions.TargetRequiredException
CodeCommit.Client.exceptions.InvalidTargetException
CodeCommit.Client.exceptions.MultipleRepositoriesInPullRequestException
CodeCommit.Client.exceptions.MaximumOpenPullRequestsExceededException
CodeCommit.Client.exceptions.SourceAndDestinationAreSameException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.ReferenceNameRequiredException
CodeCommit.Client.exceptions.InvalidReferenceNameException
CodeCommit.Client.exceptions.ReferenceDoesNotExistException
CodeCommit.Client.exceptions.ReferenceTypeNotSupportedException
CodeCommit.Client.exceptions.TitleRequiredException
CodeCommit.Client.exceptions.InvalidTitleException
CodeCommit.Client.exceptions.InvalidDescriptionException
CodeCommit.Client.exceptions.TargetsRequiredException
CodeCommit.Client.exceptions.InvalidTargetsException
CodeCommit.Client.exceptions.TargetRequiredException
CodeCommit.Client.exceptions.InvalidTargetException
CodeCommit.Client.exceptions.MultipleRepositoriesInPullRequestException
CodeCommit.Client.exceptions.MaximumOpenPullRequestsExceededException
CodeCommit.Client.exceptions.SourceAndDestinationAreSameException
"""
pass
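A sketch of assembling the targets list for create_pull_request. All names are placeholders; the destination reference can be omitted, in which case the repository's default branch is used.

```python
# A single target: one repository with a source and destination branch.
targets = [{
    'repositoryName': 'MyDemoRepo',          # placeholder repository
    'sourceReference': 'feature/add-docs',   # source branch
    'destinationReference': 'main',          # destination branch
}]

pr_kwargs = {
    'title': 'Add project docs',             # required
    'description': 'Adds a README under docs/.',
    'targets': targets,
    # clientRequestToken is auto-populated by the SDK if not supplied.
}

# With a configured boto3 client this would be:
# response = client.create_pull_request(**pr_kwargs)
# pull_request_id = response['pullRequest']['pullRequestId']
```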
def create_pull_request_approval_rule(pullRequestId=None, approvalRuleName=None, approvalRuleContent=None):
"""
Creates an approval rule for a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.create_pull_request_approval_rule(
pullRequestId='string',
approvalRuleName='string',
approvalRuleContent='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request for which you want to create the approval rule.\n
:type approvalRuleName: string
:param approvalRuleName: [REQUIRED]\nThe name for the approval rule.\n
:type approvalRuleContent: string
:param approvalRuleContent: [REQUIRED]\nThe content of the approval rule, including the number of approvals needed and the structure of an approval pool defined for approvals, if any. For more information about approval pools, see the AWS CodeCommit User Guide.\n\nNote\nWhen you create the content of the approval rule, you can specify approvers in an approval pool in one of two ways:\n\nCodeCommitApprovers : This option only requires an AWS account and a resource. It can be used for both IAM users and federated access users whose name matches the provided resource name. This is a very powerful option that offers a great deal of flexibility. For example, if you specify the AWS account 123456789012 and Mary_Major , all of the following would be counted as approvals coming from that user:\nAn IAM user in the account (arn:aws:iam::123456789012 :user/Mary_Major )\nA federated user identified in IAM as Mary_Major (arn:aws:sts::123456789012 :federated-user/Mary_Major )\n\n\n\nThis option does not recognize an active session of someone assuming the role of CodeCommitReview with a role session name of Mary_Major (arn:aws:sts::123456789012 :assumed-role/CodeCommitReview/Mary_Major ) unless you include a wildcard (*Mary_Major).\n\nFully qualified ARN : This option allows you to specify the fully qualified Amazon Resource Name (ARN) of the IAM user or role.\n\nFor more information about IAM ARNs, wildcards, and formats, see IAM Identifiers in the IAM User Guide .\n\n
:rtype: dict
Returns
Response Syntax
{
'approvalRule': {
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
}
}
Response Structure
(dict) --
approvalRule (dict) --
Information about the created approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.ApprovalRuleNameAlreadyExistsException
CodeCommit.Client.exceptions.ApprovalRuleContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleContentException
CodeCommit.Client.exceptions.NumberOfRulesExceededException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'approvalRule': {
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
}
}
:returns:
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.ApprovalRuleNameAlreadyExistsException
CodeCommit.Client.exceptions.ApprovalRuleContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleContentException
CodeCommit.Client.exceptions.NumberOfRulesExceededException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
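As a usage sketch, the approval pool note above can be turned into rule content like the following. The '2018-11-08' version string and statement fields follow the approval rule language described in the AWS CodeCommit User Guide; the account ID and pool member ARN are placeholders.

```python
import json

# Compose approvalRuleContent as JSON before calling
# create_pull_request_approval_rule. Placeholders: account 123456789012
# and the CodeCommitReview assumed-role pool member.
rule_content = json.dumps({
    'Version': '2018-11-08',
    'Statements': [{
        'Type': 'Approvers',
        'NumberOfApprovalsNeeded': 2,
        # The wildcard matches any session of the assumed CodeCommitReview
        # role (see the note above about assumed-role sessions).
        'ApprovalPoolMembers': [
            'arn:aws:sts::123456789012:assumed-role/CodeCommitReview/*',
        ],
    }],
})

# The call itself requires a configured CodeCommit client:
# response = client.create_pull_request_approval_rule(
#     pullRequestId='<pull-request-id>',
#     approvalRuleName='Require two approvals',
#     approvalRuleContent=rule_content,
# )
```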
"""
pass
def create_repository(repositoryName=None, repositoryDescription=None, tags=None):
"""
Creates a new, empty repository.
See also: AWS API Documentation
Exceptions
:example: response = client.create_repository(
repositoryName='string',
repositoryDescription='string',
tags={
'string': 'string'
}
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the new repository to be created.\n\nNote\nThe repository name must be unique across the calling AWS account. Repository names are limited to 100 alphanumeric, dash, and underscore characters, and cannot include certain characters. For more information about the limits on repository names, see Limits in the AWS CodeCommit User Guide . The suffix .git is prohibited.\n\n
:type repositoryDescription: string
:param repositoryDescription: A comment or description about the new repository.\n\nNote\nThe description field for a repository accepts all HTML characters and all valid Unicode characters. Applications that do not HTML-encode the description and display it in a webpage can expose users to potentially malicious code. Make sure that you HTML-encode the description field in any application that uses this API to display the repository description on a webpage.\n\n
:type tags: dict
:param tags: One or more tag key-value pairs to use when tagging this repository.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'repositoryMetadata': {
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
}
}
Response Structure
(dict) --
Represents the output of a create repository operation.
repositoryMetadata (dict) --
Information about the newly created repository.
accountId (string) --
The ID of the AWS account associated with the repository.
repositoryId (string) --
The ID of the repository.
repositoryName (string) --
The repository\'s name.
repositoryDescription (string) --
A comment or description about the repository.
defaultBranch (string) --
The repository\'s default branch name.
lastModifiedDate (datetime) --
The date and time the repository was last modified, in timestamp format.
creationDate (datetime) --
The date and time the repository was created, in timestamp format.
cloneUrlHttp (string) --
The URL to use for cloning the repository over HTTPS.
cloneUrlSsh (string) --
The URL to use for cloning the repository over SSH.
Arn (string) --
The Amazon Resource Name (ARN) of the repository.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameExistsException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.InvalidRepositoryDescriptionException
CodeCommit.Client.exceptions.RepositoryLimitExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.InvalidTagsMapException
CodeCommit.Client.exceptions.TooManyTagsException
CodeCommit.Client.exceptions.InvalidSystemTagUsageException
CodeCommit.Client.exceptions.TagPolicyException
:return: {
'repositoryMetadata': {
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameExistsException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.InvalidRepositoryDescriptionException
CodeCommit.Client.exceptions.RepositoryLimitExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.InvalidTagsMapException
CodeCommit.Client.exceptions.TooManyTagsException
CodeCommit.Client.exceptions.InvalidSystemTagUsageException
CodeCommit.Client.exceptions.TagPolicyException
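The naming limits quoted above (at most 100 characters; the .git suffix is prohibited) can be pre-checked client-side. This sketch is only a convenience approximation; the service remains authoritative and raises InvalidRepositoryNameException for names it rejects.

```python
import re

def looks_like_valid_repository_name(name):
    # Client-side approximation of the documented limits; the character
    # class here is an assumption, not the service's exact rule set.
    if not name or len(name) > 100:
        return False
    if name.lower().endswith('.git'):
        return False
    return re.fullmatch(r'[\w.-]+', name) is not None

# if looks_like_valid_repository_name('MyDemoRepo'):
#     response = client.create_repository(
#         repositoryName='MyDemoRepo',
#         repositoryDescription='My demonstration repository',
#         tags={'Team': 'Docs'},
#     )
```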
"""
pass
def create_unreferenced_merge_commit(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, mergeOption=None, conflictDetailLevel=None, conflictResolutionStrategy=None, authorName=None, email=None, commitMessage=None, keepEmptyFolders=None, conflictResolution=None):
"""
Creates an unreferenced commit that represents the result of merging two branches using a specified merge strategy. This can help you determine the outcome of a potential merge. This API cannot be used with the fast-forward merge strategy because that strategy does not create a merge commit.
See also: AWS API Documentation
Exceptions
:example: response = client.create_unreferenced_merge_commit(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
mergeOption='FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
authorName='string',
email='string',
commitMessage='string',
keepEmptyFolders=True|False,
conflictResolution={
'replaceContents': [
{
'filePath': 'string',
'replacementType': 'KEEP_BASE'|'KEEP_SOURCE'|'KEEP_DESTINATION'|'USE_NEW_CONTENT',
'content': b'bytes',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'deleteFiles': [
{
'filePath': 'string'
},
],
'setFileModes': [
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to create the unreferenced merge commit.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type mergeOption: string
:param mergeOption: [REQUIRED]\nThe merge option or strategy you want to use to merge the code.\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type authorName: string
:param authorName: The name of the author who created the unreferenced commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address for the person who created the unreferenced commit.
:type commitMessage: string
:param commitMessage: The commit message for the unreferenced commit.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If this is specified as true, a .gitkeep file is created for empty folders. The default is false.
:type conflictResolution: dict
:param conflictResolution: If AUTOMERGE is the conflict resolution strategy, a list of inputs to use when resolving conflicts during a merge.\n\nreplaceContents (list) --Files to have content replaced as part of the merge conflict resolution.\n\n(dict) --Information about a replacement content entry in the conflict of a merge or pull request operation.\n\nfilePath (string) -- [REQUIRED]The path of the conflicting file.\n\nreplacementType (string) -- [REQUIRED]The replacement type to use when determining how to resolve the conflict.\n\ncontent (bytes) --The base-64 encoded content to use when the replacement type is USE_NEW_CONTENT.\n\nfileMode (string) --The file mode to apply during conflict resolution.\n\n\n\n\n\ndeleteFiles (list) --Files to be deleted as part of the merge conflict resolution.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n\nsetFileModes (list) --File modes that are set as part of the merge conflict resolution.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'commitId': 'string',
'treeId': 'string'
}
Response Structure
(dict) --
commitId (string) --
The full commit ID of the commit that contains your merge results.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains the merge results.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'treeId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
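The conflictResolution parameter described above can be sketched as a plain dict before the call; the file paths and contents here are placeholders.

```python
# Inputs for conflicts that AUTOMERGE cannot resolve on its own:
# replace one file's content, delete another, and mark a script executable.
conflict_resolution = {
    'replaceContents': [{
        'filePath': 'README.md',
        'replacementType': 'USE_NEW_CONTENT',
        'content': b'# Resolved readme\n',
        'fileMode': 'NORMAL',
    }],
    'deleteFiles': [{'filePath': 'obsolete.txt'}],
    'setFileModes': [{'filePath': 'build.sh', 'fileMode': 'EXECUTABLE'}],
}

# response = client.create_unreferenced_merge_commit(
#     repositoryName='MyDemoRepo',
#     sourceCommitSpecifier='feature-branch',
#     destinationCommitSpecifier='main',
#     mergeOption='THREE_WAY_MERGE',
#     conflictResolutionStrategy='AUTOMERGE',
#     conflictResolution=conflict_resolution,
# )
# response['commitId'] then identifies the unreferenced merge commit.
```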
"""
pass
def delete_approval_rule_template(approvalRuleTemplateName=None):
"""
Deletes a specified approval rule template. Deleting a template does not remove approval rules on pull requests already created with the template.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_approval_rule_template(
approvalRuleTemplateName='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template to delete.\n
:rtype: dict
Returns
Response Syntax
{
'approvalRuleTemplateId': 'string'
}
Response Structure
(dict) --
approvalRuleTemplateId (string) --The system-generated ID of the deleted approval rule template. If the template has been previously deleted, the only response is a 200 OK.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateInUseException
:return: {
'approvalRuleTemplateId': 'string'
}
"""
pass
def delete_branch(repositoryName=None, branchName=None):
"""
Deletes a branch from a repository, unless that branch is the default branch for the repository.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_branch(
repositoryName='string',
branchName='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the branch to be deleted.\n
:type branchName: string
:param branchName: [REQUIRED]\nThe name of the branch to delete.\n
:rtype: dict
Returns
Response Syntax
{
'deletedBranch': {
'branchName': 'string',
'commitId': 'string'
}
}
Response Structure
(dict) --
Represents the output of a delete branch operation.
deletedBranch (dict) --
Information about the branch deleted by the operation, including the branch name and the commit ID that was the tip of the branch.
branchName (string) --
The name of the branch.
commitId (string) --
The ID of the last commit made to the branch.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.DefaultBranchCannotBeDeletedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'deletedBranch': {
'branchName': 'string',
'commitId': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.DefaultBranchCannotBeDeletedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
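Because the default branch cannot be deleted (DefaultBranchCannotBeDeletedException), a defensive wrapper can check the repository metadata first. A sketch, assuming `client` is a configured CodeCommit client:

```python
def delete_branch_safely(client, repository_name, branch_name):
    # Refuse locally before the service would reject the call.
    metadata = client.get_repository(repositoryName=repository_name)
    default_branch = metadata['repositoryMetadata']['defaultBranch']
    if branch_name == default_branch:
        raise ValueError('refusing to delete default branch %r' % branch_name)
    response = client.delete_branch(
        repositoryName=repository_name, branchName=branch_name)
    # deletedBranch carries the branch name and the commit at its former tip.
    return response['deletedBranch']
```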
"""
pass
def delete_comment_content(commentId=None):
"""
Deletes the content of a comment made on a change, file, or commit in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_comment_content(
commentId='string'
)
:type commentId: string
:param commentId: [REQUIRED]\nThe unique, system-generated ID of the comment. To get this ID, use GetCommentsForComparedCommit or GetCommentsForPullRequest .\n
:rtype: dict
Returns
Response Syntax
{
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
comment (dict) --Information about the comment you just deleted.
commentId (string) --The system-generated comment ID.
content (string) --The content of the comment.
inReplyTo (string) --The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
CodeCommit.Client.exceptions.CommentDeletedException
:return: {
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
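Deleting content leaves the comment record in place, with the response echoing the comment and its deleted flag set. A small helper for interpreting that response:

```python
def comment_content_removed(response):
    # True once the comment's content has been deleted; the comment
    # record itself (ID, dates, author) remains.
    comment = response.get('comment', {})
    return bool(comment.get('deleted'))

# The comment ID comes from GetCommentsForComparedCommit or
# GetCommentsForPullRequest, as noted above:
# response = client.delete_comment_content(commentId='<comment-id>')
```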
"""
pass
def delete_file(repositoryName=None, branchName=None, filePath=None, parentCommitId=None, keepEmptyFolders=None, commitMessage=None, name=None, email=None):
"""
Deletes a specified file from a specified branch. A commit is created on the branch that contains the revision. The file still exists in commits prior to the commit that contains the deletion.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_file(
repositoryName='string',
branchName='string',
filePath='string',
parentCommitId='string',
keepEmptyFolders=True|False,
commitMessage='string',
name='string',
email='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the file to delete.\n
:type branchName: string
:param branchName: [REQUIRED]\nThe name of the branch where the commit that deletes the file is made.\n
:type filePath: string
:param filePath: [REQUIRED]\nThe fully qualified path to the file to be deleted, including the full name and extension of that file. For example, /examples/file.md is a fully qualified path to a file named file.md in a folder named examples.\n
:type parentCommitId: string
:param parentCommitId: [REQUIRED]\nThe ID of the commit that is the tip of the branch where you want to create the commit that deletes the file. This must be the HEAD commit for the branch. The commit that deletes the file is created from this commit ID.\n
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If a file is the only object in the folder or directory, specifies whether to delete the folder or directory that contains the file. By default, empty folders are deleted. This includes empty folders that are part of the directory structure. For example, if the path to a file is dir1/dir2/dir3/dir4, and dir2 and dir3 are empty, deleting the last file in dir4 also deletes the empty folders dir4, dir3, and dir2.
:type commitMessage: string
:param commitMessage: The commit message you want to include as part of deleting the file. Commit messages are limited to 256 KB. If no message is specified, a default message is used.
:type name: string
:param name: The name of the author of the commit that deletes the file. If no name is specified, the user\'s ARN is used as the author name and committer name.
:type email: string
:param email: The email address for the commit that deletes the file. If no email address is specified, the email address is left blank.
:rtype: dict
Returns
Response Syntax
{
'commitId': 'string',
'blobId': 'string',
'treeId': 'string',
'filePath': 'string'
}
Response Structure
(dict) --
commitId (string) --
The full commit ID of the commit that contains the change that deletes the file.
blobId (string) --
The blob ID removed from the tree as part of deleting the file.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains the delete file change.
filePath (string) --
The fully qualified path to the file to be deleted, including the full name and extension of that file.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'blobId': 'string',
'treeId': 'string',
'filePath': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
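Because parentCommitId must be the HEAD commit of the branch, a common pattern reads the tip with get_branch immediately before the call. A sketch, assuming `client` is a configured CodeCommit client:

```python
def delete_file_at_tip(client, repository_name, branch_name, file_path):
    # Read the current branch tip; delete_file requires it as the parent.
    tip = client.get_branch(
        repositoryName=repository_name,
        branchName=branch_name)['branch']['commitId']
    return client.delete_file(
        repositoryName=repository_name,
        branchName=branch_name,
        filePath=file_path,
        parentCommitId=tip,
        commitMessage='Delete %s' % file_path)
```

If the branch advances between the two calls, the service raises ParentCommitIdOutdatedException, and the read-then-delete can simply be retried.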
"""
pass
def delete_pull_request_approval_rule(pullRequestId=None, approvalRuleName=None):
"""
Deletes an approval rule from a specified pull request. Approval rules can be deleted from a pull request only if the pull request is open, and if the approval rule was created specifically for a pull request and not generated from an approval rule template associated with the repository where the pull request was created. You cannot delete an approval rule from a merged or closed pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_pull_request_approval_rule(
pullRequestId='string',
approvalRuleName='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request that contains the approval rule you want to delete.\n
:type approvalRuleName: string
:param approvalRuleName: [REQUIRED]\nThe name of the approval rule you want to delete.\n
:rtype: dict
Returns
Response Syntax
{
'approvalRuleId': 'string'
}
Response Structure
(dict) --
approvalRuleId (string) --
The ID of the deleted approval rule.
Note
If the approval rule was deleted in an earlier API call, the response is 200 OK without content.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.CannotDeleteApprovalRuleFromTemplateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'approvalRuleId': 'string'
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.CannotDeleteApprovalRuleFromTemplateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def delete_repository(repositoryName=None):
"""
Deletes a repository. If a specified repository was already deleted, a null repository ID is returned.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_repository(
repositoryName='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository to delete.\n
:rtype: dict
Returns
Response Syntax
{
'repositoryId': 'string'
}
Response Structure
(dict) --Represents the output of a delete repository operation.
repositoryId (string) --The ID of the repository that was deleted.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'repositoryId': 'string'
}
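Since deleting an already-deleted repository returns a null repository ID, callers can interpret the response with a helper like this sketch:

```python
def deleted_repository_id(response):
    # None signals that the repository no longer existed at call time.
    return response.get('repositoryId')

# repo_id = deleted_repository_id(
#     client.delete_repository(repositoryName='MyDemoRepo'))
```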
"""
pass
def describe_merge_conflicts(repositoryName=None, destinationCommitSpecifier=None, sourceCommitSpecifier=None, mergeOption=None, maxMergeHunks=None, filePath=None, conflictDetailLevel=None, conflictResolutionStrategy=None, nextToken=None):
"""
Returns information about one or more merge conflicts in the attempted merge of two commit specifiers using the squash or three-way merge strategy. If the merge option for the attempted merge is specified as FAST_FORWARD_MERGE, an exception is thrown.
See also: AWS API Documentation
Exceptions
:example: response = client.describe_merge_conflicts(
repositoryName='string',
destinationCommitSpecifier='string',
sourceCommitSpecifier='string',
mergeOption='FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
maxMergeHunks=123,
filePath='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
nextToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to get information about a merge conflict.\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type mergeOption: string
:param mergeOption: [REQUIRED]\nThe merge option or strategy you want to use to merge the code.\n
:type maxMergeHunks: integer
:param maxMergeHunks: The maximum number of merge hunks to include in the output.
:type filePath: string
:param filePath: [REQUIRED]\nThe path of the target files used to describe the conflicts.\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:rtype: dict
ReturnsResponse Syntax
{
'conflictMetadata': {
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
'mergeHunks': [
{
'isConflict': True|False,
'source': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'destination': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'base': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
}
},
],
'nextToken': 'string',
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string'
}
Response Structure
(dict) --
conflictMetadata (dict) --
Contains metadata about the conflicts found in the merge.
filePath (string) --
The path of the file that contains conflicts.
fileSizes (dict) --
The file sizes of the file in the source, destination, and base of the merge.
source (integer) --
The size of a file in the source of a merge or pull request.
destination (integer) --
The size of a file in the destination of a merge or pull request.
base (integer) --
The size of a file in the base of a merge or pull request.
fileModes (dict) --
The file modes of the file in the source, destination, and base of the merge.
source (string) --
The file mode of a file in the source of a merge or pull request.
destination (string) --
The file mode of a file in the destination of a merge or pull request.
base (string) --
The file mode of a file in the base of a merge or pull request.
objectTypes (dict) --
Information about any object type conflicts in a merge operation.
source (string) --
The type of the object in the source branch.
destination (string) --
The type of the object in the destination branch.
base (string) --
The type of the object in the base commit of the merge.
numberOfConflicts (integer) --
The number of conflicts, including both hunk conflicts and metadata conflicts.
isBinaryFile (dict) --
A boolean value (true or false) indicating whether the file is binary or textual in the source, destination, and base of the merge.
source (boolean) --
The binary or non-binary status of a file in the source of a merge or pull request.
destination (boolean) --
The binary or non-binary status of a file in the destination of a merge or pull request.
base (boolean) --
The binary or non-binary status of a file in the base of a merge or pull request.
contentConflict (boolean) --
A boolean value indicating whether there are conflicts in the content of a file.
fileModeConflict (boolean) --
A boolean value indicating whether there are conflicts in the file mode of a file.
objectTypeConflict (boolean) --
A boolean value (true or false) indicating whether there are conflicts between the branches in the object type of a file, folder, or submodule.
mergeOperations (dict) --
Whether an add, modify, or delete operation caused the conflict between the source and destination of the merge.
source (string) --
The operation (add, modify, or delete) on a file in the source of a merge or pull request.
destination (string) --
The operation on a file in the destination of a merge or pull request.
mergeHunks (list) --
A list of merge hunks of the differences between the files or lines.
(dict) --
Information about merge hunks in a merge or pull request operation.
isConflict (boolean) --
A Boolean value indicating whether a combination of hunks contains a conflict. Conflicts occur when the same file or the same lines in a file were modified in both the source and destination of a merge or pull request. Valid values include true, false, and null. True when the hunk represents a conflict and one or more files contain a line conflict. File mode conflicts in a merge do not set this to true.
source (dict) --
Information about the merge hunk in the source of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
destination (dict) --
Information about the merge hunk in the destination of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
base (dict) --
Information about the merge hunk in the base of a merge or pull request.
startLine (integer) --
The start position of the hunk in the merge result.
endLine (integer) --
The end position of the hunk in the merge result.
hunkContent (string) --
The base-64 encoded content of the hunk merged region that might contain a conflict.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
destinationCommitId (string) --
The commit ID of the destination commit specifier that was used in the merge evaluation.
sourceCommitId (string) --
The commit ID of the source commit specifier that was used in the merge evaluation.
baseCommitId (string) --
The commit ID of the merge base.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxMergeHunksException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'conflictMetadata': {
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
'mergeHunks': [
{
'isConflict': True|False,
'source': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'destination': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
},
'base': {
'startLine': 123,
'endLine': 123,
'hunkContent': 'string'
}
},
],
'nextToken': 'string',
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxMergeHunksException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
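Because merge hunks for a single file can span multiple pages, a typical pattern is to loop on nextToken until it is absent. A hedged sketch (`client` is assumed to be a CodeCommit client; all names are illustrative):

```python
def collect_merge_hunks(client, repo, destination, source, file_path,
                        merge_option='THREE_WAY_MERGE'):
    # Page through describe_merge_conflicts, accumulating the merge
    # hunks for one file until no nextToken is returned.
    hunks, token = [], None
    while True:
        kwargs = dict(repositoryName=repo,
                      destinationCommitSpecifier=destination,
                      sourceCommitSpecifier=source,
                      mergeOption=merge_option,
                      filePath=file_path)
        if token:
            kwargs['nextToken'] = token
        page = client.describe_merge_conflicts(**kwargs)
        hunks.extend(page.get('mergeHunks', []))
        token = page.get('nextToken')
        if not token:
            return hunks
```

Note that passing mergeOption='FAST_FORWARD_MERGE' raises an exception, since fast-forward merges have no hunks to describe.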
def describe_pull_request_events(pullRequestId=None, pullRequestEventType=None, actorArn=None, nextToken=None, maxResults=None):
"""
Returns information about one or more pull request events.
See also: AWS API Documentation
Exceptions
:example: response = client.describe_pull_request_events(
pullRequestId='string',
pullRequestEventType='PULL_REQUEST_CREATED'|'PULL_REQUEST_STATUS_CHANGED'|'PULL_REQUEST_SOURCE_REFERENCE_UPDATED'|'PULL_REQUEST_MERGE_STATE_CHANGED'|'PULL_REQUEST_APPROVAL_RULE_CREATED'|'PULL_REQUEST_APPROVAL_RULE_UPDATED'|'PULL_REQUEST_APPROVAL_RULE_DELETED'|'PULL_REQUEST_APPROVAL_RULE_OVERRIDDEN'|'PULL_REQUEST_APPROVAL_STATE_CHANGED',
actorArn='string',
nextToken='string',
maxResults=123
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests .\n
:type pullRequestEventType: string
:param pullRequestEventType: Optional. The pull request event type about which you want to return information.
:type actorArn: string
:param actorArn: The Amazon Resource Name (ARN) of the user whose actions resulted in the event. Examples include updating the pull request with more commits or changing the status of a pull request.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results. The default is 100 events, which is also the maximum number of events that can be returned in a result.
:rtype: dict
ReturnsResponse Syntax
{
'pullRequestEvents': [
{
'pullRequestId': 'string',
'eventDate': datetime(2015, 1, 1),
'pullRequestEventType': 'PULL_REQUEST_CREATED'|'PULL_REQUEST_STATUS_CHANGED'|'PULL_REQUEST_SOURCE_REFERENCE_UPDATED'|'PULL_REQUEST_MERGE_STATE_CHANGED'|'PULL_REQUEST_APPROVAL_RULE_CREATED'|'PULL_REQUEST_APPROVAL_RULE_UPDATED'|'PULL_REQUEST_APPROVAL_RULE_DELETED'|'PULL_REQUEST_APPROVAL_RULE_OVERRIDDEN'|'PULL_REQUEST_APPROVAL_STATE_CHANGED',
'actorArn': 'string',
'pullRequestCreatedEventMetadata': {
'repositoryName': 'string',
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'mergeBase': 'string'
},
'pullRequestStatusChangedEventMetadata': {
'pullRequestStatus': 'OPEN'|'CLOSED'
},
'pullRequestSourceReferenceUpdatedEventMetadata': {
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'mergeBase': 'string'
},
'pullRequestMergedStateChangedEventMetadata': {
'repositoryName': 'string',
'destinationReference': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
'approvalRuleEventMetadata': {
'approvalRuleName': 'string',
'approvalRuleId': 'string',
'approvalRuleContent': 'string'
},
'approvalStateChangedEventMetadata': {
'revisionId': 'string',
'approvalStatus': 'APPROVE'|'REVOKE'
},
'approvalRuleOverriddenEventMetadata': {
'revisionId': 'string',
'overrideStatus': 'OVERRIDE'|'REVOKE'
}
},
],
'nextToken': 'string'
}
Response Structure
(dict) --
pullRequestEvents (list) --
Information about the pull request events.
(dict) --
Returns information about a pull request event.
pullRequestId (string) --
The system-generated ID of the pull request.
eventDate (datetime) --
The day and time of the pull request event, in timestamp format.
pullRequestEventType (string) --
The type of the pull request event (for example, a status change event (PULL_REQUEST_STATUS_CHANGED) or update event (PULL_REQUEST_SOURCE_REFERENCE_UPDATED)).
actorArn (string) --
The Amazon Resource Name (ARN) of the user whose actions resulted in the event. Examples include updating the pull request with more commits or changing the status of a pull request.
pullRequestCreatedEventMetadata (dict) --
Information about the source and destination branches for the pull request.
repositoryName (string) --
The name of the repository where the pull request was created.
sourceCommitId (string) --
The commit ID on the source branch used when the pull request was created.
destinationCommitId (string) --
The commit ID of the tip of the branch specified as the destination branch when the pull request was created.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
pullRequestStatusChangedEventMetadata (dict) --
Information about the change in status for the pull request event.
pullRequestStatus (string) --
The changed status of the pull request.
pullRequestSourceReferenceUpdatedEventMetadata (dict) --
Information about the updated source branch for the pull request event.
repositoryName (string) --
The name of the repository where the pull request was updated.
beforeCommitId (string) --
The full commit ID of the commit in the destination branch that was the tip of the branch at the time the pull request was updated.
afterCommitId (string) --
The full commit ID of the commit in the source branch that was the tip of the branch at the time the pull request was updated.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
pullRequestMergedStateChangedEventMetadata (dict) --
Information about the change in mergeability state for the pull request event.
repositoryName (string) --
The name of the repository where the pull request was created.
destinationReference (string) --
The name of the branch that the pull request is merged into.
mergeMetadata (dict) --
Information about the merge state change event.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
approvalRuleEventMetadata (dict) --
Information about a pull request event.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
approvalStateChangedEventMetadata (dict) --
Information about an approval state change for a pull request.
revisionId (string) --
The revision ID of the pull request when the approval state changed.
approvalStatus (string) --
The approval status for the pull request.
approvalRuleOverriddenEventMetadata (dict) --
Information about an approval rule override event for a pull request.
revisionId (string) --
The revision ID of the pull request when the override event occurred.
overrideStatus (string) --
The status of the override event.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestEventTypeException
CodeCommit.Client.exceptions.InvalidActorArnException
CodeCommit.Client.exceptions.ActorDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequestEvents': [
{
'pullRequestId': 'string',
'eventDate': datetime(2015, 1, 1),
'pullRequestEventType': 'PULL_REQUEST_CREATED'|'PULL_REQUEST_STATUS_CHANGED'|'PULL_REQUEST_SOURCE_REFERENCE_UPDATED'|'PULL_REQUEST_MERGE_STATE_CHANGED'|'PULL_REQUEST_APPROVAL_RULE_CREATED'|'PULL_REQUEST_APPROVAL_RULE_UPDATED'|'PULL_REQUEST_APPROVAL_RULE_DELETED'|'PULL_REQUEST_APPROVAL_RULE_OVERRIDDEN'|'PULL_REQUEST_APPROVAL_STATE_CHANGED',
'actorArn': 'string',
'pullRequestCreatedEventMetadata': {
'repositoryName': 'string',
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'mergeBase': 'string'
},
'pullRequestStatusChangedEventMetadata': {
'pullRequestStatus': 'OPEN'|'CLOSED'
},
'pullRequestSourceReferenceUpdatedEventMetadata': {
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'mergeBase': 'string'
},
'pullRequestMergedStateChangedEventMetadata': {
'repositoryName': 'string',
'destinationReference': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
'approvalRuleEventMetadata': {
'approvalRuleName': 'string',
'approvalRuleId': 'string',
'approvalRuleContent': 'string'
},
'approvalStateChangedEventMetadata': {
'revisionId': 'string',
'approvalStatus': 'APPROVE'|'REVOKE'
},
'approvalRuleOverriddenEventMetadata': {
'revisionId': 'string',
'overrideStatus': 'OVERRIDE'|'REVOKE'
}
},
],
'nextToken': 'string'
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestEventTypeException
CodeCommit.Client.exceptions.InvalidActorArnException
CodeCommit.Client.exceptions.ActorDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
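Results are capped at 100 events per page, so callers generally follow nextToken. A sketch under the same assumptions as above (`client` is a CodeCommit client; the pull request ID is illustrative):

```python
def list_pr_events(client, pull_request_id, event_type=None):
    # Collect every event for a pull request, optionally filtered by
    # pullRequestEventType, following nextToken pagination.
    events, token = [], None
    while True:
        kwargs = {'pullRequestId': pull_request_id}
        if event_type:
            kwargs['pullRequestEventType'] = event_type
        if token:
            kwargs['nextToken'] = token
        page = client.describe_pull_request_events(**kwargs)
        events.extend(page['pullRequestEvents'])
        token = page.get('nextToken')
        if not token:
            return events
```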
def disassociate_approval_rule_template_from_repository(approvalRuleTemplateName=None, repositoryName=None):
"""
Removes the association between a template and a repository so that approval rules based on the template are not automatically created when pull requests are created in the specified repository. This does not delete any approval rules previously created for pull requests through the template association.
See also: AWS API Documentation
Exceptions
:example: response = client.disassociate_approval_rule_template_from_repository(
approvalRuleTemplateName='string',
repositoryName='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template to disassociate from a specified repository.\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository you want to disassociate from the template.\n
:returns:
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
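Since boto3 clients expose modeled errors under client.exceptions, the call can be made idempotent by treating a missing template as already disassociated. A hedged sketch (names illustrative):

```python
def safe_disassociate(client, template_name, repo_name):
    # Returns True when the association was removed, False when the
    # template no longer exists (treated as already disassociated).
    try:
        client.disassociate_approval_rule_template_from_repository(
            approvalRuleTemplateName=template_name,
            repositoryName=repo_name)
        return True
    except client.exceptions.ApprovalRuleTemplateDoesNotExistException:
        return False
```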
def evaluate_pull_request_approval_rules(pullRequestId=None, revisionId=None):
"""
Evaluates whether a pull request has met all the conditions specified in its associated approval rules.
See also: AWS API Documentation
Exceptions
:example: response = client.evaluate_pull_request_approval_rules(
pullRequestId='string',
revisionId='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request you want to evaluate.\n
:type revisionId: string
:param revisionId: [REQUIRED]\nThe system-generated ID for the pull request revision. To retrieve the most recent revision ID for a pull request, use GetPullRequest .\n
:rtype: dict
ReturnsResponse Syntax
{
'evaluation': {
'approved': True|False,
'overridden': True|False,
'approvalRulesSatisfied': [
'string',
],
'approvalRulesNotSatisfied': [
'string',
]
}
}
Response Structure
(dict) --
evaluation (dict) --
The result of the evaluation, including the names of the rules whose conditions have been met (if any), the names of the rules whose conditions have not been met (if any), whether the pull request is in the approved state, and whether the pull request approval rule has been set aside by an override.
approved (boolean) --
Whether the state of the pull request is approved.
overridden (boolean) --
Whether the approval rule requirements for the pull request have been overridden and no longer need to be met.
approvalRulesSatisfied (list) --
The names of the approval rules that have had their conditions met.
(string) --
approvalRulesNotSatisfied (list) --
The names of the approval rules that have not had their conditions met.
(string) --
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.RevisionNotCurrentException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'evaluation': {
'approved': True|False,
'overridden': True|False,
'approvalRulesSatisfied': [
'string',
],
'approvalRulesNotSatisfied': [
'string',
]
}
}
:returns:
(string) --
"""
pass
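A pull request's rule requirements are met when the evaluation reports it approved, or when the rules were set aside by an override. A minimal sketch (`client` is assumed to be a CodeCommit client):

```python
def pr_ready_to_merge(client, pull_request_id, revision_id):
    # True when the revision is approved, or when its approval rule
    # requirements have been overridden.
    evaluation = client.evaluate_pull_request_approval_rules(
        pullRequestId=pull_request_id,
        revisionId=revision_id)['evaluation']
    return evaluation['approved'] or evaluation['overridden']
```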
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to\nClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid\nfor. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By\ndefault, the http method is whatever is used in the method\'s model.
"""
pass
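This is the generic boto3 presigning helper available on every client. As an illustration only (whether a presigned GetBlob URL is useful depends on your setup; the 900-second expiry is arbitrary):

```python
def presigned_get_blob_url(client, repo, blob_id, expires=900):
    # Sign a get_blob call so it can be issued later without
    # credentials; boto3's default expiry is 3600 seconds.
    return client.generate_presigned_url(
        ClientMethod='get_blob',
        Params={'repositoryName': repo, 'blobId': blob_id},
        ExpiresIn=expires)
```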
def get_approval_rule_template(approvalRuleTemplateName=None):
"""
Returns information about a specified approval rule template.
See also: AWS API Documentation
Exceptions
:example: response = client.get_approval_rule_template(
approvalRuleTemplateName='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template for which you want to get information.\n
:rtype: dict
ReturnsResponse Syntax{
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
Response Structure
(dict) --
approvalRuleTemplate (dict) --The content and structure of the approval rule template.
approvalRuleTemplateId (string) --The system-generated ID of the approval rule template.
approvalRuleTemplateName (string) --The name of the approval rule template.
approvalRuleTemplateDescription (string) --The description of the approval rule template.
approvalRuleTemplateContent (string) --The content of the approval rule template.
ruleContentSha256 (string) --The SHA-256 hash signature for the content of the approval rule template.
lastModifiedDate (datetime) --The date the approval rule template was most recently changed, in timestamp format.
creationDate (datetime) --The date the approval rule template was created, in timestamp format.
lastModifiedUser (string) --The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule template.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
:return: {
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
"""
pass
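A small sketch that pulls the fields most callers want from the response (template name and content; all names here are illustrative):

```python
def template_content(client, name):
    # Fetch an approval rule template and return its (name, content).
    tpl = client.get_approval_rule_template(
        approvalRuleTemplateName=name)['approvalRuleTemplate']
    return (tpl['approvalRuleTemplateName'],
            tpl['approvalRuleTemplateContent'])
```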
def get_blob(repositoryName=None, blobId=None):
"""
Returns the base-64 encoded content of an individual blob in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_blob(
repositoryName='string',
blobId='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the blob.\n
:type blobId: string
:param blobId: [REQUIRED]\nThe ID of the blob, which is its SHA-1 pointer.\n
:rtype: dict
ReturnsResponse Syntax
{
'content': b'bytes'
}
Response Structure
(dict) --
Represents the output of a get blob operation.
content (bytes) --
The content of the blob, usually a file.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.BlobIdRequiredException
CodeCommit.Client.exceptions.InvalidBlobIdException
CodeCommit.Client.exceptions.BlobIdDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.FileTooLargeException
:return: {
'content': b'bytes'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.BlobIdRequiredException
CodeCommit.Client.exceptions.InvalidBlobIdException
CodeCommit.Client.exceptions.BlobIdDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.FileTooLargeException
"""
pass
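boto3 delivers the blob payload as raw bytes in the content field, so text files just need a decode. A hedged sketch (the encoding assumption is yours to verify per file):

```python
def blob_text(client, repo, blob_id, encoding='utf-8'):
    # Fetch a blob by its SHA-1 ID and decode it as text.
    content = client.get_blob(repositoryName=repo,
                              blobId=blob_id)['content']
    return content.decode(encoding)
```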
def get_branch(repositoryName=None, branchName=None):
"""
Returns information about a repository branch, including its name and the last commit ID.
See also: AWS API Documentation
Exceptions
:example: response = client.get_branch(
repositoryName='string',
branchName='string'
)
:type repositoryName: string
:param repositoryName: The name of the repository that contains the branch for which you want to retrieve information.
:type branchName: string
:param branchName: The name of the branch for which you want to retrieve information.
:rtype: dict
ReturnsResponse Syntax
{
'branch': {
'branchName': 'string',
'commitId': 'string'
}
}
Response Structure
(dict) --
Represents the output of a get branch operation.
branch (dict) --
Information about the branch, including its name and the ID of the last commit.
branchName (string) --
The name of the branch.
commitId (string) --
The ID of the last commit made to the branch.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'branch': {
'branchName': 'string',
'commitId': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
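A minimal sketch of reading the branch head from a get_branch response. StubClient below is a hypothetical stand-in for a real boto3.client('codecommit') and returns a canned response shaped like the documented response syntax.

```python
# Sketch: extract the head commit ID from a get_branch-style response.
# StubClient is a hypothetical stand-in for boto3.client('codecommit').

class StubClient:
    def get_branch(self, repositoryName, branchName):
        # Canned response matching the documented response syntax.
        return {'branch': {'branchName': branchName, 'commitId': 'a1b2c3d'}}

def head_commit(client, repository, branch):
    """Return the ID of the last commit made to the given branch."""
    response = client.get_branch(repositoryName=repository, branchName=branch)
    return response['branch']['commitId']

print(head_commit(StubClient(), 'MyDemoRepo', 'main'))  # a1b2c3d
```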
def get_comment(commentId=None):
"""
Returns the content of a comment made on a change, file, or commit in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_comment(
commentId='string'
)
:type commentId: string
:param commentId: [REQUIRED]\nThe unique, system-generated ID of the comment. To get this ID, use GetCommentsForComparedCommit or GetCommentsForPullRequest .\n
:rtype: dict
ReturnsResponse Syntax{
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
comment (dict) --The contents of the comment.
commentId (string) --The system-generated comment ID.
content (string) --The content of the comment.
inReplyTo (string) --The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
CodeCommit.Client.exceptions.CommentDeletedException
:return: {
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
"""
pass
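Because each comment dict carries an inReplyTo field, a set of comments retrieved with get_comment (or the batch variants) can be grouped into threads client-side. The comment dicts below are hand-built illustrations, not real API output.

```python
# Sketch: group comment dicts into top-level comments and their direct
# replies, keyed on the inReplyTo field from the documented response.

def thread_comments(comments):
    """Map each top-level comment ID to the IDs of its direct replies."""
    threads = {c['commentId']: [] for c in comments if not c.get('inReplyTo')}
    for c in comments:
        parent = c.get('inReplyTo')
        if parent in threads:
            threads[parent].append(c['commentId'])
    return threads

comments = [
    {'commentId': 'c1', 'content': 'Looks good', 'inReplyTo': ''},
    {'commentId': 'c2', 'content': 'Agreed', 'inReplyTo': 'c1'},
]
print(thread_comments(comments))  # {'c1': ['c2']}
```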
def get_comments_for_compared_commit(repositoryName=None, beforeCommitId=None, afterCommitId=None, nextToken=None, maxResults=None):
"""
Returns information about comments made on the comparison between two commits.
See also: AWS API Documentation
Exceptions
:example: response = client.get_comments_for_compared_commit(
repositoryName='string',
beforeCommitId='string',
afterCommitId='string',
nextToken='string',
maxResults=123
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to compare commits.\n
:type beforeCommitId: string
:param beforeCommitId: To establish the directionality of the comparison, the full commit ID of the before commit.
:type afterCommitId: string
:param afterCommitId: [REQUIRED]\nTo establish the directionality of the comparison, the full commit ID of the after commit.\n
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results. The default is 100 comments, but you can configure up to 500.
:rtype: dict
ReturnsResponse Syntax
{
'commentsForComparedCommitData': [
{
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comments': [
{
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
},
]
},
],
'nextToken': 'string'
}
Response Structure
(dict) --
commentsForComparedCommitData (list) --
A list of comment objects on the compared commit.
(dict) --
Returns information about comments on the comparison between two commits.
repositoryName (string) --
The name of the repository that contains the compared commits.
beforeCommitId (string) --
The full commit ID of the commit used to establish the before of the comparison.
afterCommitId (string) --
The full commit ID of the commit used to establish the after of the comparison.
beforeBlobId (string) --
The full blob ID of the commit used to establish the before of the comparison.
afterBlobId (string) --
The full blob ID of the commit used to establish the after of the comparison.
location (dict) --
Location information about the comment on the comparison, including the file name, line number, and whether the version of the file where the comment was made is BEFORE or AFTER.
filePath (string) --
The name of the file being compared, including its extension and subdirectory, if any.
filePosition (integer) --
The position of a change in a compared file, in line number format.
relativeFileVersion (string) --
In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.
comments (list) --
An array of comment objects. Each comment object contains information about a comment on the comparison between commits.
(dict) --
Returns information about a specific comment.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commentsForComparedCommitData': [
{
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comments': [
{
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
},
]
},
],
'nextToken': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
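Since results arrive in batches bounded by maxResults, a caller typically loops on nextToken until it is absent. PagedStub below is a hypothetical stand-in serving two canned pages; a real client would be boto3.client('codecommit').

```python
# Sketch: drain all pages of get_comments_for_compared_commit by
# following nextToken. PagedStub is a hypothetical two-page stub client.

class PagedStub:
    _pages = {
        None: {'commentsForComparedCommitData': [{'repositoryName': 'r'}],
               'nextToken': 'page2'},
        'page2': {'commentsForComparedCommitData': [{'repositoryName': 'r'}]},
    }
    def get_comments_for_compared_commit(self, nextToken=None, **kwargs):
        return self._pages[nextToken]

def all_compared_commit_comments(client, **params):
    """Yield every comment-data item across all result pages."""
    token = None
    while True:
        if token is not None:
            params['nextToken'] = token
        page = client.get_comments_for_compared_commit(**params)
        yield from page['commentsForComparedCommitData']
        token = page.get('nextToken')
        if not token:
            break

items = list(all_compared_commit_comments(
    PagedStub(), repositoryName='r', afterCommitId='abc'))
print(len(items))  # 2
```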
def get_comments_for_pull_request(pullRequestId=None, repositoryName=None, beforeCommitId=None, afterCommitId=None, nextToken=None, maxResults=None):
"""
Returns comments made on a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.get_comments_for_pull_request(
pullRequestId='string',
repositoryName='string',
beforeCommitId='string',
afterCommitId='string',
nextToken='string',
maxResults=123
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests .\n
:type repositoryName: string
:param repositoryName: The name of the repository that contains the pull request.
:type beforeCommitId: string
:param beforeCommitId: The full commit ID of the commit in the destination branch that was the tip of the branch at the time the pull request was created.
:type afterCommitId: string
:param afterCommitId: The full commit ID of the commit in the source branch that was the tip of the branch at the time the comment was made.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results. The default is 100 comments. You can return up to 500 comments with a single request.
:rtype: dict
ReturnsResponse Syntax
{
'commentsForPullRequestData': [
{
'pullRequestId': 'string',
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comments': [
{
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
},
]
},
],
'nextToken': 'string'
}
Response Structure
(dict) --
commentsForPullRequestData (list) --
An array of comment objects on the pull request.
(dict) --
Returns information about comments on a pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
repositoryName (string) --
The name of the repository that contains the pull request.
beforeCommitId (string) --
The full commit ID of the commit that was the tip of the destination branch when the pull request was created. This commit is superseded by the after commit in the source branch when and if you merge the source branch into the destination branch.
afterCommitId (string) --
The full commit ID of the commit that was the tip of the source branch at the time the comment was made.
beforeBlobId (string) --
The full blob ID of the file on which you want to comment on the destination commit.
afterBlobId (string) --
The full blob ID of the file on which you want to comment on the source commit.
location (dict) --
Location information about the comment on the pull request, including the file name, line number, and whether the version of the file where the comment was made is BEFORE (destination branch) or AFTER (source branch).
filePath (string) --
The name of the file being compared, including its extension and subdirectory, if any.
filePosition (integer) --
The position of a change in a compared file, in line number format.
relativeFileVersion (string) --
In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.
comments (list) --
An array of comment objects. Each comment object contains information about a comment on the pull request.
(dict) --
Returns information about a specific comment.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
Exceptions
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commentsForPullRequestData': [
{
'pullRequestId': 'string',
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comments': [
{
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
},
]
},
],
'nextToken': 'string'
}
:returns:
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def get_commit(repositoryName=None, commitId=None):
"""
Returns information about a commit, including commit message and committer information.
See also: AWS API Documentation
Exceptions
:example: response = client.get_commit(
repositoryName='string',
commitId='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository to which the commit was made.\n
:type commitId: string
:param commitId: [REQUIRED]\nThe commit ID. Commit IDs are the full SHA ID of the commit.\n
:rtype: dict
ReturnsResponse Syntax
{
'commit': {
'commitId': 'string',
'treeId': 'string',
'parents': [
'string',
],
'message': 'string',
'author': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'committer': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'additionalData': 'string'
}
}
Response Structure
(dict) --
Represents the output of a get commit operation.
commit (dict) --
A commit data type object that contains information about the specified commit.
commitId (string) --
The full SHA ID of the specified commit.
treeId (string) --
Tree information for the specified commit.
parents (list) --
A list of parent commits for the specified commit. Each parent commit ID is the full commit ID.
(string) --
message (string) --
The commit message associated with the specified commit.
author (dict) --
Information about the author of the specified commit. Information includes the date in timestamp format with GMT offset, the name of the author, and the email address for the author, as configured in Git.
name (string) --
The name of the user who made the specified commit.
email (string) --
The email address associated with the user who made the commit, if any.
date (string) --
The date when the specified commit was committed, in timestamp format with GMT offset.
committer (dict) --
Information about the person who committed the specified commit, also known as the committer. Information includes the date in timestamp format with GMT offset, the name of the committer, and the email address for the committer, as configured in Git.
For more information about the difference between an author and a committer in Git, see Viewing the Commit History in Pro Git by Scott Chacon and Ben Straub.
name (string) --
The name of the user who made the specified commit.
email (string) --
The email address associated with the user who made the commit, if any.
date (string) --
The date when the specified commit was committed, in timestamp format with GMT offset.
additionalData (string) --
Any other data associated with the specified commit.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitIdDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commit': {
'commitId': 'string',
'treeId': 'string',
'parents': [
'string',
],
'message': 'string',
'author': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'committer': {
'name': 'string',
'email': 'string',
'date': 'string'
},
'additionalData': 'string'
}
}
:returns:
(string) --
"""
pass
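The author and committer date fields are strings in timestamp format with a GMT offset (for example, '1484167798 -0800'). The helper below assumes that exact 'epoch-seconds offset' layout, which is an assumption for illustration, and converts it into a timezone-aware datetime.

```python
# Sketch: parse a commit 'date' string of the assumed form
# '<epoch_seconds> <+|->HHMM' into an aware datetime.

from datetime import datetime, timedelta, timezone

def parse_commit_date(date_str):
    """Convert 'epoch_seconds +HHMM' into a timezone-aware datetime."""
    epoch, offset = date_str.split()
    sign = -1 if offset.startswith('-') else 1
    hours, minutes = int(offset[1:3]), int(offset[3:5])
    tz = timezone(sign * timedelta(hours=hours, minutes=minutes))
    return datetime.fromtimestamp(int(epoch), tz)

d = parse_commit_date('1484167798 -0800')
print(d.year)  # 2017
```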
def get_differences(repositoryName=None, beforeCommitSpecifier=None, afterCommitSpecifier=None, beforePath=None, afterPath=None, MaxResults=None, NextToken=None):
"""
Returns information about the differences in a valid commit specifier (such as a branch, tag, HEAD, commit ID, or other fully qualified reference). Results can be limited to a specified path.
See also: AWS API Documentation
Exceptions
:example: response = client.get_differences(
repositoryName='string',
beforeCommitSpecifier='string',
afterCommitSpecifier='string',
beforePath='string',
afterPath='string',
MaxResults=123,
NextToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to get differences.\n
:type beforeCommitSpecifier: string
:param beforeCommitSpecifier: The branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, the full commit ID). Optional. If not specified, all changes before the afterCommitSpecifier value are shown. If you do not use beforeCommitSpecifier in your request, consider limiting the results with maxResults .
:type afterCommitSpecifier: string
:param afterCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit.\n
:type beforePath: string
:param beforePath: The file path in which to check for differences. Limits the results to this path. Can also be used to specify the previous name of a directory or folder. If beforePath and afterPath are not specified, differences are shown for all paths.
:type afterPath: string
:param afterPath: The file path in which to check differences. Limits the results to this path. Can also be used to specify the changed name of a directory or folder, if it has changed. If not specified, differences are shown for all paths.
:type MaxResults: integer
:param MaxResults: A non-zero, non-negative integer used to limit the number of returned results.
:type NextToken: string
:param NextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:rtype: dict
ReturnsResponse Syntax
{
'differences': [
{
'beforeBlob': {
'blobId': 'string',
'path': 'string',
'mode': 'string'
},
'afterBlob': {
'blobId': 'string',
'path': 'string',
'mode': 'string'
},
'changeType': 'A'|'M'|'D'
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
differences (list) --
A data type object that contains information about the differences, including whether the difference is added, modified, or deleted (A, M, D).
(dict) --
Returns information about a set of differences for a commit specifier.
beforeBlob (dict) --
Information about a beforeBlob data type object, including the ID, the file mode permission code, and the path.
blobId (string) --
The full ID of the blob.
path (string) --
The path to the blob and associated file name, if any.
mode (string) --
The file mode permissions of the blob. File mode permission codes include:
100644 indicates read/write
100755 indicates read/write/execute
160000 indicates a submodule
120000 indicates a symlink
afterBlob (dict) --
Information about an afterBlob data type object, including the ID, the file mode permission code, and the path.
blobId (string) --
The full ID of the blob.
path (string) --
The path to the blob and associated file name, if any.
mode (string) --
The file mode permissions of the blob. File mode permission codes include:
100644 indicates read/write
100755 indicates read/write/execute
160000 indicates a submodule
120000 indicates a symlink
changeType (string) --
Whether the change type of the difference is an addition (A), deletion (D), or modification (M).
NextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.PathDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'differences': [
{
'beforeBlob': {
'blobId': 'string',
'path': 'string',
'mode': 'string'
},
'afterBlob': {
'blobId': 'string',
'path': 'string',
'mode': 'string'
},
'changeType': 'A'|'M'|'D'
},
],
'NextToken': 'string'
}
:returns:
100644 indicates read/write
100755 indicates read/write/execute
160000 indicates a submodule
120000 indicates a symlink
"""
pass
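The changeType field makes it straightforward to summarize a diff client-side: 'A' for added, 'D' for deleted, 'M' for modified. The difference dicts below are hand-built to match the documented response syntax, not real API output.

```python
# Sketch: tally get_differences results by changeType
# ('A' added, 'D' deleted, 'M' modified).

from collections import Counter

def summarize_differences(differences):
    """Count the differences per change type."""
    return Counter(d['changeType'] for d in differences)

differences = [
    {'afterBlob': {'path': 'README.md'}, 'changeType': 'A'},
    {'beforeBlob': {'path': 'old.py'}, 'afterBlob': {'path': 'old.py'},
     'changeType': 'M'},
]
print(dict(summarize_differences(differences)))  # {'A': 1, 'M': 1}
```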
def get_file(repositoryName=None, commitSpecifier=None, filePath=None):
"""
Returns the base-64 encoded contents of a specified file and its metadata.
See also: AWS API Documentation
Exceptions
:example: response = client.get_file(
repositoryName='string',
commitSpecifier='string',
filePath='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the file.\n
:type commitSpecifier: string
:param commitSpecifier: The fully qualified reference that identifies the commit that contains the file. For example, you can specify a full commit ID, a tag, a branch name, or a reference such as refs/heads/master. If none is provided, the head commit is used.
:type filePath: string
:param filePath: [REQUIRED]\nThe fully qualified path to the file, including the full name and extension of the file. For example, /examples/file.md is the fully qualified path to a file named file.md in a folder named examples.\n
:rtype: dict
ReturnsResponse Syntax
{
'commitId': 'string',
'blobId': 'string',
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'fileSize': 123,
'fileContent': b'bytes'
}
Response Structure
(dict) --
commitId (string) --
The full commit ID of the commit that contains the content returned by GetFile.
blobId (string) --
The blob ID of the object that represents the file content.
filePath (string) --
The fully qualified path to the specified file. Returns the name and extension of the file.
fileMode (string) --
The extrapolated file mode permissions of the blob. Valid values are strings such as EXECUTABLE, not numeric file mode values.
Note
The file mode permissions returned by this API are not the standard file mode permission values, such as 100644, but rather extrapolated values. See the supported return values.
fileSize (integer) --
The size of the contents of the file, in bytes.
fileContent (bytes) --
The base-64 encoded binary data object that represents the content of the file.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.FileTooLargeException
:return: {
'commitId': 'string',
'blobId': 'string',
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'fileSize': 123,
'fileContent': b'bytes'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.FileTooLargeException
"""
pass
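One common follow-up to get_file is materializing the response on disk. The sketch below writes the fileContent bytes and translates the extrapolated fileMode string into POSIX permission bits; the mode-to-bits mapping is an assumption for illustration, not part of the API contract.

```python
# Sketch: write a get_file-style response to disk and apply permission
# bits derived from fileMode. _MODE_BITS is an assumed mapping.

import os
import tempfile

_MODE_BITS = {'NORMAL': 0o644, 'EXECUTABLE': 0o755}

def write_file_response(response, dest_dir):
    """Write fileContent bytes into dest_dir and set permissions."""
    name = os.path.basename(response['filePath'])
    path = os.path.join(dest_dir, name)
    with open(path, 'wb') as f:
        f.write(response['fileContent'])
    os.chmod(path, _MODE_BITS.get(response['fileMode'], 0o644))
    return path

response = {'filePath': '/examples/file.md', 'fileMode': 'NORMAL',
            'fileContent': b'# Hello\n'}
out = write_file_response(response, tempfile.mkdtemp())
print(open(out, 'rb').read())  # b'# Hello\n'
```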
def get_folder(repositoryName=None, commitSpecifier=None, folderPath=None):
"""
Returns the contents of a specified folder in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_folder(
repositoryName='string',
commitSpecifier='string',
folderPath='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository.\n
:type commitSpecifier: string
:param commitSpecifier: A fully qualified reference used to identify a commit that contains the version of the folder\'s content to return. A fully qualified reference can be a commit ID, branch name, tag, or reference such as HEAD. If no specifier is provided, the folder content is returned as it exists in the HEAD commit.
:type folderPath: string
:param folderPath: [REQUIRED]\nThe fully qualified path to the folder whose contents are returned, including the folder name. For example, /examples is the fully qualified path to a folder named examples created directly under the root directory (/) of a repository.\n
:rtype: dict
ReturnsResponse Syntax
{
'commitId': 'string',
'folderPath': 'string',
'treeId': 'string',
'subFolders': [
{
'treeId': 'string',
'absolutePath': 'string',
'relativePath': 'string'
},
],
'files': [
{
'blobId': 'string',
'absolutePath': 'string',
'relativePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'symbolicLinks': [
{
'blobId': 'string',
'absolutePath': 'string',
'relativePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'subModules': [
{
'commitId': 'string',
'absolutePath': 'string',
'relativePath': 'string'
},
]
}
Response Structure
(dict) --
commitId (string) --
The full commit ID used as a reference for the returned version of the folder content.
folderPath (string) --
The fully qualified path of the folder whose contents are returned.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains the folder.
subFolders (list) --
The list of folders that exist under the specified folder, if any.
(dict) --
Returns information about a folder in a repository.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains the folder.
absolutePath (string) --
The fully qualified path of the folder in the repository.
relativePath (string) --
The relative path of the specified folder from the folder where the query originated.
files (list) --
The list of files in the specified folder, if any.
(dict) --
Returns information about a file in a repository.
blobId (string) --
The blob ID that contains the file information.
absolutePath (string) --
The fully qualified path to the file in the repository.
relativePath (string) --
The relative path of the file from the folder where the query originated.
fileMode (string) --
The extrapolated file mode permissions for the file. Valid values include EXECUTABLE, NORMAL, and SYMLINK.
symbolicLinks (list) --
The list of symbolic links to other files and folders in the specified folder, if any.
(dict) --
Returns information about a symbolic link in a repository folder.
blobId (string) --
The blob ID that contains the information about the symbolic link.
absolutePath (string) --
The fully qualified path to the folder that contains the symbolic link.
relativePath (string) --
The relative path of the symbolic link from the folder where the query originated.
fileMode (string) --
The file mode permissions of the blob that contains information about the symbolic link.
subModules (list) --
The list of submodules in the specified folder, if any.
(dict) --
Returns information about a submodule reference in a repository folder.
commitId (string) --
The commit ID that contains the reference to the submodule.
absolutePath (string) --
The fully qualified path to the folder that contains the reference to the submodule.
relativePath (string) --
The relative path of the submodule from the folder where the query originated.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FolderDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'folderPath': 'string',
'treeId': 'string',
'subFolders': [
{
'treeId': 'string',
'absolutePath': 'string',
'relativePath': 'string'
},
],
'files': [
{
'blobId': 'string',
'absolutePath': 'string',
'relativePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'symbolicLinks': [
{
'blobId': 'string',
'absolutePath': 'string',
'relativePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'subModules': [
{
'commitId': 'string',
'absolutePath': 'string',
'relativePath': 'string'
},
]
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FolderDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
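Because subFolders lists only the immediate children of the queried folder, retrieving a full tree requires recursion. The sketch below assumes a client object exposing get_folder with the keyword arguments documented above; a real boto3.client('codecommit') satisfies that interface, but the example works against any stand-in with the same response shape.

```python
def walk_repository(client, repository_name, commit_specifier="HEAD", folder_path="/"):
    """Recursively collect the absolute paths of all files under folder_path."""
    response = client.get_folder(
        repositoryName=repository_name,
        commitSpecifier=commit_specifier,
        folderPath=folder_path,
    )
    # Files directly in this folder, in the order the service returned them.
    files = [f["absolutePath"] for f in response.get("files", [])]
    # Descend into each subfolder; absolutePath is itself a valid folderPath.
    for sub in response.get("subFolders", []):
        files.extend(
            walk_repository(client, repository_name, commit_specifier, sub["absolutePath"])
        )
    return files
```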
def get_merge_commit(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, conflictDetailLevel=None, conflictResolutionStrategy=None):
"""
Returns information about a specified merge commit.
See also: AWS API Documentation
Exceptions
:example: response = client.get_merge_commit(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the merge commit about which you want to get information.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:rtype: dict
Returns: Response Syntax
{
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'baseCommitId': 'string',
'mergedCommitId': 'string'
}
Response Structure
(dict) --
sourceCommitId (string) --
The commit ID of the source commit specifier that was used in the merge evaluation.
destinationCommitId (string) --
The commit ID of the destination commit specifier that was used in the merge evaluation.
baseCommitId (string) --
The commit ID of the merge base.
mergedCommitId (string) --
The commit ID for the merge commit created when the source branch was merged into the destination branch. If the fast-forward merge strategy was used, there is no merge commit.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'baseCommitId': 'string',
'mergedCommitId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
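As noted above, a fast-forward merge produces no merge commit, so mergedCommitId may be absent from the response. A minimal sketch of handling both cases, assuming `response` is the dict returned by get_merge_commit:

```python
def describe_merge(response):
    """Summarize a get_merge_commit response, tolerating a missing mergedCommitId."""
    merged = response.get("mergedCommitId")
    if merged:
        return "merge commit {} (base {})".format(merged, response["baseCommitId"])
    # Fast-forward strategy: the destination tip simply moved, no commit created.
    return "fast-forward merge: no merge commit was created"
```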
def get_merge_conflicts(repositoryName=None, destinationCommitSpecifier=None, sourceCommitSpecifier=None, mergeOption=None, conflictDetailLevel=None, maxConflictFiles=None, conflictResolutionStrategy=None, nextToken=None):
"""
Returns information about merge conflicts between the before and after commit IDs for a pull request in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_merge_conflicts(
repositoryName='string',
destinationCommitSpecifier='string',
sourceCommitSpecifier='string',
mergeOption='FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
maxConflictFiles=123,
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
nextToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where the pull request was created.\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type mergeOption: string
:param mergeOption: [REQUIRED]\nThe merge option or strategy you want to use to merge the code.\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type maxConflictFiles: integer
:param maxConflictFiles: The maximum number of files to include in the output.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:rtype: dict
Returns: Response Syntax
{
'mergeable': True|False,
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string',
'conflictMetadataList': [
{
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
],
'nextToken': 'string'
}
Response Structure
(dict) --
mergeable (boolean) --
A Boolean value that indicates whether the code is mergeable by the specified merge option.
destinationCommitId (string) --
The commit ID of the destination commit specifier that was used in the merge evaluation.
sourceCommitId (string) --
The commit ID of the source commit specifier that was used in the merge evaluation.
baseCommitId (string) --
The commit ID of the merge base.
conflictMetadataList (list) --
A list of metadata for any conflicting files. If the specified merge strategy is FAST_FORWARD_MERGE, this list is always empty.
(dict) --
Information about the metadata for a conflict in a merge operation.
filePath (string) --
The path of the file that contains conflicts.
fileSizes (dict) --
The file sizes of the file in the source, destination, and base of the merge.
source (integer) --
The size of a file in the source of a merge or pull request.
destination (integer) --
The size of a file in the destination of a merge or pull request.
base (integer) --
The size of a file in the base of a merge or pull request.
fileModes (dict) --
The file modes of the file in the source, destination, and base of the merge.
source (string) --
The file mode of a file in the source of a merge or pull request.
destination (string) --
The file mode of a file in the destination of a merge or pull request.
base (string) --
The file mode of a file in the base of a merge or pull request.
objectTypes (dict) --
Information about any object type conflicts in a merge operation.
source (string) --
The type of the object in the source branch.
destination (string) --
The type of the object in the destination branch.
base (string) --
The type of the object in the base commit of the merge.
numberOfConflicts (integer) --
The number of conflicts, including both hunk conflicts and metadata conflicts.
isBinaryFile (dict) --
A boolean value (true or false) indicating whether the file is binary or textual in the source, destination, and base of the merge.
source (boolean) --
The binary or non-binary status of a file in the source of a merge or pull request.
destination (boolean) --
The binary or non-binary status of a file in the destination of a merge or pull request.
base (boolean) --
The binary or non-binary status of a file in the base of a merge or pull request.
contentConflict (boolean) --
A boolean value indicating whether there are conflicts in the content of a file.
fileModeConflict (boolean) --
A boolean value indicating whether there are conflicts in the file mode of a file.
objectTypeConflict (boolean) --
A boolean value (true or false) indicating whether there are conflicts between the branches in the object type of a file, folder, or submodule.
mergeOperations (dict) --
Whether an add, modify, or delete operation caused the conflict between the source and destination of the merge.
source (string) --
The operation (add, modify, or delete) on a file in the source of a merge or pull request.
destination (string) --
The operation on a file in the destination of a merge or pull request.
nextToken (string) --
An enumeration token that can be used in a request to return the next batch of the results.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.InvalidMaxConflictFilesException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidDestinationCommitSpecifierException
CodeCommit.Client.exceptions.InvalidSourceCommitSpecifierException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'mergeable': True|False,
'destinationCommitId': 'string',
'sourceCommitId': 'string',
'baseCommitId': 'string',
'conflictMetadataList': [
{
'filePath': 'string',
'fileSizes': {
'source': 123,
'destination': 123,
'base': 123
},
'fileModes': {
'source': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'destination': 'EXECUTABLE'|'NORMAL'|'SYMLINK',
'base': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
'objectTypes': {
'source': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'destination': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK',
'base': 'FILE'|'DIRECTORY'|'GIT_LINK'|'SYMBOLIC_LINK'
},
'numberOfConflicts': 123,
'isBinaryFile': {
'source': True|False,
'destination': True|False,
'base': True|False
},
'contentConflict': True|False,
'fileModeConflict': True|False,
'objectTypeConflict': True|False,
'mergeOperations': {
'source': 'A'|'M'|'D',
'destination': 'A'|'M'|'D'
}
},
],
'nextToken': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.MergeOptionRequiredException
CodeCommit.Client.exceptions.InvalidMergeOptionException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.InvalidMaxConflictFilesException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidDestinationCommitSpecifierException
CodeCommit.Client.exceptions.InvalidSourceCommitSpecifierException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
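Results are paged via nextToken, so collecting every conflicting file means looping until no token is returned. The sketch below drains the pages and keeps only files flagged with a content conflict; the client is a stand-in exposing get_merge_conflicts with the documented keyword arguments and response shape.

```python
def content_conflict_paths(client, **params):
    """Return filePaths with contentConflict=True across all result pages."""
    paths, token = [], None
    while True:
        kwargs = dict(params)
        if token:
            kwargs["nextToken"] = token  # resume from the previous page
        response = client.get_merge_conflicts(**kwargs)
        for meta in response.get("conflictMetadataList", []):
            if meta.get("contentConflict"):
                paths.append(meta["filePath"])
        token = response.get("nextToken")
        if not token:  # no token means this was the last page
            return paths
```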
def get_merge_options(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, conflictDetailLevel=None, conflictResolutionStrategy=None):
"""
Returns information about the merge options available for merging two specified branches. For details about why a merge option is not available, use GetMergeConflicts or DescribeMergeConflicts.
See also: AWS API Documentation
Exceptions
:example: response = client.get_merge_options(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the commits about which you want to get merge options.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:rtype: dict
Returns: Response Syntax
{
'mergeOptions': [
'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
],
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'baseCommitId': 'string'
}
Response Structure
(dict) --
mergeOptions (list) --
The merge option or strategy used to merge the code.
(string) --
sourceCommitId (string) --
The commit ID of the source commit specifier that was used in the merge evaluation.
destinationCommitId (string) --
The commit ID of the destination commit specifier that was used in the merge evaluation.
baseCommitId (string) --
The commit ID of the merge base.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'mergeOptions': [
'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE',
],
'sourceCommitId': 'string',
'destinationCommitId': 'string',
'baseCommitId': 'string'
}
:returns:
(string) --
"""
pass
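The mergeOptions list tells you which strategies would succeed for the two commits. A small sketch of picking one automatically; the preference order here (fast-forward first, then squash, then three-way) is an assumption for illustration, not an AWS recommendation.

```python
# Illustrative preference order over the documented mergeOptions values.
PREFERENCE = ["FAST_FORWARD_MERGE", "SQUASH_MERGE", "THREE_WAY_MERGE"]

def choose_merge_option(merge_options):
    """Return the most-preferred available strategy, or None if the list is empty."""
    for option in PREFERENCE:
        if option in merge_options:
            return option
    return None
```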
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you normally invoke the\noperation as client.create_foo(**kwargs), then, provided the\ncreate_foo operation can be paginated, you can call\nclient.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
Returns: A paginator object.
"""
pass
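A paginator wraps the nextToken bookkeeping that would otherwise be written by hand: its paginate() method yields one response dict per page. The MiniPaginator below is a purely illustrative in-memory stand-in showing the shape of the consuming loop; the pullRequestIds page key matches the list_pull_requests operation, which is one of the paginatable calls on this client.

```python
class MiniPaginator:
    """Stand-in imitating botocore's Paginator.paginate() interface."""
    def __init__(self, pages):
        self._pages = pages

    def paginate(self, **kwargs):
        # A real paginator would issue requests and follow nextToken;
        # here we simply replay the canned pages.
        for page in self._pages:
            yield page

def collect_ids(paginator):
    """Consume every page, accumulating the IDs it carries."""
    ids = []
    for page in paginator.paginate():
        ids.extend(page["pullRequestIds"])
    return ids
```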
def get_pull_request(pullRequestId=None):
"""
Gets information about a pull request in a specified repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_pull_request(
pullRequestId='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:rtype: dict
Returns: Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the specified pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
"""
pass
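The nested pullRequest structure above is deep; callers often want just the status and the merge state of each target. A minimal sketch of flattening it, assuming `response` is the dict returned by get_pull_request:

```python
def summarize_pull_request(response):
    """Flatten a get_pull_request response into status plus per-target merge state."""
    pr = response["pullRequest"]
    targets = [
        {
            "repository": t["repositoryName"],
            "source": t["sourceReference"],
            "destination": t["destinationReference"],
            # mergeMetadata may be absent on an unmerged target; default to False.
            "merged": t.get("mergeMetadata", {}).get("isMerged", False),
        }
        for t in pr.get("pullRequestTargets", [])
    ]
    return {"id": pr["pullRequestId"], "status": pr["pullRequestStatus"], "targets": targets}
```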
def get_pull_request_approval_states(pullRequestId=None, revisionId=None):
"""
Gets information about the approval states for a specified pull request. Approval states only apply to pull requests that have one or more approval rules applied to them.
See also: AWS API Documentation
Exceptions
:example: response = client.get_pull_request_approval_states(
pullRequestId='string',
revisionId='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID for the pull request.\n
:type revisionId: string
:param revisionId: [REQUIRED]\nThe system-generated ID for the pull request revision.\n
:rtype: dict
Returns: Response Syntax
{
'approvals': [
{
'userArn': 'string',
'approvalState': 'APPROVE'|'REVOKE'
},
]
}
Response Structure
(dict) --
approvals (list) --
Information about users who have approved the pull request.
(dict) --
Returns information about a specific approval on a pull request.
userArn (string) --
The Amazon Resource Name (ARN) of the user.
approvalState (string) --
The state of the approval, APPROVE or REVOKE. REVOKE states are not stored.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'approvals': [
{
'userArn': 'string',
'approvalState': 'APPROVE'|'REVOKE'
},
]
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
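# A minimal, illustrative helper (not part of the generated stubs) for
# tallying approvals out of the get_pull_request_approval_states response
# shape documented above. The sample ARNs below are placeholders.

```python
def count_approvals(response):
    """Count entries with approvalState APPROVE in a
    get_pull_request_approval_states response dict."""
    return sum(
        1
        for approval in response.get('approvals', [])
        if approval.get('approvalState') == 'APPROVE'
    )

# Example response shaped like the Response Syntax above.
sample_response = {
    'approvals': [
        {'userArn': 'arn:aws:iam::111111111111:user/alice',
         'approvalState': 'APPROVE'},
        {'userArn': 'arn:aws:iam::111111111111:user/bob',
         'approvalState': 'APPROVE'},
    ]
}
approve_count = count_approvals(sample_response)
```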
def get_pull_request_override_state(pullRequestId=None, revisionId=None):
"""
Returns information about whether approval rules have been set aside (overridden) for a pull request, and if so, the Amazon Resource Name (ARN) of the user or identity that overrode the rules and their requirements for the pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.get_pull_request_override_state(
pullRequestId='string',
revisionId='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe ID of the pull request for which you want to get information about whether approval rules have been set aside (overridden).\n
:type revisionId: string
:param revisionId: [REQUIRED]\nThe system-generated ID of the revision for the pull request. To retrieve the most recent revision ID, use GetPullRequest .\n
:rtype: dict
ReturnsResponse Syntax
{
'overridden': True|False,
'overrider': 'string'
}
Response Structure
(dict) --
overridden (boolean) --
A Boolean value that indicates whether a pull request has had its rules set aside (TRUE) or whether all approval rules still apply (FALSE).
overrider (string) --
The Amazon Resource Name (ARN) of the user or identity that overrode the rules and their requirements for the pull request.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'overridden': True|False,
'overrider': 'string'
}
"""
pass
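# Illustrative only: summarizing the override state from the response shape
# above. The helper name and sample ARN are assumptions, not part of this
# module.

```python
def describe_override(response):
    """Render a get_pull_request_override_state response as a short message."""
    if response.get('overridden'):
        return 'rules overridden by {}'.format(response.get('overrider', 'unknown'))
    return 'approval rules still apply'

msg = describe_override({'overridden': True,
                         'overrider': 'arn:aws:iam::111111111111:user/alice'})
```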
def get_repository(repositoryName=None):
"""
Returns information about a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_repository(
repositoryName='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository to get information about.\n
:rtype: dict
ReturnsResponse Syntax
{
'repositoryMetadata': {
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
}
}
Response Structure
(dict) --Represents the output of a get repository operation.
repositoryMetadata (dict) --Information about the repository.
accountId (string) --The ID of the AWS account associated with the repository.
repositoryId (string) --The ID of the repository.
repositoryName (string) --The repository\'s name.
repositoryDescription (string) --A comment or description about the repository.
defaultBranch (string) --The repository\'s default branch name.
lastModifiedDate (datetime) --The date and time the repository was last modified, in timestamp format.
creationDate (datetime) --The date and time the repository was created, in timestamp format.
cloneUrlHttp (string) --The URL to use for cloning the repository over HTTPS.
cloneUrlSsh (string) --The URL to use for cloning the repository over SSH.
Arn (string) --The Amazon Resource Name (ARN) of the repository.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'repositoryMetadata': {
'accountId': 'string',
'repositoryId': 'string',
'repositoryName': 'string',
'repositoryDescription': 'string',
'defaultBranch': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'cloneUrlHttp': 'string',
'cloneUrlSsh': 'string',
'Arn': 'string'
}
}
"""
pass
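# A hedged sketch of picking a clone URL out of the repositoryMetadata
# structure documented above; the sample metadata values are placeholders.

```python
def clone_url(response, protocol='https'):
    """Return the HTTPS or SSH clone URL from a get_repository response."""
    metadata = response['repositoryMetadata']
    if protocol == 'ssh':
        return metadata['cloneUrlSsh']
    return metadata['cloneUrlHttp']

sample = {
    'repositoryMetadata': {
        'repositoryName': 'MyDemoRepo',
        'cloneUrlHttp': 'https://git-codecommit.us-east-2.amazonaws.com/v1/repos/MyDemoRepo',
        'cloneUrlSsh': 'ssh://git-codecommit.us-east-2.amazonaws.com/v1/repos/MyDemoRepo',
    }
}
```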
def get_repository_triggers(repositoryName=None):
"""
Gets information about triggers configured for a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.get_repository_triggers(
repositoryName='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository for which the trigger is configured.\n
:rtype: dict
ReturnsResponse Syntax
{
'configurationId': 'string',
'triggers': [
{
'name': 'string',
'destinationArn': 'string',
'customData': 'string',
'branches': [
'string',
],
'events': [
'all'|'updateReference'|'createReference'|'deleteReference',
]
},
]
}
Response Structure
(dict) --Represents the output of a get repository triggers operation.
configurationId (string) --The system-generated unique ID for the trigger.
triggers (list) --The JSON block of configuration information for each trigger.
(dict) --Information about a trigger for a repository.
name (string) --The name of the trigger.
destinationArn (string) --The ARN of the resource that is the target for a trigger (for example, the ARN of a topic in Amazon SNS).
customData (string) --Any custom data associated with the trigger to be included in the information sent to the target of the trigger.
branches (list) --The branches to be included in the trigger configuration. If you specify an empty array, the trigger applies to all branches.
Note
Although no content is required in the array, you must include the array itself.
(string) --
events (list) --The repository events that cause the trigger to run actions in another service, such as sending a notification through Amazon SNS.
Note
The valid value "all" cannot be used with any other values.
(string) --
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'configurationId': 'string',
'triggers': [
{
'name': 'string',
'destinationArn': 'string',
'customData': 'string',
'branches': [
'string',
],
'events': [
'all'|'updateReference'|'createReference'|'deleteReference',
]
},
]
}
:returns:
(string) --
"""
pass
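# Sketch of client-side validation matching the notes above: the events list
# may not combine 'all' with other values, and each event must be one of the
# documented enum values. This helper is illustrative, not part of the API.

```python
VALID_TRIGGER_EVENTS = {'all', 'updateReference', 'createReference', 'deleteReference'}

def validate_trigger_events(events):
    """Raise ValueError if a trigger's events list violates the documented rules."""
    if 'all' in events and len(events) > 1:
        raise ValueError("'all' cannot be combined with other event types")
    unknown = set(events) - VALID_TRIGGER_EVENTS
    if unknown:
        raise ValueError('unknown events: {}'.format(sorted(unknown)))
    return True
```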
def get_waiter(waiter_name=None):
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters\nsection of the service docs for a list of available waiters.
:rtype: botocore.waiter.Waiter
"""
pass
def list_approval_rule_templates(nextToken=None, maxResults=None):
"""
Lists all approval rule templates in the specified AWS Region in your AWS account. If an AWS Region is not specified, the AWS Region where you are signed in is used.
See also: AWS API Documentation
Exceptions
:example: response = client.list_approval_rule_templates(
nextToken='string',
maxResults=123
)
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results.
:rtype: dict
ReturnsResponse Syntax
{
'approvalRuleTemplateNames': [
'string',
],
'nextToken': 'string'
}
Response Structure
(dict) --
approvalRuleTemplateNames (list) --
The names of all the approval rule templates found in the AWS Region for your AWS account.
(string) --
nextToken (string) --
An enumeration token that allows the operation to batch the next results of the operation.
Exceptions
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
:return: {
'approvalRuleTemplateNames': [
'string',
],
'nextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_associated_approval_rule_templates_for_repository(repositoryName=None, nextToken=None, maxResults=None):
"""
Lists all approval rule templates that are associated with a specified repository.
See also: AWS API Documentation
Exceptions
:example: response = client.list_associated_approval_rule_templates_for_repository(
repositoryName='string',
nextToken='string',
maxResults=123
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository for which you want to list all associated approval rule templates.\n
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results.
:rtype: dict
ReturnsResponse Syntax
{
'approvalRuleTemplateNames': [
'string',
],
'nextToken': 'string'
}
Response Structure
(dict) --
approvalRuleTemplateNames (list) --
The names of all approval rule templates associated with the repository.
(string) --
nextToken (string) --
An enumeration token that allows the operation to batch the next results of the operation.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'approvalRuleTemplateNames': [
'string',
],
'nextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_branches(repositoryName=None, nextToken=None):
"""
Gets information about one or more branches in a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.list_branches(
repositoryName='string',
nextToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository that contains the branches.\n
:type nextToken: string
:param nextToken: An enumeration token that allows the operation to batch the results.
:rtype: dict
ReturnsResponse Syntax
{
'branches': [
'string',
],
'nextToken': 'string'
}
Response Structure
(dict) --
Represents the output of a list branches operation.
branches (list) --
The list of branch names.
(string) --
nextToken (string) --
An enumeration token that returns the batch of the results.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
:return: {
'branches': [
'string',
],
'nextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_pull_requests(repositoryName=None, authorArn=None, pullRequestStatus=None, nextToken=None, maxResults=None):
"""
Returns a list of pull requests for a specified repository. The return list can be refined by pull request status or pull request author ARN.
See also: AWS API Documentation
Exceptions
:example: response = client.list_pull_requests(
repositoryName='string',
authorArn='string',
pullRequestStatus='OPEN'|'CLOSED',
nextToken='string',
maxResults=123
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository for which you want to list pull requests.\n
:type authorArn: string
:param authorArn: Optional. The Amazon Resource Name (ARN) of the user who created the pull request. If used, this filters the results to pull requests created by that user.
:type pullRequestStatus: string
:param pullRequestStatus: Optional. The status of the pull request. If used, this refines the results to the pull requests that match the specified status.
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results.
:rtype: dict
ReturnsResponse Syntax
{
'pullRequestIds': [
'string',
],
'nextToken': 'string'
}
Response Structure
(dict) --
pullRequestIds (list) --
The system-generated IDs of the pull requests.
(string) --
nextToken (string) --
An enumeration token that allows the operation to batch the next results of the operation.
Exceptions
CodeCommit.Client.exceptions.InvalidPullRequestStatusException
CodeCommit.Client.exceptions.InvalidAuthorArnException
CodeCommit.Client.exceptions.AuthorDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequestIds': [
'string',
],
'nextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_repositories(nextToken=None, sortBy=None, order=None):
"""
Gets information about one or more repositories.
See also: AWS API Documentation
Exceptions
:example: response = client.list_repositories(
nextToken='string',
sortBy='repositoryName'|'lastModifiedDate',
order='ascending'|'descending'
)
:type nextToken: string
:param nextToken: An enumeration token that allows the operation to batch the results of the operation. Batch sizes are 1,000 for list repository operations. When the client sends the token back to AWS CodeCommit, another page of 1,000 records is retrieved.
:type sortBy: string
:param sortBy: The criteria used to sort the results of a list repositories operation.
:type order: string
:param order: The order in which to sort the results of a list repositories operation.
:rtype: dict
ReturnsResponse Syntax
{
'repositories': [
{
'repositoryName': 'string',
'repositoryId': 'string'
},
],
'nextToken': 'string'
}
Response Structure
(dict) --
Represents the output of a list repositories operation.
repositories (list) --
Lists the repositories called by the list repositories operation.
(dict) --
Information about a repository name and ID.
repositoryName (string) --
The name associated with the repository.
repositoryId (string) --
The ID associated with the repository.
nextToken (string) --
An enumeration token that allows the operation to batch the results of the operation. Batch sizes are 1,000 for list repository operations. When the client sends the token back to AWS CodeCommit, another page of 1,000 records is retrieved.
Exceptions
CodeCommit.Client.exceptions.InvalidSortByException
CodeCommit.Client.exceptions.InvalidOrderException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
:return: {
'repositories': [
{
'repositoryName': 'string',
'repositoryId': 'string'
},
],
'nextToken': 'string'
}
"""
pass
def list_repositories_for_approval_rule_template(approvalRuleTemplateName=None, nextToken=None, maxResults=None):
"""
Lists all repositories associated with the specified approval rule template.
See also: AWS API Documentation
Exceptions
:example: response = client.list_repositories_for_approval_rule_template(
approvalRuleTemplateName='string',
nextToken='string',
maxResults=123
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template for which you want to list repositories that are associated with that template.\n
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:type maxResults: integer
:param maxResults: A non-zero, non-negative integer used to limit the number of returned results.
:rtype: dict
ReturnsResponse Syntax
{
'repositoryNames': [
'string',
],
'nextToken': 'string'
}
Response Structure
(dict) --
repositoryNames (list) --
A list of repository names that are associated with the specified approval rule template.
(string) --
nextToken (string) --
An enumeration token that allows the operation to batch the next results of the operation.
Exceptions
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.InvalidMaxResultsException
CodeCommit.Client.exceptions.InvalidContinuationTokenException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'repositoryNames': [
'string',
],
'nextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_tags_for_resource(resourceArn=None, nextToken=None):
"""
Gets information about AWS tags for a specified Amazon Resource Name (ARN) in AWS CodeCommit. For a list of valid resources in AWS CodeCommit, see CodeCommit Resources and Operations in the *AWS CodeCommit User Guide*.
See also: AWS API Documentation
Exceptions
:example: response = client.list_tags_for_resource(
resourceArn='string',
nextToken='string'
)
:type resourceArn: string
:param resourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) of the resource for which you want to get information about tags, if any.\n
:type nextToken: string
:param nextToken: An enumeration token that, when provided in a request, returns the next batch of the results.
:rtype: dict
ReturnsResponse Syntax
{
'tags': {
'string': 'string'
},
'nextToken': 'string'
}
Response Structure
(dict) --
tags (dict) --
A list of tag key and value pairs associated with the specified resource.
(string) --
(string) --
nextToken (string) --
An enumeration token that allows the operation to batch the next results of the operation.
Exceptions
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ResourceArnRequiredException
CodeCommit.Client.exceptions.InvalidResourceArnException
:return: {
'tags': {
'string': 'string'
},
'nextToken': 'string'
}
:returns:
(string) --
(string) --
"""
pass
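# Tag pages can be folded into a single dict; later pages win on duplicate
# keys. An illustrative sketch against the response shape above.

```python
def collect_tags(pages):
    """Merge the tags dicts from a sequence of list_tags_for_resource pages."""
    tags = {}
    for page in pages:
        tags.update(page.get('tags', {}))
    return tags

merged = collect_tags([
    {'tags': {'team': 'platform'}, 'nextToken': 'token-1'},
    {'tags': {'env': 'prod'}},
])
```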
def merge_branches_by_fast_forward(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, targetBranch=None):
"""
Merges two branches using the fast-forward merge strategy.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_branches_by_fast_forward(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
targetBranch='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to merge two branches.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type targetBranch: string
:param targetBranch: The branch where the merge is applied.
:rtype: dict
ReturnsResponse Syntax
{
'commitId': 'string',
'treeId': 'string'
}
Response Structure
(dict) --
commitId (string) --
The commit ID of the merge in the destination or target branch.
treeId (string) --
The tree ID of the merge in the destination or target branch.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidTargetBranchException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'treeId': 'string'
}
"""
pass
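# Sketch of assembling keyword arguments for the call above, dropping the
# optional targetBranch when unset. The argument names come from the
# signature; the helper itself is an assumption for illustration.

```python
def fast_forward_merge_args(repository_name, source, destination, target_branch=None):
    """Build kwargs for merge_branches_by_fast_forward, omitting unset optionals."""
    args = {
        'repositoryName': repository_name,
        'sourceCommitSpecifier': source,
        'destinationCommitSpecifier': destination,
    }
    if target_branch is not None:
        args['targetBranch'] = target_branch
    return args
```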
def merge_branches_by_squash(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, targetBranch=None, conflictDetailLevel=None, conflictResolutionStrategy=None, authorName=None, email=None, commitMessage=None, keepEmptyFolders=None, conflictResolution=None):
"""
Merges two branches using the squash merge strategy.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_branches_by_squash(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
targetBranch='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
authorName='string',
email='string',
commitMessage='string',
keepEmptyFolders=True|False,
conflictResolution={
'replaceContents': [
{
'filePath': 'string',
'replacementType': 'KEEP_BASE'|'KEEP_SOURCE'|'KEEP_DESTINATION'|'USE_NEW_CONTENT',
'content': b'bytes',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'deleteFiles': [
{
'filePath': 'string'
},
],
'setFileModes': [
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to merge two branches.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type targetBranch: string
:param targetBranch: The branch where the merge is applied.
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type authorName: string
:param authorName: The name of the author who created the commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address of the person merging the branches. This information is used in the commit information for the merge.
:type commitMessage: string
:param commitMessage: The commit message for the merge.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If this is specified as true, a .gitkeep file is created for empty folders. The default is false.
:type conflictResolution: dict
:param conflictResolution: If AUTOMERGE is the conflict resolution strategy, a list of inputs to use when resolving conflicts during a merge.\n\nreplaceContents (list) --Files to have content replaced as part of the merge conflict resolution.\n\n(dict) --Information about a replacement content entry in the conflict of a merge or pull request operation.\n\nfilePath (string) -- [REQUIRED]The path of the conflicting file.\n\nreplacementType (string) -- [REQUIRED]The replacement type to use when determining how to resolve the conflict.\n\ncontent (bytes) --The base-64 encoded content to use when the replacement type is USE_NEW_CONTENT.\n\nfileMode (string) --The file mode to apply during conflict resolution.\n\n\n\n\n\ndeleteFiles (list) --Files to be deleted as part of the merge conflict resolution.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n\nsetFileModes (list) --File modes that are set as part of the merge conflict resolution.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n\n\n
:rtype: dict
ReturnsResponse Syntax
{
'commitId': 'string',
'treeId': 'string'
}
Response Structure
(dict) --
commitId (string) --
The commit ID of the merge in the destination or target branch.
treeId (string) --
The tree ID of the merge in the destination or target branch.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidTargetBranchException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'treeId': 'string'
}
"""
pass
def merge_branches_by_three_way(repositoryName=None, sourceCommitSpecifier=None, destinationCommitSpecifier=None, targetBranch=None, conflictDetailLevel=None, conflictResolutionStrategy=None, authorName=None, email=None, commitMessage=None, keepEmptyFolders=None, conflictResolution=None):
"""
Merges two specified branches using the three-way merge strategy.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_branches_by_three_way(
repositoryName='string',
sourceCommitSpecifier='string',
destinationCommitSpecifier='string',
targetBranch='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
authorName='string',
email='string',
commitMessage='string',
keepEmptyFolders=True|False,
conflictResolution={
'replaceContents': [
{
'filePath': 'string',
'replacementType': 'KEEP_BASE'|'KEEP_SOURCE'|'KEEP_DESTINATION'|'USE_NEW_CONTENT',
'content': b'bytes',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'deleteFiles': [
{
'filePath': 'string'
},
],
'setFileModes': [
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to merge two branches.\n
:type sourceCommitSpecifier: string
:param sourceCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type destinationCommitSpecifier: string
:param destinationCommitSpecifier: [REQUIRED]\nThe branch, tag, HEAD, or other fully qualified reference used to identify a commit (for example, a branch name or a full commit ID).\n
:type targetBranch: string
:param targetBranch: The branch where the merge is applied.
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type authorName: string
:param authorName: The name of the author who created the commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address of the person merging the branches. This information is used in the commit information for the merge.
:type commitMessage: string
:param commitMessage: The commit message to include in the commit information for the merge.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If true, a .gitkeep file is created for empty folders. The default is false.
:type conflictResolution: dict
:param conflictResolution: If AUTOMERGE is the conflict resolution strategy, a list of inputs to use when resolving conflicts during a merge.\n\nreplaceContents (list) --Files to have content replaced as part of the merge conflict resolution.\n\n(dict) --Information about a replacement content entry in the conflict of a merge or pull request operation.\n\nfilePath (string) -- [REQUIRED]The path of the conflicting file.\n\nreplacementType (string) -- [REQUIRED]The replacement type to use when determining how to resolve the conflict.\n\ncontent (bytes) --The base-64 encoded content to use when the replacement type is USE_NEW_CONTENT.\n\nfileMode (string) --The file mode to apply during conflict resolution.\n\n\n\n\n\ndeleteFiles (list) --Files to be deleted as part of the merge conflict resolution.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n\nsetFileModes (list) --File modes that are set as part of the merge conflict resolution.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'commitId': 'string',
'treeId': 'string'
}
Response Structure
(dict) --
commitId (string) --
The commit ID of the merge in the destination or target branch.
treeId (string) --
The tree ID of the merge in the destination or target branch.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidTargetBranchException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'commitId': 'string',
'treeId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.CommitRequiredException
CodeCommit.Client.exceptions.InvalidCommitException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidTargetBranchException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.FileModeRequiredException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def merge_pull_request_by_fast_forward(pullRequestId=None, repositoryName=None, sourceCommitId=None):
"""
Attempts to merge the source commit of a pull request into the specified destination branch for that pull request at the specified commit using the fast-forward merge strategy. If the merge is successful, it closes the pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_pull_request_by_fast_forward(
pullRequestId='string',
repositoryName='string',
sourceCommitId='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where the pull request was created.\n
:type sourceCommitId: string
:param sourceCommitId: The full commit ID of the original or updated commit in the pull request source branch. Pass this value if you want an exception thrown if the current commit ID of the tip of the source branch does not match this commit ID.
:rtype: dict
Returns
Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the specified pull request, including the merge.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.ReferenceDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.ReferenceDoesNotExistException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def merge_pull_request_by_squash(pullRequestId=None, repositoryName=None, sourceCommitId=None, conflictDetailLevel=None, conflictResolutionStrategy=None, commitMessage=None, authorName=None, email=None, keepEmptyFolders=None, conflictResolution=None):
"""
Attempts to merge the source commit of a pull request into the specified destination branch for that pull request at the specified commit using the squash merge strategy. If the merge is successful, it closes the pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_pull_request_by_squash(
pullRequestId='string',
repositoryName='string',
sourceCommitId='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
commitMessage='string',
authorName='string',
email='string',
keepEmptyFolders=True|False,
conflictResolution={
'replaceContents': [
{
'filePath': 'string',
'replacementType': 'KEEP_BASE'|'KEEP_SOURCE'|'KEEP_DESTINATION'|'USE_NEW_CONTENT',
'content': b'bytes',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'deleteFiles': [
{
'filePath': 'string'
},
],
'setFileModes': [
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where the pull request was created.\n
:type sourceCommitId: string
:param sourceCommitId: The full commit ID of the original or updated commit in the pull request source branch. Pass this value if you want an exception thrown if the current commit ID of the tip of the source branch does not match this commit ID.
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type commitMessage: string
:param commitMessage: The commit message to include in the commit information for the merge.
:type authorName: string
:param authorName: The name of the author who created the commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address of the person merging the branches. This information is used in the commit information for the merge.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If true, a .gitkeep file is created for empty folders. The default is false.
:type conflictResolution: dict
:param conflictResolution: If AUTOMERGE is the conflict resolution strategy, a list of inputs to use when resolving conflicts during a merge.\n\nreplaceContents (list) --Files to have content replaced as part of the merge conflict resolution.\n\n(dict) --Information about a replacement content entry in the conflict of a merge or pull request operation.\n\nfilePath (string) -- [REQUIRED]The path of the conflicting file.\n\nreplacementType (string) -- [REQUIRED]The replacement type to use when determining how to resolve the conflict.\n\ncontent (bytes) --The base-64 encoded content to use when the replacement type is USE_NEW_CONTENT.\n\nfileMode (string) --The file mode to apply during conflict resolution.\n\n\n\n\n\ndeleteFiles (list) --Files to be deleted as part of the merge conflict resolution.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n\nsetFileModes (list) --File modes that are set as part of the merge conflict resolution.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Returns information about a pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def merge_pull_request_by_three_way(pullRequestId=None, repositoryName=None, sourceCommitId=None, conflictDetailLevel=None, conflictResolutionStrategy=None, commitMessage=None, authorName=None, email=None, keepEmptyFolders=None, conflictResolution=None):
"""
Attempts to merge the source commit of a pull request into the specified destination branch for that pull request at the specified commit using the three-way merge strategy. If the merge is successful, it closes the pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.merge_pull_request_by_three_way(
pullRequestId='string',
repositoryName='string',
sourceCommitId='string',
conflictDetailLevel='FILE_LEVEL'|'LINE_LEVEL',
conflictResolutionStrategy='NONE'|'ACCEPT_SOURCE'|'ACCEPT_DESTINATION'|'AUTOMERGE',
commitMessage='string',
authorName='string',
email='string',
keepEmptyFolders=True|False,
conflictResolution={
'replaceContents': [
{
'filePath': 'string',
'replacementType': 'KEEP_BASE'|'KEEP_SOURCE'|'KEEP_DESTINATION'|'USE_NEW_CONTENT',
'content': b'bytes',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
],
'deleteFiles': [
{
'filePath': 'string'
},
],
'setFileModes': [
{
'filePath': 'string',
'fileMode': 'EXECUTABLE'|'NORMAL'|'SYMLINK'
},
]
}
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests .\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where the pull request was created.\n
:type sourceCommitId: string
:param sourceCommitId: The full commit ID of the original or updated commit in the pull request source branch. Pass this value if you want an exception thrown if the current commit ID of the tip of the source branch does not match this commit ID.
:type conflictDetailLevel: string
:param conflictDetailLevel: The level of conflict detail to use. If unspecified, the default FILE_LEVEL is used, which returns a not-mergeable result if the same file has differences in both branches. If LINE_LEVEL is specified, a conflict is considered not mergeable if the same file in both branches has differences on the same line.
:type conflictResolutionStrategy: string
:param conflictResolutionStrategy: Specifies which branch to use when resolving conflicts, or whether to attempt automatically merging two versions of a file. The default is NONE, which requires any conflicts to be resolved manually before the merge operation is successful.
:type commitMessage: string
:param commitMessage: The commit message to include in the commit information for the merge.
:type authorName: string
:param authorName: The name of the author who created the commit. This information is used as both the author and committer for the commit.
:type email: string
:param email: The email address of the person merging the branches. This information is used in the commit information for the merge.
:type keepEmptyFolders: boolean
:param keepEmptyFolders: If the commit contains deletions, whether to keep a folder or folder structure if the changes leave the folders empty. If true, a .gitkeep file is created for empty folders. The default is false.
:type conflictResolution: dict
:param conflictResolution: If AUTOMERGE is the conflict resolution strategy, a list of inputs to use when resolving conflicts during a merge.\n\nreplaceContents (list) --Files to have content replaced as part of the merge conflict resolution.\n\n(dict) --Information about a replacement content entry in the conflict of a merge or pull request operation.\n\nfilePath (string) -- [REQUIRED]The path of the conflicting file.\n\nreplacementType (string) -- [REQUIRED]The replacement type to use when determining how to resolve the conflict.\n\ncontent (bytes) --The base-64 encoded content to use when the replacement type is USE_NEW_CONTENT.\n\nfileMode (string) --The file mode to apply during conflict resolution.\n\n\n\n\n\ndeleteFiles (list) --Files to be deleted as part of the merge conflict resolution.\n\n(dict) --A file that is deleted as part of a commit.\n\nfilePath (string) -- [REQUIRED]The full path of the file to be deleted, including the name of the file.\n\n\n\n\n\nsetFileModes (list) --File modes that are set as part of the merge conflict resolution.\n\n(dict) --Information about the file mode changes.\n\nfilePath (string) -- [REQUIRED]The full path to the file, including the name of the file.\n\nfileMode (string) -- [REQUIRED]The file mode for the file.\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Returns information about a pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED .
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.ManualMergeRequiredException
CodeCommit.Client.exceptions.TipOfSourceReferenceIsDifferentException
CodeCommit.Client.exceptions.TipsDivergenceExceededException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidConflictDetailLevelException
CodeCommit.Client.exceptions.InvalidConflictResolutionStrategyException
CodeCommit.Client.exceptions.InvalidConflictResolutionException
CodeCommit.Client.exceptions.ReplacementTypeRequiredException
CodeCommit.Client.exceptions.InvalidReplacementTypeException
CodeCommit.Client.exceptions.MultipleConflictResolutionEntriesException
CodeCommit.Client.exceptions.ReplacementContentRequiredException
CodeCommit.Client.exceptions.MaximumConflictResolutionEntriesExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.InvalidReplacementContentException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.MaximumFileContentToLoadExceededException
CodeCommit.Client.exceptions.MaximumItemsToCompareExceededException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.ConcurrentReferenceUpdateException
CodeCommit.Client.exceptions.PullRequestApprovalRulesNotSatisfiedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
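The conflictResolution parameter above has a fixed shape (replaceContents, deleteFiles, setFileModes, each with its own required keys and enum values). A small helper can assemble and validate that payload before the call; the helper name `build_conflict_resolution` and the sample file paths are illustrative conveniences, not part of the boto3 API, though the key names and allowed values follow the request syntax shown above:

```python
# Sketch: assemble a conflictResolution dict for merge_pull_request_by_three_way.
# build_conflict_resolution is a hypothetical wrapper; the dict it returns
# matches the request syntax documented above.

def build_conflict_resolution(replacements=(), deletions=(), mode_changes=()):
    """Return a dict shaped like the CodeCommit conflictResolution parameter.

    replacements: iterable of (filePath, replacementType, content_bytes);
                  content_bytes may be None unless replacementType is
                  USE_NEW_CONTENT.
    deletions:    iterable of file paths to delete.
    mode_changes: iterable of (filePath, fileMode) pairs.
    """
    valid_types = {"KEEP_BASE", "KEEP_SOURCE", "KEEP_DESTINATION", "USE_NEW_CONTENT"}
    valid_modes = {"EXECUTABLE", "NORMAL", "SYMLINK"}

    replace_contents = []
    for path, rtype, content in replacements:
        if rtype not in valid_types:
            raise ValueError("invalid replacementType: %s" % rtype)
        entry = {"filePath": path, "replacementType": rtype}
        if rtype == "USE_NEW_CONTENT":
            if content is None:
                raise ValueError("USE_NEW_CONTENT requires content bytes")
            entry["content"] = content  # boto3 base64-encodes bytes for you
        replace_contents.append(entry)

    set_file_modes = []
    for path, mode in mode_changes:
        if mode not in valid_modes:
            raise ValueError("invalid fileMode: %s" % mode)
        set_file_modes.append({"filePath": path, "fileMode": mode})

    return {
        "replaceContents": replace_contents,
        "deleteFiles": [{"filePath": p} for p in deletions],
        "setFileModes": set_file_modes,
    }

# Usage sketch (repository name, pull request ID, and paths are placeholders):
resolution = build_conflict_resolution(
    replacements=[("README.md", "USE_NEW_CONTENT", b"merged text\n")],
    deletions=["obsolete.txt"],
)
# client.merge_pull_request_by_three_way(
#     pullRequestId="42", repositoryName="MyRepo",
#     conflictResolutionStrategy="AUTOMERGE",
#     conflictResolution=resolution,
# )
```

Note that conflictResolution is only consulted when conflictResolutionStrategy is AUTOMERGE; with the default NONE, conflicts must be resolved manually before the merge succeeds.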
def override_pull_request_approval_rules(pullRequestId=None, revisionId=None, overrideStatus=None):
"""
Sets aside (overrides) all approval rule requirements for a specified pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.override_pull_request_approval_rules(
pullRequestId='string',
revisionId='string',
overrideStatus='OVERRIDE'|'REVOKE'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request for which you want to override all approval rule requirements. To get this information, use GetPullRequest .\n
:type revisionId: string
:param revisionId: [REQUIRED]\nThe system-generated ID of the most recent revision of the pull request. You cannot override approval rules for anything but the most recent revision of a pull request. To get the revision ID, use GetPullRequest.\n
:type overrideStatus: string
:param overrideStatus: [REQUIRED]\nWhether you want to set aside approval rule requirements for the pull request (OVERRIDE) or revoke a previous override and apply approval rule requirements (REVOKE). REVOKE status is not stored.\n
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.InvalidOverrideStatusException
CodeCommit.Client.exceptions.OverrideStatusRequiredException
CodeCommit.Client.exceptions.OverrideAlreadySetException
CodeCommit.Client.exceptions.RevisionNotCurrentException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
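Because overrides only apply to the most recent revision of a pull request, the usual flow is get_pull_request, read revisionId from the response, then call override_pull_request_approval_rules. A minimal sketch, where `make_override_kwargs` is a hypothetical helper that just validates and shapes the parameters (the commented lines show where the real boto3 calls go):

```python
# Sketch: prepare parameters for override_pull_request_approval_rules.
# make_override_kwargs is a hypothetical helper, not part of boto3; it
# enforces the two allowed overrideStatus values documented above.

def make_override_kwargs(pull_request_id, revision_id, override_status):
    if override_status not in ("OVERRIDE", "REVOKE"):
        raise ValueError("overrideStatus must be OVERRIDE or REVOKE")
    return {
        "pullRequestId": pull_request_id,
        "revisionId": revision_id,
        "overrideStatus": override_status,
    }

# pr = client.get_pull_request(pullRequestId="42")
# kwargs = make_override_kwargs(
#     "42", pr["pullRequest"]["revisionId"], "OVERRIDE"
# )
# client.override_pull_request_approval_rules(**kwargs)
```

Passing a stale revisionId raises RevisionNotCurrentException, and repeating an override on the same revision raises OverrideAlreadySetException, so re-reading the revision ID immediately before the call is the safe pattern.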
def post_comment_for_compared_commit(repositoryName=None, beforeCommitId=None, afterCommitId=None, location=None, content=None, clientRequestToken=None):
"""
Posts a comment on the comparison between two commits.
See also: AWS API Documentation
Exceptions
:example: response = client.post_comment_for_compared_commit(
repositoryName='string',
beforeCommitId='string',
afterCommitId='string',
location={
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
content='string',
clientRequestToken='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to post a comment on the comparison between commits.\n
:type beforeCommitId: string
:param beforeCommitId: To establish the directionality of the comparison, the full commit ID of the before commit. Required for commenting on any commit unless that commit is the initial commit.
:type afterCommitId: string
:param afterCommitId: [REQUIRED]\nTo establish the directionality of the comparison, the full commit ID of the after commit.\n
:type location: dict
:param location: The location of the comparison where you want to comment.\n\nfilePath (string) --The name of the file being compared, including its extension and subdirectory, if any.\n\nfilePosition (integer) --The position of a change in a compared file, in line number format.\n\nrelativeFileVersion (string) --In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.\n\n\n
:type content: string
:param content: [REQUIRED]\nThe content of the comment you want to make.\n
:type clientRequestToken: string
:param clientRequestToken: A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.\nThis field is autopopulated if not provided.\n
:rtype: dict
Returns
Response Syntax
{
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
repositoryName (string) --
The name of the repository where you posted a comment on the comparison between commits.
beforeCommitId (string) --
In the directionality you established, the full commit ID of the before commit.
afterCommitId (string) --
In the directionality you established, the full commit ID of the after commit.
beforeBlobId (string) --
In the directionality you established, the blob ID of the before blob.
afterBlobId (string) --
In the directionality you established, the blob ID of the after blob.
location (dict) --
The location of the comment in the comparison between the two commits.
filePath (string) --
The name of the file being compared, including its extension and subdirectory, if any.
filePosition (integer) --
The position of a change in a compared file, in line number format.
relativeFileVersion (string) --
In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.
comment (dict) --
The content of the comment you posted.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidFileLocationException
CodeCommit.Client.exceptions.InvalidRelativeFileVersionEnumException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidFilePositionException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.BeforeCommitIdAndAfterCommitIdAreSameException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.PathDoesNotExistException
:return: {
'repositoryName': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidFileLocationException
CodeCommit.Client.exceptions.InvalidRelativeFileVersionEnumException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidFilePositionException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.BeforeCommitIdAndAfterCommitIdAreSameException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.PathDoesNotExistException
"""
pass
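The location dict is what pins a comment to a specific line of a specific file version; omitting it posts a general comment on the whole comparison. A minimal sketch, where `make_comment_location` is a hypothetical helper (the key names and BEFORE/AFTER values follow the request syntax above, and boto3 autopopulates clientRequestToken if you omit it, so supplying one explicitly is only needed for client-side retry control):

```python
import uuid

# Sketch: build the location dict for post_comment_for_compared_commit.
# make_comment_location is a hypothetical helper, not part of boto3.

def make_comment_location(file_path, line_number, relative_version="AFTER"):
    if relative_version not in ("BEFORE", "AFTER"):
        raise ValueError("relativeFileVersion must be BEFORE or AFTER")
    return {
        "filePath": file_path,
        "filePosition": line_number,
        "relativeFileVersion": relative_version,
    }

# token = str(uuid.uuid4())  # explicit idempotency token (optional)
# client.post_comment_for_compared_commit(
#     repositoryName="MyRepo",                       # placeholder
#     beforeCommitId=before_sha, afterCommitId=after_sha,
#     location=make_comment_location("src/app.py", 12),
#     content="Consider renaming this variable.",
#     clientRequestToken=token,
# )
```

The same location shape is used by post_comment_for_pull_request, where BEFORE refers to the destination-branch version of the file and AFTER to the source-branch version.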
def post_comment_for_pull_request(pullRequestId=None, repositoryName=None, beforeCommitId=None, afterCommitId=None, location=None, content=None, clientRequestToken=None):
"""
Posts a comment on a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.post_comment_for_pull_request(
pullRequestId='string',
repositoryName='string',
beforeCommitId='string',
afterCommitId='string',
location={
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
content='string',
clientRequestToken='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests .\n
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to post a comment on a pull request.\n
:type beforeCommitId: string
:param beforeCommitId: [REQUIRED]\nThe full commit ID of the commit in the destination branch that was the tip of the branch at the time the pull request was created.\n
:type afterCommitId: string
:param afterCommitId: [REQUIRED]\nThe full commit ID of the commit in the source branch that is the current tip of the branch for the pull request when you post the comment.\n
:type location: dict
:param location: The location of the change where you want to post your comment. If no location is provided, the comment is posted as a general comment on the pull request difference between the before commit ID and the after commit ID.\n\nfilePath (string) --The name of the file being compared, including its extension and subdirectory, if any.\n\nfilePosition (integer) --The position of a change in a compared file, in line number format.\n\nrelativeFileVersion (string) --In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.\n\n\n
:type content: string
:param content: [REQUIRED]\nThe content of your comment on the change.\n
:type clientRequestToken: string
:param clientRequestToken: A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.\nThis field is autopopulated if not provided.\n
:rtype: dict
Returns
Response Syntax
{
'repositoryName': 'string',
'pullRequestId': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
repositoryName (string) --
The name of the repository where you posted a comment on a pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
beforeCommitId (string) --
The full commit ID of the commit in the source branch used to create the pull request, or in the case of an updated pull request, the full commit ID of the commit used to update the pull request.
afterCommitId (string) --
The full commit ID of the commit in the destination branch where the pull request is merged.
beforeBlobId (string) --
In the directionality of the pull request, the blob ID of the before blob.
afterBlobId (string) --
In the directionality of the pull request, the blob ID of the after blob.
location (dict) --
The location of the change where you posted your comment.
filePath (string) --
The name of the file being compared, including its extension and subdirectory, if any.
filePosition (integer) --
The position of a change in a compared file, in line number format.
relativeFileVersion (string) --
In a comparison of commits or a pull request, whether the change is in the before or after of that comparison.
comment (dict) --
The content of the comment you posted.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidFileLocationException
CodeCommit.Client.exceptions.InvalidRelativeFileVersionEnumException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidFilePositionException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.PathDoesNotExistException
CodeCommit.Client.exceptions.BeforeCommitIdAndAfterCommitIdAreSameException
:return: {
'repositoryName': 'string',
'pullRequestId': 'string',
'beforeCommitId': 'string',
'afterCommitId': 'string',
'beforeBlobId': 'string',
'afterBlobId': 'string',
'location': {
'filePath': 'string',
'filePosition': 123,
'relativeFileVersion': 'BEFORE'|'AFTER'
},
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.RepositoryNotAssociatedWithPullRequestException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.InvalidFileLocationException
CodeCommit.Client.exceptions.InvalidRelativeFileVersionEnumException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidFilePositionException
CodeCommit.Client.exceptions.CommitIdRequiredException
CodeCommit.Client.exceptions.InvalidCommitIdException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.CommitDoesNotExistException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.PathDoesNotExistException
CodeCommit.Client.exceptions.BeforeCommitIdAndAfterCommitIdAreSameException
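A minimal sketch of generating a clientRequestToken so that retries of this call return the original comment instead of creating a duplicate. The token format shown is illustrative; boto3 autopopulates this field if it is not provided.

```python
import uuid

def make_request_token() -> str:
    # A UUID4 string satisfies the uniqueness requirement for
    # clientRequestToken. Reusing the same token when retrying the
    # same logical request makes the service return information
    # about the initial request rather than posting a new comment.
    return str(uuid.uuid4())

token = make_request_token()
# Pass the same token on every retry of the same logical request:
# client.post_comment_for_pull_request(..., clientRequestToken=token)
```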
"""
pass
def post_comment_reply(inReplyTo=None, clientRequestToken=None, content=None):
"""
Posts a comment in reply to an existing comment on a comparison between commits or a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.post_comment_reply(
inReplyTo='string',
clientRequestToken='string',
content='string'
)
:type inReplyTo: string
:param inReplyTo: [REQUIRED]\nThe system-generated ID of the comment to which you want to reply. To get this ID, use GetCommentsForComparedCommit or GetCommentsForPullRequest .\n
:type clientRequestToken: string
:param clientRequestToken: A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.\nThis field is autopopulated if not provided.\n
:type content: string
:param content: [REQUIRED]\nThe contents of your reply to a comment.\n
:rtype: dict
Returns: Response Syntax
{
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
comment (dict) --
Information about the reply to a comment.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
:return: {
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.ClientRequestTokenRequiredException
CodeCommit.Client.exceptions.InvalidClientRequestTokenException
CodeCommit.Client.exceptions.IdempotencyParameterMismatchException
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
"""
pass
def put_file(repositoryName=None, branchName=None, fileContent=None, filePath=None, fileMode=None, parentCommitId=None, commitMessage=None, name=None, email=None):
"""
Adds or updates a file in a branch in an AWS CodeCommit repository, and generates a commit for the addition in the specified branch.
See also: AWS API Documentation
Exceptions
:example: response = client.put_file(
repositoryName='string',
branchName='string',
fileContent=b'bytes',
filePath='string',
fileMode='EXECUTABLE'|'NORMAL'|'SYMLINK',
parentCommitId='string',
commitMessage='string',
name='string',
email='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to add or update the file.\n
:type branchName: string
:param branchName: [REQUIRED]\nThe name of the branch where you want to add or update the file. If this is an empty repository, this branch is created.\n
:type fileContent: bytes
:param fileContent: [REQUIRED]\nThe content of the file, in binary object format.\n
:type filePath: string
:param filePath: [REQUIRED]\nThe name of the file you want to add or update, including the relative path to the file in the repository.\n\nNote\nIf the path does not currently exist in the repository, the path is created as part of adding the file.\n\n
:type fileMode: string
:param fileMode: The file mode permissions of the blob. Valid file mode permissions are EXECUTABLE, NORMAL, and SYMLINK.
:type parentCommitId: string
:param parentCommitId: The full commit ID of the head commit in the branch where you want to add or update the file. If this is an empty repository, no commit ID is required. If this is not an empty repository, a commit ID is required.\nThe commit ID must match the ID of the head commit at the time of the operation. Otherwise, an error occurs, and the file is not added or updated.\n
:type commitMessage: string
:param commitMessage: A message about why this file was added or updated. Although it is optional, a message makes the commit history for your repository more useful.
:type name: string
:param name: The name of the person adding or updating the file. Although it is optional, a name makes the commit history for your repository more useful.
:type email: string
:param email: An email address for the person adding or updating the file.
:rtype: dict
Returns: Response Syntax
{
'commitId': 'string',
'blobId': 'string',
'treeId': 'string'
}
Response Structure
(dict) --
commitId (string) --
The full SHA ID of the commit that contains this file change.
blobId (string) --
The ID of the blob, which is its SHA-1 pointer.
treeId (string) --
The full SHA-1 pointer of the tree information for the commit that contains this file change.
Exceptions
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.FileContentRequiredException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidDeletionParameterException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.SameFileContentException
CodeCommit.Client.exceptions.FileNameConflictsWithDirectoryNameException
CodeCommit.Client.exceptions.DirectoryNameConflictsWithFileNameException
CodeCommit.Client.exceptions.FilePathConflictsWithSubmodulePathException
:return: {
'commitId': 'string',
'blobId': 'string',
'treeId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdRequiredException
CodeCommit.Client.exceptions.InvalidParentCommitIdException
CodeCommit.Client.exceptions.ParentCommitDoesNotExistException
CodeCommit.Client.exceptions.ParentCommitIdOutdatedException
CodeCommit.Client.exceptions.FileContentRequiredException
CodeCommit.Client.exceptions.FileContentSizeLimitExceededException
CodeCommit.Client.exceptions.FolderContentSizeLimitExceededException
CodeCommit.Client.exceptions.PathRequiredException
CodeCommit.Client.exceptions.InvalidPathException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.BranchNameIsTagNameException
CodeCommit.Client.exceptions.InvalidFileModeException
CodeCommit.Client.exceptions.NameLengthExceededException
CodeCommit.Client.exceptions.InvalidEmailException
CodeCommit.Client.exceptions.CommitMessageLengthExceededException
CodeCommit.Client.exceptions.InvalidDeletionParameterException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
CodeCommit.Client.exceptions.SameFileContentException
CodeCommit.Client.exceptions.FileNameConflictsWithDirectoryNameException
CodeCommit.Client.exceptions.DirectoryNameConflictsWithFileNameException
CodeCommit.Client.exceptions.FilePathConflictsWithSubmodulePathException
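The response's blobId is described above as the blob's SHA-1 pointer. Assuming CodeCommit uses standard Git blob hashing (CodeCommit repositories are Git repositories, but this equivalence is not stated in this reference), the expected blob ID for a given fileContent can be computed locally before calling put_file:

```python
import hashlib

def git_blob_sha1(file_content: bytes) -> str:
    # Git hashes a blob as SHA-1 over the header "blob <size>\0"
    # followed by the raw file bytes.
    header = b"blob %d\x00" % len(file_content)
    return hashlib.sha1(header + file_content).hexdigest()

# The empty blob has the well-known Git hash
# e69de29bb2d1d6434b8b29ae775ad8c2e48c5391:
empty_blob_id = git_blob_sha1(b"")
```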
"""
pass
def put_repository_triggers(repositoryName=None, triggers=None):
"""
Replaces all triggers for a repository. Used to create or delete triggers.
See also: AWS API Documentation
Exceptions
:example: response = client.put_repository_triggers(
repositoryName='string',
triggers=[
{
'name': 'string',
'destinationArn': 'string',
'customData': 'string',
'branches': [
'string',
],
'events': [
'all'|'updateReference'|'createReference'|'deleteReference',
]
},
]
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository where you want to create or update the trigger.\n
:type triggers: list
:param triggers: [REQUIRED]\nThe JSON block of configuration information for each trigger.\n\n(dict) --Information about a trigger for a repository.\n\nname (string) -- [REQUIRED]The name of the trigger.\n\ndestinationArn (string) -- [REQUIRED]The ARN of the resource that is the target for a trigger (for example, the ARN of a topic in Amazon SNS).\n\ncustomData (string) --Any custom data associated with the trigger to be included in the information sent to the target of the trigger.\n\nbranches (list) --The branches to be included in the trigger configuration. If you specify an empty array, the trigger applies to all branches.\n\nNote\nAlthough no content is required in the array, you must include the array itself.\n\n\n(string) --\n\n\nevents (list) -- [REQUIRED]The repository events that cause the trigger to run actions in another service, such as sending a notification through Amazon SNS.\n\nNote\nThe valid value 'all' cannot be used with any other values.\n\n\n(string) --\n\n\n\n\n\n
:rtype: dict
Returns: Response Syntax
{
'configurationId': 'string'
}
Response Structure
(dict) --
Represents the output of a put repository triggers operation.
configurationId (string) --
The system-generated unique ID for the create or update operation.
Exceptions
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryTriggersListRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryTriggersExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerDestinationArnException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerRegionException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerCustomDataException
CodeCommit.Client.exceptions.MaximumBranchesExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerBranchNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerEventsException
CodeCommit.Client.exceptions.RepositoryTriggerNameRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerDestinationArnRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerBranchNameListRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerEventsListRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'configurationId': 'string'
}
:returns:
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryTriggersListRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryTriggersExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerDestinationArnException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerRegionException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerCustomDataException
CodeCommit.Client.exceptions.MaximumBranchesExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerBranchNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerEventsException
CodeCommit.Client.exceptions.RepositoryTriggerNameRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerDestinationArnRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerBranchNameListRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerEventsListRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
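Because put_repository_triggers replaces all triggers and the event value 'all' cannot be combined with other values, a small client-side check can catch malformed trigger dicts before the call. This validator is a sketch, not part of the boto3 API; the trigger name and ARN below are placeholders.

```python
def validate_trigger(trigger: dict) -> None:
    # Enforce the documented constraint that the event value 'all'
    # cannot be used with any other event values.
    events = trigger.get("events", [])
    if "all" in events and len(events) > 1:
        raise ValueError("'all' cannot be combined with other event values")
    # name, destinationArn, and events are marked [REQUIRED] above.
    for key in ("name", "destinationArn", "events"):
        if not trigger.get(key):
            raise ValueError(f"missing required trigger field: {key}")

validate_trigger({
    "name": "MainBranchTrigger",          # hypothetical trigger name
    "destinationArn": "arn:aws:sns:us-east-1:123456789012:MyTopic",  # placeholder ARN
    "branches": [],                       # empty list applies the trigger to all branches
    "events": ["all"],
})
```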
"""
pass
def tag_resource(resourceArn=None, tags=None):
"""
Adds or updates tags for a resource in AWS CodeCommit. For a list of valid resources in AWS CodeCommit, see CodeCommit Resources and Operations in the AWS CodeCommit User Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.tag_resource(
resourceArn='string',
tags={
'string': 'string'
}
)
:type resourceArn: string
:param resourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) of the resource to which you want to add or update tags.\n
:type tags: dict
:param tags: [REQUIRED]\nThe key-value pair to use when tagging this repository.\n\n(string) --\n(string) --\n\n\n\n
:returns:
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ResourceArnRequiredException
CodeCommit.Client.exceptions.InvalidResourceArnException
CodeCommit.Client.exceptions.TagsMapRequiredException
CodeCommit.Client.exceptions.InvalidTagsMapException
CodeCommit.Client.exceptions.TooManyTagsException
CodeCommit.Client.exceptions.InvalidSystemTagUsageException
CodeCommit.Client.exceptions.TagPolicyException
"""
pass
def test_repository_triggers(repositoryName=None, triggers=None):
"""
Tests the functionality of repository triggers by sending information to the trigger target. If real data is available in the repository, the test sends data from the last commit. If no data is available, sample data is generated.
See also: AWS API Documentation
Exceptions
:example: response = client.test_repository_triggers(
repositoryName='string',
triggers=[
{
'name': 'string',
'destinationArn': 'string',
'customData': 'string',
'branches': [
'string',
],
'events': [
'all'|'updateReference'|'createReference'|'deleteReference',
]
},
]
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository in which to test the triggers.\n
:type triggers: list
:param triggers: [REQUIRED]\nThe list of triggers to test.\n\n(dict) --Information about a trigger for a repository.\n\nname (string) -- [REQUIRED]The name of the trigger.\n\ndestinationArn (string) -- [REQUIRED]The ARN of the resource that is the target for a trigger (for example, the ARN of a topic in Amazon SNS).\n\ncustomData (string) --Any custom data associated with the trigger to be included in the information sent to the target of the trigger.\n\nbranches (list) --The branches to be included in the trigger configuration. If you specify an empty array, the trigger applies to all branches.\n\nNote\nAlthough no content is required in the array, you must include the array itself.\n\n\n(string) --\n\n\nevents (list) -- [REQUIRED]The repository events that cause the trigger to run actions in another service, such as sending a notification through Amazon SNS.\n\nNote\nThe valid value 'all' cannot be used with any other values.\n\n\n(string) --\n\n\n\n\n\n
:rtype: dict
Returns: Response Syntax
{
'successfulExecutions': [
'string',
],
'failedExecutions': [
{
'trigger': 'string',
'failureMessage': 'string'
},
]
}
Response Structure
(dict) --
Represents the output of a test repository triggers operation.
successfulExecutions (list) --
The list of triggers that were successfully tested. This list provides the names of the triggers that were successfully tested, separated by commas.
(string) --
failedExecutions (list) --
The list of triggers that were not tested. This list provides the names of the triggers that could not be tested, separated by commas.
(dict) --
A trigger failed to run.
trigger (string) --
The name of the trigger that did not run.
failureMessage (string) --
Message information about the trigger that did not run.
Exceptions
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.RepositoryTriggersListRequiredException
CodeCommit.Client.exceptions.MaximumRepositoryTriggersExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerDestinationArnException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerRegionException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerCustomDataException
CodeCommit.Client.exceptions.MaximumBranchesExceededException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerBranchNameException
CodeCommit.Client.exceptions.InvalidRepositoryTriggerEventsException
CodeCommit.Client.exceptions.RepositoryTriggerNameRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerDestinationArnRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerBranchNameListRequiredException
CodeCommit.Client.exceptions.RepositoryTriggerEventsListRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'successfulExecutions': [
'string',
],
'failedExecutions': [
{
'trigger': 'string',
'failureMessage': 'string'
},
]
}
:returns:
(string) --
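A short sketch of inspecting the response shape documented above, using a hand-built sample response; the trigger names and failure message are invented for illustration.

```python
def summarize_trigger_test(response: dict) -> str:
    # Collapse the documented successfulExecutions/failedExecutions
    # shape into a short human-readable summary.
    ok = response.get("successfulExecutions", [])
    failed = response.get("failedExecutions", [])
    lines = [f"{len(ok)} succeeded, {len(failed)} failed"]
    for failure in failed:
        lines.append(f"  {failure['trigger']}: {failure['failureMessage']}")
    return "\n".join(lines)

sample = {
    "successfulExecutions": ["MainBranchTrigger"],
    "failedExecutions": [
        {"trigger": "TagTrigger", "failureMessage": "topic not found"}
    ],
}
summary = summarize_trigger_test(sample)
```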
"""
pass
def untag_resource(resourceArn=None, tagKeys=None):
"""
Removes tags for a resource in AWS CodeCommit. For a list of valid resources in AWS CodeCommit, see CodeCommit Resources and Operations in the AWS CodeCommit User Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.untag_resource(
resourceArn='string',
tagKeys=[
'string',
]
)
:type resourceArn: string
:param resourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) of the resource to which you want to remove tags.\n
:type tagKeys: list
:param tagKeys: [REQUIRED]\nThe tag key for each tag that you want to remove from the resource.\n\n(string) --\n\n
:returns:
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.ResourceArnRequiredException
CodeCommit.Client.exceptions.InvalidResourceArnException
CodeCommit.Client.exceptions.TagKeysListRequiredException
CodeCommit.Client.exceptions.InvalidTagKeysListException
CodeCommit.Client.exceptions.TooManyTagsException
CodeCommit.Client.exceptions.InvalidSystemTagUsageException
CodeCommit.Client.exceptions.TagPolicyException
"""
pass
def update_approval_rule_template_content(approvalRuleTemplateName=None, newRuleContent=None, existingRuleContentSha256=None):
"""
Updates the content of an approval rule template. You can change the number of required approvals, the membership of the approval rule, and whether an approval pool is defined.
See also: AWS API Documentation
Exceptions
:example: response = client.update_approval_rule_template_content(
approvalRuleTemplateName='string',
newRuleContent='string',
existingRuleContentSha256='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the approval rule template where you want to update the content of the rule.\n
:type newRuleContent: string
:param newRuleContent: [REQUIRED]\nThe content that replaces the existing content of the rule. Content statements must be complete. You cannot provide only the changes.\n
:type existingRuleContentSha256: string
:param existingRuleContentSha256: The SHA-256 hash signature for the content of the approval rule. You can retrieve this information by using GetPullRequest .
:rtype: dict
Returns: Response Syntax
{
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
Response Structure
(dict) --
approvalRuleTemplate (dict) --
Returns information about an approval rule template.
approvalRuleTemplateId (string) --
The system-generated ID of the approval rule template.
approvalRuleTemplateName (string) --
The name of the approval rule template.
approvalRuleTemplateDescription (string) --
The description of the approval rule template.
approvalRuleTemplateContent (string) --
The content of the approval rule template.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule template.
lastModifiedDate (datetime) --
The date the approval rule template was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule template was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule template.
Exceptions
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateContentException
CodeCommit.Client.exceptions.InvalidRuleContentSha256Exception
CodeCommit.Client.exceptions.ApprovalRuleTemplateContentRequiredException
:return: {
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateContentException
CodeCommit.Client.exceptions.InvalidRuleContentSha256Exception
CodeCommit.Client.exceptions.ApprovalRuleTemplateContentRequiredException
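The existingRuleContentSha256 parameter is described above as the SHA-256 hash signature of the rule content. Assuming the hash is computed over the UTF-8 bytes of the content string (an assumption; this reference does not specify any normalization), a local digest can be computed like this:

```python
import hashlib

def rule_content_sha256(rule_content: str) -> str:
    # SHA-256 over the UTF-8 encoding of the rule content string.
    # Whether CodeCommit normalizes the content before hashing is
    # not documented here, so treat this as an approximation.
    return hashlib.sha256(rule_content.encode("utf-8")).hexdigest()

digest = rule_content_sha256('{"Version": "2018-11-08"}')  # hypothetical rule content
```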
"""
pass
def update_approval_rule_template_description(approvalRuleTemplateName=None, approvalRuleTemplateDescription=None):
"""
Updates the description for a specified approval rule template.
See also: AWS API Documentation
Exceptions
:example: response = client.update_approval_rule_template_description(
approvalRuleTemplateName='string',
approvalRuleTemplateDescription='string'
)
:type approvalRuleTemplateName: string
:param approvalRuleTemplateName: [REQUIRED]\nThe name of the template for which you want to update the description.\n
:type approvalRuleTemplateDescription: string
:param approvalRuleTemplateDescription: [REQUIRED]\nThe updated description of the approval rule template.\n
:rtype: dict
Returns: Response Syntax
{
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
Response Structure
(dict) --
approvalRuleTemplate (dict) --
The structure and content of the updated approval rule template.
approvalRuleTemplateId (string) --
The system-generated ID of the approval rule template.
approvalRuleTemplateName (string) --
The name of the approval rule template.
approvalRuleTemplateDescription (string) --
The description of the approval rule template.
approvalRuleTemplateContent (string) --
The content of the approval rule template.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule template.
lastModifiedDate (datetime) --
The date the approval rule template was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule template was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule template.
Exceptions
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateDescriptionException
:return: {
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateDescriptionException
"""
pass
def update_approval_rule_template_name(oldApprovalRuleTemplateName=None, newApprovalRuleTemplateName=None):
"""
Updates the name of a specified approval rule template.
See also: AWS API Documentation
Exceptions
:example: response = client.update_approval_rule_template_name(
oldApprovalRuleTemplateName='string',
newApprovalRuleTemplateName='string'
)
:type oldApprovalRuleTemplateName: string
:param oldApprovalRuleTemplateName: [REQUIRED]\nThe current name of the approval rule template.\n
:type newApprovalRuleTemplateName: string
:param newApprovalRuleTemplateName: [REQUIRED]\nThe new name you want to apply to the approval rule template.\n
:rtype: dict
Response Syntax
{
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
Response Structure
(dict) --
approvalRuleTemplate (dict) --
The structure and content of the updated approval rule template.
approvalRuleTemplateId (string) --
The system-generated ID of the approval rule template.
approvalRuleTemplateName (string) --
The name of the approval rule template.
approvalRuleTemplateDescription (string) --
The description of the approval rule template.
approvalRuleTemplateContent (string) --
The content of the approval rule template.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule template.
lastModifiedDate (datetime) --
The date the approval rule template was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule template was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule template.
Exceptions
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameAlreadyExistsException
:return: {
'approvalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string',
'approvalRuleTemplateDescription': 'string',
'approvalRuleTemplateContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.InvalidApprovalRuleTemplateNameException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameRequiredException
CodeCommit.Client.exceptions.ApprovalRuleTemplateDoesNotExistException
CodeCommit.Client.exceptions.ApprovalRuleTemplateNameAlreadyExistsException
"""
pass
def update_comment(commentId=None, content=None):
"""
Replaces the contents of a comment.
See also: AWS API Documentation
Exceptions
:example: response = client.update_comment(
commentId='string',
content='string'
)
:type commentId: string
:param commentId: [REQUIRED]\nThe system-generated ID of the comment you want to update. To get this ID, use GetCommentsForComparedCommit or GetCommentsForPullRequest.\n
:type content: string
:param content: [REQUIRED]\nThe updated content to replace the existing content of the comment.\n
:rtype: dict
Response Syntax
{
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
Response Structure
(dict) --
comment (dict) --
Information about the updated comment.
commentId (string) --
The system-generated comment ID.
content (string) --
The content of the comment.
inReplyTo (string) --
The ID of the comment for which this comment is a reply, if any.
creationDate (datetime) --
The date and time the comment was created, in timestamp format.
lastModifiedDate (datetime) --
The date and time the comment was most recently modified, in timestamp format.
authorArn (string) --
The Amazon Resource Name (ARN) of the person who posted the comment.
deleted (boolean) --
A Boolean value indicating whether the comment has been deleted.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
Exceptions
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
CodeCommit.Client.exceptions.CommentNotCreatedByCallerException
CodeCommit.Client.exceptions.CommentDeletedException
:return: {
'comment': {
'commentId': 'string',
'content': 'string',
'inReplyTo': 'string',
'creationDate': datetime(2015, 1, 1),
'lastModifiedDate': datetime(2015, 1, 1),
'authorArn': 'string',
'deleted': True|False,
'clientRequestToken': 'string'
}
}
:returns:
CodeCommit.Client.exceptions.CommentContentRequiredException
CodeCommit.Client.exceptions.CommentContentSizeLimitExceededException
CodeCommit.Client.exceptions.CommentDoesNotExistException
CodeCommit.Client.exceptions.CommentIdRequiredException
CodeCommit.Client.exceptions.InvalidCommentIdException
CodeCommit.Client.exceptions.CommentNotCreatedByCallerException
CodeCommit.Client.exceptions.CommentDeletedException
"""
pass
def update_default_branch(repositoryName=None, defaultBranchName=None):
"""
Sets or changes the default branch name for the specified repository.
See also: AWS API Documentation
Exceptions
:example: response = client.update_default_branch(
repositoryName='string',
defaultBranchName='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository to set or change the default branch for.\n
:type defaultBranchName: string
:param defaultBranchName: [REQUIRED]\nThe name of the branch to set as the default.\n
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.BranchNameRequiredException
CodeCommit.Client.exceptions.InvalidBranchNameException
CodeCommit.Client.exceptions.BranchDoesNotExistException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def update_pull_request_approval_rule_content(pullRequestId=None, approvalRuleName=None, existingRuleContentSha256=None, newRuleContent=None):
"""
Updates the structure of an approval rule created specifically for a pull request. For example, you can change the number of required approvers and the approval pool for approvers.
See also: AWS API Documentation
Exceptions
:example: response = client.update_pull_request_approval_rule_content(
pullRequestId='string',
approvalRuleName='string',
existingRuleContentSha256='string',
newRuleContent='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request.\n
:type approvalRuleName: string
:param approvalRuleName: [REQUIRED]\nThe name of the approval rule you want to update.\n
:type existingRuleContentSha256: string
:param existingRuleContentSha256: The SHA-256 hash signature for the content of the approval rule. You can retrieve this information by using GetPullRequest.
:type newRuleContent: string
:param newRuleContent: [REQUIRED]\nThe updated content for the approval rule.\n\nNote\nWhen you update the content of the approval rule, you can specify approvers in an approval pool in one of two ways:\n\nCodeCommitApprovers : This option only requires an AWS account and a resource. It can be used for both IAM users and federated access users whose name matches the provided resource name. This is a very powerful option that offers a great deal of flexibility. For example, if you specify the AWS account 123456789012 and Mary_Major, all of the following are counted as approvals coming from that user:\nAn IAM user in the account (arn:aws:iam::123456789012:user/Mary_Major)\nA federated user identified in IAM as Mary_Major (arn:aws:sts::123456789012:federated-user/Mary_Major)\n\nThis option does not recognize an active session of someone assuming the role of CodeCommitReview with a role session name of Mary_Major (arn:aws:sts::123456789012:assumed-role/CodeCommitReview/Mary_Major) unless you include a wildcard (*Mary_Major).\n\nFully qualified ARN : This option allows you to specify the fully qualified Amazon Resource Name (ARN) of the IAM user or role.\n\nFor more information about IAM ARNs, wildcards, and formats, see IAM Identifiers in the IAM User Guide.\n
:rtype: dict
Response Syntax
{
'approvalRule': {
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
}
}
Response Structure
(dict) --
approvalRule (dict) --
Information about the updated approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.ApprovalRuleDoesNotExistException
CodeCommit.Client.exceptions.InvalidRuleContentSha256Exception
CodeCommit.Client.exceptions.ApprovalRuleContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleContentException
CodeCommit.Client.exceptions.CannotModifyApprovalRuleFromTemplateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'approvalRule': {
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.ApprovalRuleNameRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleNameException
CodeCommit.Client.exceptions.ApprovalRuleDoesNotExistException
CodeCommit.Client.exceptions.InvalidRuleContentSha256Exception
CodeCommit.Client.exceptions.ApprovalRuleContentRequiredException
CodeCommit.Client.exceptions.InvalidApprovalRuleContentException
CodeCommit.Client.exceptions.CannotModifyApprovalRuleFromTemplateException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
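The note on newRuleContent above describes two ways to name approvers in the approval pool. A minimal sketch of rule content mixing both forms (the account ID and user names are made-up examples; the pool-member syntax follows the rule language documented by AWS):

```python
import json

# Illustrative newRuleContent mixing the two documented approver forms.
rule = {
    "Version": "2018-11-08",
    "Statements": [{
        "Type": "Approvers",
        "NumberOfApprovalsNeeded": 1,
        "ApprovalPoolMembers": [
            # CodeCommitApprovers shorthand: AWS account plus resource name.
            "CodeCommitApprovers:123456789012:Mary_Major",
            # Fully qualified ARN of an IAM user.
            "arn:aws:iam::123456789012:user/Li_Juan",
        ],
    }],
}
new_rule_content = json.dumps(rule)
```

The string would be passed as newRuleContent, with existingRuleContentSha256 (from GetPullRequest) providing an optimistic-concurrency check.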
def update_pull_request_approval_state(pullRequestId=None, revisionId=None, approvalState=None):
"""
Updates the state of a user's approval on a pull request. The user is derived from the signed-in account when the request is made.
See also: AWS API Documentation
Exceptions
:example: response = client.update_pull_request_approval_state(
pullRequestId='string',
revisionId='string',
approvalState='APPROVE'|'REVOKE'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request.\n
:type revisionId: string
:param revisionId: [REQUIRED]\nThe system-generated ID of the revision.\n
:type approvalState: string
:param approvalState: [REQUIRED]\nThe approval state to associate with the user on the pull request.\n
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidRevisionIdException
CodeCommit.Client.exceptions.RevisionIdRequiredException
CodeCommit.Client.exceptions.InvalidApprovalStateException
CodeCommit.Client.exceptions.ApprovalStateRequiredException
CodeCommit.Client.exceptions.PullRequestCannotBeApprovedByAuthorException
CodeCommit.Client.exceptions.RevisionNotCurrentException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
CodeCommit.Client.exceptions.MaximumNumberOfApprovalsExceededException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def update_pull_request_description(pullRequestId=None, description=None):
"""
Replaces the contents of the description of a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.update_pull_request_description(
pullRequestId='string',
description='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:type description: string
:param description: [REQUIRED]\nThe updated content of the description for the pull request. This content replaces the existing description.\n
:rtype: dict
Response Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the updated pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidDescriptionException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidDescriptionException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
"""
pass
def update_pull_request_status(pullRequestId=None, pullRequestStatus=None):
"""
Updates the status of a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.update_pull_request_status(
pullRequestId='string',
pullRequestStatus='OPEN'|'CLOSED'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:type pullRequestStatus: string
:param pullRequestStatus: [REQUIRED]\nThe status of the pull request. The only valid operations are to update the status from OPEN to OPEN, from OPEN to CLOSED, or from CLOSED to CLOSED.\n
:rtype: dict
ReturnsResponse Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestStatusUpdateException
CodeCommit.Client.exceptions.InvalidPullRequestStatusException
CodeCommit.Client.exceptions.PullRequestStatusRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.InvalidPullRequestStatusUpdateException
CodeCommit.Client.exceptions.InvalidPullRequestStatusException
CodeCommit.Client.exceptions.PullRequestStatusRequiredException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
def update_pull_request_title(pullRequestId=None, title=None):
"""
Replaces the title of a pull request.
See also: AWS API Documentation
Exceptions
:example: response = client.update_pull_request_title(
pullRequestId='string',
title='string'
)
:type pullRequestId: string
:param pullRequestId: [REQUIRED]\nThe system-generated ID of the pull request. To get this ID, use ListPullRequests.\n
:type title: string
:param title: [REQUIRED]\nThe updated title of the pull request. This replaces the existing title.\n
:rtype: dict
ReturnsResponse Syntax
{
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
Response Structure
(dict) --
pullRequest (dict) --
Information about the updated pull request.
pullRequestId (string) --
The system-generated ID of the pull request.
title (string) --
The user-defined title of the pull request. This title is displayed in the list of pull requests to other repository users.
description (string) --
The user-defined description of the pull request. This description can be used to clarify what should be reviewed and other details of the request.
lastActivityDate (datetime) --
The day and time of the last user or system activity on the pull request, in timestamp format.
creationDate (datetime) --
The date and time the pull request was originally created, in timestamp format.
pullRequestStatus (string) --
The status of the pull request. Pull request status can only change from OPEN to CLOSED.
authorArn (string) --
The Amazon Resource Name (ARN) of the user who created the pull request.
pullRequestTargets (list) --
The targets of the pull request, including the source branch and destination branch for the pull request.
(dict) --
Returns information about a pull request target.
repositoryName (string) --
The name of the repository that contains the pull request source and destination branches.
sourceReference (string) --
The branch of the repository that contains the changes for the pull request. Also known as the source branch.
destinationReference (string) --
The branch of the repository where the pull request changes are merged. Also known as the destination branch.
destinationCommit (string) --
The full commit ID that is the tip of the destination branch. This is the commit where the pull request was or will be merged.
sourceCommit (string) --
The full commit ID of the tip of the source branch used to create the pull request. If the pull request branch is updated by a push while the pull request is open, the commit ID changes to reflect the new tip of the branch.
mergeBase (string) --
The commit ID of the most recent commit that the source branch and the destination branch have in common.
mergeMetadata (dict) --
Returns metadata about the state of the merge, including whether the merge has been made.
isMerged (boolean) --
A Boolean value indicating whether the merge has been made.
mergedBy (string) --
The Amazon Resource Name (ARN) of the user who merged the branches.
mergeCommitId (string) --
The commit ID for the merge commit, if any.
mergeOption (string) --
The merge strategy used in the merge.
clientRequestToken (string) --
A unique, client-generated idempotency token that, when provided in a request, ensures the request cannot be repeated with a changed parameter. If a request is received with the same parameters and a token is included, the request returns information about the initial request that used that token.
revisionId (string) --
The system-generated revision ID for the pull request.
approvalRules (list) --
The approval rules applied to the pull request.
(dict) --
Returns information about an approval rule.
approvalRuleId (string) --
The system-generated ID of the approval rule.
approvalRuleName (string) --
The name of the approval rule.
approvalRuleContent (string) --
The content of the approval rule.
ruleContentSha256 (string) --
The SHA-256 hash signature for the content of the approval rule.
lastModifiedDate (datetime) --
The date the approval rule was most recently changed, in timestamp format.
creationDate (datetime) --
The date the approval rule was created, in timestamp format.
lastModifiedUser (string) --
The Amazon Resource Name (ARN) of the user who made the most recent changes to the approval rule.
originApprovalRuleTemplate (dict) --
The approval rule template used to create the rule.
approvalRuleTemplateId (string) --
The ID of the template that created the approval rule.
approvalRuleTemplateName (string) --
The name of the template that created the approval rule.
Exceptions
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.TitleRequiredException
CodeCommit.Client.exceptions.InvalidTitleException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
:return: {
'pullRequest': {
'pullRequestId': 'string',
'title': 'string',
'description': 'string',
'lastActivityDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'pullRequestStatus': 'OPEN'|'CLOSED',
'authorArn': 'string',
'pullRequestTargets': [
{
'repositoryName': 'string',
'sourceReference': 'string',
'destinationReference': 'string',
'destinationCommit': 'string',
'sourceCommit': 'string',
'mergeBase': 'string',
'mergeMetadata': {
'isMerged': True|False,
'mergedBy': 'string',
'mergeCommitId': 'string',
'mergeOption': 'FAST_FORWARD_MERGE'|'SQUASH_MERGE'|'THREE_WAY_MERGE'
}
},
],
'clientRequestToken': 'string',
'revisionId': 'string',
'approvalRules': [
{
'approvalRuleId': 'string',
'approvalRuleName': 'string',
'approvalRuleContent': 'string',
'ruleContentSha256': 'string',
'lastModifiedDate': datetime(2015, 1, 1),
'creationDate': datetime(2015, 1, 1),
'lastModifiedUser': 'string',
'originApprovalRuleTemplate': {
'approvalRuleTemplateId': 'string',
'approvalRuleTemplateName': 'string'
}
},
]
}
}
:returns:
CodeCommit.Client.exceptions.PullRequestDoesNotExistException
CodeCommit.Client.exceptions.InvalidPullRequestIdException
CodeCommit.Client.exceptions.PullRequestIdRequiredException
CodeCommit.Client.exceptions.TitleRequiredException
CodeCommit.Client.exceptions.InvalidTitleException
CodeCommit.Client.exceptions.PullRequestAlreadyClosedException
"""
pass
def update_repository_description(repositoryName=None, repositoryDescription=None):
"""
Sets or changes the comment or description for a repository.
See also: AWS API Documentation
Exceptions
:example: response = client.update_repository_description(
repositoryName='string',
repositoryDescription='string'
)
:type repositoryName: string
:param repositoryName: [REQUIRED]\nThe name of the repository to set or change the comment or description for.\n
:type repositoryDescription: string
:param repositoryDescription: The new comment or description for the specified repository. Repository descriptions are limited to 1,000 characters.
:returns:
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
CodeCommit.Client.exceptions.InvalidRepositoryDescriptionException
CodeCommit.Client.exceptions.EncryptionIntegrityChecksFailedException
CodeCommit.Client.exceptions.EncryptionKeyAccessDeniedException
CodeCommit.Client.exceptions.EncryptionKeyDisabledException
CodeCommit.Client.exceptions.EncryptionKeyNotFoundException
CodeCommit.Client.exceptions.EncryptionKeyUnavailableException
"""
pass
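# The parameter note above caps repository descriptions at 1,000 characters.
# A small client-side guard (a sketch, not part of the API) can clamp the
# value before calling update_repository_description, avoiding an
# InvalidRepositoryDescriptionException round trip for over-long strings.

```python
MAX_REPO_DESCRIPTION = 1000  # documented CodeCommit limit

def clamp_description(description):
    """Truncate a repository description to the documented 1,000-char limit."""
    if description is None:
        # repositoryDescription is optional; pass None through unchanged.
        return None
    return description[:MAX_REPO_DESCRIPTION]
```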
def update_repository_name(oldName=None, newName=None):
"""
Renames a repository. The repository name must be unique across the calling AWS account. Repository names are limited to 100 alphanumeric, dash, and underscore characters, and cannot include certain characters. The suffix .git is prohibited. For more information about the limits on repository names, see Limits in the AWS CodeCommit User Guide.
See also: AWS API Documentation
Exceptions
:example: response = client.update_repository_name(
oldName='string',
newName='string'
)
:type oldName: string
:param oldName: [REQUIRED]\nThe current name of the repository.\n
:type newName: string
:param newName: [REQUIRED]\nThe new name for the repository.\n
:returns:
CodeCommit.Client.exceptions.RepositoryDoesNotExistException
CodeCommit.Client.exceptions.RepositoryNameExistsException
CodeCommit.Client.exceptions.RepositoryNameRequiredException
CodeCommit.Client.exceptions.InvalidRepositoryNameException
"""
pass
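# A pre-flight check based only on the naming rules stated in the docstring
# above (at most 100 alphanumeric, dash, and underscore characters; the .git
# suffix prohibited). This is a hedged sketch: the service enforces additional
# constraints it does not enumerate here, so a passing name can still be
# rejected server-side.

```python
import re

_REPO_NAME_RE = re.compile(r'^[A-Za-z0-9_-]{1,100}$')

def is_plausible_repository_name(name):
    """Cheap local check for the repository-name rules quoted above."""
    if name.endswith('.git'):  # the .git suffix is explicitly prohibited
        return False
    return bool(_REPO_NAME_RE.match(name))
```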
| 37.051582 | 1,571 | 0.729415 | 40,272 | 405,122 | 7.317342 | 0.02945 | 0.09708 | 0.157755 | 0.006556 | 0.923586 | 0.907837 | 0.894528 | 0.879725 | 0.868008 | 0.861777 | 0 | 0.004906 | 0.19757 | 405,122 | 10,933 | 1,572 | 37.054971 | 0.901585 | 0.972976 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 10 |
# benchmarks/SimResults/_bigLittle_hrrs_spec_tugberk_heteroFair/cmp_tonto/power.py
# (repo: TugberkArkose/MLScheduler, license: Unlicense)
] | null | null | null | power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.18436,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.347493,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.846305,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.675572,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 1.16985,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.67094,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 2.51636,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.538025,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 7.56174,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.159885,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.02449,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.252185,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.181119,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.41207,
'Execution Unit/Register Files/Runtime Dynamic': 0.205609,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.65989,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 1.67218,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 5.14702,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00309542,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00309542,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00268775,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.0010359,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00260178,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.0114804,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0299773,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.174114,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 6.43323,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.490841,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.591369,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.96874,
'Instruction Fetch Unit/Runtime Dynamic': 1.29778,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0433456,
'L2/Runtime Dynamic': 0.00926367,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 7.60604,
'Load Store Unit/Data Cache/Runtime Dynamic': 3.07636,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.20605,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.20605,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 8.58301,
'Load Store Unit/Runtime Dynamic': 4.29857,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.508083,
'Load Store Unit/StoreQ/Runtime Dynamic': 1.01617,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.18032,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.180735,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.399995,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0811651,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.870233,
'Memory Management Unit/Runtime Dynamic': 0.2619,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 30.5888,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.557803,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0412572,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.343812,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.942872,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 11.9574,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0817771,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.26692,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.375543,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.242988,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.391931,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.197834,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.832753,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.220332,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.95366,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0709482,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.010192,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.107004,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0753762,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.177952,
'Execution Unit/Register Files/Runtime Dynamic': 0.0855682,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.245876,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.617821,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 2.20844,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00138866,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00138866,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00125056,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000506553,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00108279,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00511066,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0118483,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0724611,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.60915,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.204401,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.24611,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 7.05135,
'Instruction Fetch Unit/Runtime Dynamic': 0.539932,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0185047,
'L2/Runtime Dynamic': 0.0039915,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.89314,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.28327,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0859286,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0859286,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 4.29892,
'Load Store Unit/Runtime Dynamic': 1.79297,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.211885,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.42377,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0751988,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0753781,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.28658,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0338002,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.571867,
'Memory Management Unit/Runtime Dynamic': 0.109178,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 20.4838,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.186632,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0132342,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.121104,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.32097,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.97548,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.078721,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.264519,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.361795,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.243305,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.392443,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.198092,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.83384,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.222802,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.93269,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0683509,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0102053,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.105844,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0754746,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.174194,
'Execution Unit/Register Files/Runtime Dynamic': 0.0856799,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.242667,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.611647,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 2.20106,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00145074,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00145074,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00130694,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000529647,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.0010842,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00529262,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0123608,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0725557,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.61516,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.205219,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.246432,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 7.05766,
'Instruction Fetch Unit/Runtime Dynamic': 0.54186,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.018408,
'L2/Runtime Dynamic': 0.00388542,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.88173,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.27705,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0855594,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0855595,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 4.28576,
'Load Store Unit/Runtime Dynamic': 1.78456,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.210975,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.42195,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0748756,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0750482,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.286954,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0339504,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.571686,
'Memory Management Unit/Runtime Dynamic': 0.108999,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 20.4557,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.179799,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0131654,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.121393,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.314358,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.95473,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0794153,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.265065,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.36517,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.243519,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.392787,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.198266,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.834571,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.222529,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.93832,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0689885,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0102143,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.106184,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0755408,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.175172,
'Execution Unit/Register Files/Runtime Dynamic': 0.0857551,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.243558,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.615481,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 2.20625,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00142372,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00142372,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00128308,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000520229,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00108515,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00521567,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0121135,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0726193,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.61921,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.204341,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.246648,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 7.06191,
'Instruction Fetch Unit/Runtime Dynamic': 0.540937,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0189068,
'L2/Runtime Dynamic': 0.00402762,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.90081,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.28687,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0861767,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0861768,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 4.30775,
'Load Store Unit/Runtime Dynamic': 1.79804,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.212497,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.424994,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0754159,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0755978,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.287206,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0338009,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.572866,
'Memory Management Unit/Runtime Dynamic': 0.109399,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 20.4892,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.181477,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0131954,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.121511,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.316184,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.97483,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 0.04072041446687615,
'Runtime Dynamic': 0.04072041446687615,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.0359732,
'Runtime Dynamic': 0.0214858,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 92.0534,
'Peak Power': 125.166,
'Runtime Dynamic': 26.8839,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 92.0174,
'Total Cores/Runtime Dynamic': 26.8625,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.0359732,
'Total L3s/Runtime Dynamic': 0.0214858,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}} | 75.031729 | 124 | 0.681943 | 8,082 | 68,579 | 5.780624 | 0.067681 | 0.123633 | 0.113016 | 0.093495 | 0.93981 | 0.931206 | 0.917849 | 0.886834 | 0.862219 | 0.84304 | 0 | 0.131458 | 0.224427 | 68,579 | 914 | 125 | 75.031729 | 0.746917 | 0 | 0 | 0.642232 | 0 | 0 | 0.657699 | 0.048119 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c4fa704bc8b34432ee673a04ac8fc43901d71f82 | 30,486 | py | Python | src/config/api-server/vnc_cfg_api_server/tests/resources/test_logical_router.py | EWERK-DIGITAL/tf-controller | 311ea863b03d425a67d04d27c1f1b9cf1e20c926 | [
"Apache-2.0"
] | 37 | 2020-09-21T10:42:26.000Z | 2022-01-09T10:16:40.000Z | src/config/api-server/vnc_cfg_api_server/tests/resources/test_logical_router.py | EWERK-DIGITAL/tf-controller | 311ea863b03d425a67d04d27c1f1b9cf1e20c926 | [
"Apache-2.0"
] | null | null | null | src/config/api-server/vnc_cfg_api_server/tests/resources/test_logical_router.py | EWERK-DIGITAL/tf-controller | 311ea863b03d425a67d04d27c1f1b9cf1e20c926 | [
"Apache-2.0"
] | 21 | 2020-08-25T12:48:42.000Z | 2022-03-22T04:32:18.000Z | #
# Copyright (c) 2013,2014 Juniper Networks, Inc. All rights reserved.
#
from builtins import str
import logging
import uuid
from cfgm_common import get_lr_internal_vn_name
from netaddr import IPNetwork
from testtools import ExpectedException
from vnc_api.exceptions import BadRequest
from vnc_api.exceptions import NoIdError
from vnc_api.exceptions import RefsExistError
from vnc_api.gen.resource_client import Domain
from vnc_api.gen.resource_client import InstanceIp
from vnc_api.gen.resource_client import LogicalRouter
from vnc_api.gen.resource_client import NetworkIpam
from vnc_api.gen.resource_client import Project
from vnc_api.gen.resource_client import RouteTable
from vnc_api.gen.resource_client import VirtualMachine
from vnc_api.gen.resource_client import VirtualMachineInterface
from vnc_api.gen.resource_client import VirtualNetwork
from vnc_api.gen.resource_xsd import IdPermsType
from vnc_api.gen.resource_xsd import IpamSubnetType
from vnc_api.gen.resource_xsd import IpamType
from vnc_api.gen.resource_xsd import RouteTableType
from vnc_api.gen.resource_xsd import RouteType
from vnc_api.gen.resource_xsd import SubnetType
from vnc_api.gen.resource_xsd import VnSubnetsType
from vnc_cfg_api_server.tests import test_case
logger = logging.getLogger(__name__)
class TestLogicalRouter(test_case.ApiServerTestCase):
@classmethod
def setUpClass(cls, *args, **kwargs):
cls.console_handler = logging.StreamHandler()
cls.console_handler.setLevel(logging.DEBUG)
logger.addHandler(cls.console_handler)
super(TestLogicalRouter, cls).setUpClass(*args, **kwargs)
@classmethod
def tearDownClass(cls, *args, **kwargs):
logger.removeHandler(cls.console_handler)
super(TestLogicalRouter, cls).tearDownClass(*args, **kwargs)
def test_lr_v4_subnets(self):
# Create Domain
domain = Domain('my-lr-domain')
self._vnc_lib.domain_create(domain)
# Create Project
project = Project('my-lr-proj', domain)
self._vnc_lib.project_create(project)
# Create NetworkIpam
ipam = NetworkIpam('default-network-ipam', project, IpamType("dhcp"))
self._vnc_lib.network_ipam_create(ipam)
ipam = self._vnc_lib.network_ipam_read(
['my-lr-domain', 'my-lr-proj', 'default-network-ipam'])
# Create subnets
ipam_sn_v4_vn1 = IpamSubnetType(subnet=SubnetType('11.1.1.0', 24))
ipam_sn_v6_vn1 = IpamSubnetType(subnet=SubnetType('fd11::', 120))
ipam_sn_v4_vn2 = IpamSubnetType(subnet=SubnetType('11.1.2.0', 24))
ipam_sn_v6_vn2 = IpamSubnetType(subnet=SubnetType('fd12::', 120))
# Create VN my-vn-1
vn1 = VirtualNetwork('my-vn-1', project)
vn1.add_network_ipam(ipam,
VnSubnetsType([ipam_sn_v4_vn1, ipam_sn_v6_vn1]))
self._vnc_lib.virtual_network_create(vn1)
net_obj1 = self._vnc_lib.virtual_network_read(id=vn1.uuid)
# Create VN my-vn-2
vn2 = VirtualNetwork('my-vn-2', project)
vn2.add_network_ipam(ipam,
VnSubnetsType([ipam_sn_v4_vn2, ipam_sn_v6_vn2]))
self._vnc_lib.virtual_network_create(vn2)
net_obj2 = self._vnc_lib.virtual_network_read(id=vn2.uuid)
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr_uuid = self._vnc_lib.logical_router_create(lr)
# Create a Virtual Machine Interface belonging to my-vn-1
id_perms = IdPermsType(enable=True)
port_obj1 = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
port_obj1.uuid = port_obj1.name
port_obj1.set_virtual_network(vn1)
port_obj1.set_virtual_machine_interface_device_owner(
'DEVICE_OWNER_ROUTER_INTF')
# Assign gateway ip
ipam_refs = net_obj1.get_network_ipam_refs()
for ipam_ref in ipam_refs:
subnets = ipam_ref['attr'].get_ipam_subnets()
for subnet in subnets:
cidr = '%s/%s' % (subnet.subnet.get_ip_prefix(),
subnet.subnet.get_ip_prefix_len())
if IPNetwork(cidr).version == 4:
gateway_ip = subnet.get_default_gateway()
self._vnc_lib.virtual_machine_interface_create(port_obj1)
# Create v4 Ip object
ip_obj1 = InstanceIp(name=str(uuid.uuid4()),
instance_ip_address=gateway_ip,
instance_ip_family='v4')
ip_obj1.uuid = ip_obj1.name
ip_obj1.set_virtual_machine_interface(port_obj1)
ip_obj1.set_virtual_network(net_obj1)
ip_id1 = self._vnc_lib.instance_ip_create(ip_obj1)
# Add Router Interface (test being subnet)
lr.add_virtual_machine_interface(port_obj1)
self._vnc_lib.logical_router_update(lr)
# Create a Virtual Machine Interface belonging to my-vn-2
port_obj2 = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
port_obj2.uuid = port_obj2.name
port_obj2.set_virtual_network(vn2)
port_obj2.set_virtual_machine_interface_device_owner(
'DEVICE_OWNER_ROUTER_INTF')
# Assign gateway ip
ipam_refs = net_obj2.get_network_ipam_refs()
for ipam_ref in ipam_refs:
subnets = ipam_ref['attr'].get_ipam_subnets()
for subnet in subnets:
cidr = '%s/%s' % (subnet.subnet.get_ip_prefix(),
subnet.subnet.get_ip_prefix_len())
if IPNetwork(cidr).version == 4:
gateway_ip = subnet.get_default_gateway()
self._vnc_lib.virtual_machine_interface_create(port_obj2)
# Create v4 Ip object
ip_obj2 = InstanceIp(name=str(uuid.uuid4()),
instance_ip_address=gateway_ip,
instance_ip_family='v4')
ip_obj2.uuid = ip_obj2.name
ip_obj2.set_virtual_machine_interface(port_obj2)
ip_obj2.set_virtual_network(net_obj2)
ip_id2 = self._vnc_lib.instance_ip_create(ip_obj2)
# Add Router Interface (test being subnet)
lr.add_virtual_machine_interface(port_obj2)
self._vnc_lib.logical_router_update(lr)
# TODO: Schema transformer not integrated in the tests,
# hence route-target refs not set yet
# Verify Route Target Creation
rt_refs = lr.get_route_target_refs()
for rt_ref in rt_refs or []:
rt_obj = self._vnc_lib.route_target_read(id=rt_ref['uuid'])
ri_refs = rt_obj.get_routing_instance_back_refs()
for ri_ref in ri_refs:
ri_obj = self._vnc_lib.routing_instance_read(id=ri_ref['uuid'])
ri_name = ri_obj.get_display_name()
if ri_name != 'my-vn-1' and ri_name != 'my-vn-2':
pass
# cleanup
self._vnc_lib.instance_ip_delete(id=ip_id1)
self._vnc_lib.instance_ip_delete(id=ip_id2)
self._vnc_lib.logical_router_delete(id=lr_uuid)
self._vnc_lib.virtual_machine_interface_delete(id=port_obj1.uuid)
self._vnc_lib.virtual_machine_interface_delete(id=port_obj2.uuid)
self._vnc_lib.virtual_network_delete(id=vn1.uuid)
self._vnc_lib.virtual_network_delete(id=vn2.uuid)
self._vnc_lib.network_ipam_delete(id=ipam.uuid)
self._vnc_lib.project_delete(id=project.uuid)
self._vnc_lib.domain_delete(id=domain.uuid)
def test_lr_v6_subnets(self):
# Create Domain
domain = Domain('my-lr-domain')
self._vnc_lib.domain_create(domain)
# Create Project
project = Project('my-lr-proj', domain)
self._vnc_lib.project_create(project)
# Create NetworkIpam
ipam = NetworkIpam('default-network-ipam', project, IpamType("dhcp"))
self._vnc_lib.network_ipam_create(ipam)
ipam = self._vnc_lib.network_ipam_read(
['my-lr-domain', 'my-lr-proj', 'default-network-ipam'])
# Create subnets
ipam_sn_v4_vn1 = IpamSubnetType(subnet=SubnetType('11.1.1.0', 24))
ipam_sn_v6_vn1 = IpamSubnetType(subnet=SubnetType('fd11::', 120))
ipam_sn_v4_vn2 = IpamSubnetType(subnet=SubnetType('11.1.2.0', 24))
ipam_sn_v6_vn2 = IpamSubnetType(subnet=SubnetType('fd12::', 120))
# Create VN my-vn-1
vn1 = VirtualNetwork('my-vn-1', project)
vn1.add_network_ipam(ipam,
VnSubnetsType([ipam_sn_v4_vn1, ipam_sn_v6_vn1]))
self._vnc_lib.virtual_network_create(vn1)
net_obj1 = self._vnc_lib.virtual_network_read(id=vn1.uuid)
# Create VN my-vn-2
vn2 = VirtualNetwork('my-vn-2', project)
vn2.add_network_ipam(ipam,
VnSubnetsType([ipam_sn_v4_vn2, ipam_sn_v6_vn2]))
self._vnc_lib.virtual_network_create(vn2)
net_obj2 = self._vnc_lib.virtual_network_read(id=vn2.uuid)
# Create Logical Router
lr = LogicalRouter('router-test-v6-%s' % self.id(), project)
lr_uuid = self._vnc_lib.logical_router_create(lr)
# Create a Virtual Machine Interface belonging to my-vn-1
id_perms = IdPermsType(enable=True)
port_obj1 = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
port_obj1.uuid = port_obj1.name
port_obj1.set_virtual_network(vn1)
port_obj1.set_virtual_machine_interface_device_owner(
'DEVICE_OWNER_ROUTER_INTF')
# Assign gateway ip
ipam_refs = net_obj1.get_network_ipam_refs()
for ipam_ref in ipam_refs:
subnets = ipam_ref['attr'].get_ipam_subnets()
for subnet in subnets:
cidr = '%s/%s' % (subnet.subnet.get_ip_prefix(),
subnet.subnet.get_ip_prefix_len())
if IPNetwork(cidr).version == 6:
gateway_ip = subnet.get_default_gateway()
self._vnc_lib.virtual_machine_interface_create(port_obj1)
# Create v6 Ip object
ip_obj1 = InstanceIp(name=str(uuid.uuid4()),
instance_ip_address=gateway_ip,
instance_ip_family='v6')
ip_obj1.uuid = ip_obj1.name
ip_obj1.set_virtual_machine_interface(port_obj1)
ip_obj1.set_virtual_network(net_obj1)
ip_id1 = self._vnc_lib.instance_ip_create(ip_obj1)
# Add Router Interface (test being subnet)
lr.add_virtual_machine_interface(port_obj1)
lr_obj = self._vnc_lib.logical_router_read(id=lr_uuid)
self._vnc_lib.logical_router_update(lr_obj)
# Create a Virtual Machine Interface belonging to my-vn-2
port_obj2 = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
port_obj2.uuid = port_obj2.name
port_obj2.set_virtual_network(vn2)
port_obj2.set_virtual_machine_interface_device_owner(
'DEVICE_OWNER_ROUTER_INTF')
# Assign gateway ip
ipam_refs = net_obj2.get_network_ipam_refs()
for ipam_ref in ipam_refs:
subnets = ipam_ref['attr'].get_ipam_subnets()
for subnet in subnets:
cidr = '%s/%s' % (subnet.subnet.get_ip_prefix(),
subnet.subnet.get_ip_prefix_len())
if IPNetwork(cidr).version == 6:
gateway_ip = subnet.get_default_gateway()
self._vnc_lib.virtual_machine_interface_create(port_obj2)
# Create v6 Ip object
ip_obj2 = InstanceIp(name=str(uuid.uuid4()),
instance_ip_address=gateway_ip,
instance_ip_family='v6')
ip_obj2.uuid = ip_obj2.name
ip_obj2.set_virtual_machine_interface(port_obj2)
ip_obj2.set_virtual_network(net_obj2)
ip_id2 = self._vnc_lib.instance_ip_create(ip_obj2)
# Add Router Interface (test being subnet)
lr.add_virtual_machine_interface(port_obj2)
lr_obj = self._vnc_lib.logical_router_read(id=lr_uuid)
self._vnc_lib.logical_router_update(lr_obj)
# TODO: Schema transformer not integrated in the tests,
# hence route-target refs not set yet
# Verify Route Target Creation
rt_refs = lr.get_route_target_refs()
for rt_ref in rt_refs or []:
rt_obj = self._vnc_lib.route_target_read(id=rt_ref['uuid'])
ri_refs = rt_obj.get_routing_instance_back_refs()
for ri_ref in ri_refs:
ri_obj = self._vnc_lib.routing_instance_read(id=ri_ref['uuid'])
ri_name = ri_obj.get_display_name()
if ri_name != 'my-vn-1' and ri_name != 'my-vn-2':
pass
# cleanup
self._vnc_lib.instance_ip_delete(id=ip_id1)
self._vnc_lib.instance_ip_delete(id=ip_id2)
self._vnc_lib.virtual_machine_interface_delete(id=port_obj1.uuid)
self._vnc_lib.virtual_machine_interface_delete(id=port_obj2.uuid)
self._vnc_lib.logical_router_delete(id=lr_uuid)
self._vnc_lib.virtual_network_delete(id=vn1.uuid)
self._vnc_lib.virtual_network_delete(id=vn2.uuid)
self._vnc_lib.network_ipam_delete(id=ipam.uuid)
self._vnc_lib.project_delete(id=project.uuid)
self._vnc_lib.domain_delete(id=domain.uuid)
def test_route_table_prefixes(self):
"""
UT to verify CEM-22625.
API server to allow applying two network routes (same prefix,
different next-hops) to the same Virtual Network.
"""
rt = RouteTable("rt1")
routes = RouteTableType()
route1 = RouteType(prefix="1.1.1.1/0", next_hop="10.10.10.10",
next_hop_type="ip-address")
route2 = RouteType(prefix="1.1.1.1/0", next_hop="20.20.20.20",
next_hop_type="ip-address")
routes.add_route(route1)
routes.add_route(route2)
rt.set_routes(routes)
self._vnc_lib.route_table_create(rt)
routes.delete_route(route2)
route2 = RouteType(prefix="1.1.1.2/0", next_hop="20.20.20.20",
next_hop_type="ip-address")
routes.add_route(route2)
rt.set_routes(routes)
self._vnc_lib.route_table_update(rt)
routes.delete_route(route1)
routes.delete_route(route2)
def test_vm_port_not_added_to_lr(self):
project = self._vnc_lib.project_read(
['default-domain', 'default-project'])
ipam = self._vnc_lib.network_ipam_read(
['default-domain', 'default-project', 'default-network-ipam'])
# Create subnets
ipam_sn_v4_vn = IpamSubnetType(subnet=SubnetType('11.1.1.0', 24))
# Create VN my-vn
vn = VirtualNetwork('%s-vn' % self.id(), project)
vn.add_network_ipam(ipam, VnSubnetsType([ipam_sn_v4_vn]))
self._vnc_lib.virtual_network_create(vn)
net_obj = self._vnc_lib.virtual_network_read(id=vn.uuid)
# Create v4 Ip object
ip_obj = InstanceIp(name=str(uuid.uuid4()), instance_ip_family='v4')
ip_obj.uuid = ip_obj.name
# Create VM
vm_inst_obj = VirtualMachine(str(uuid.uuid4()))
vm_inst_obj.uuid = vm_inst_obj.name
self._vnc_lib.virtual_machine_create(vm_inst_obj)
id_perms = IdPermsType(enable=True)
vm_port_obj = VirtualMachineInterface(
str(uuid.uuid4()), vm_inst_obj, id_perms=id_perms)
vm_port_obj.uuid = vm_port_obj.name
vm_port_obj.set_virtual_network(vn)
ip_obj.set_virtual_machine_interface(vm_port_obj)
ip_obj.set_virtual_network(net_obj)
self._vnc_lib.virtual_machine_interface_create(vm_port_obj)
self._vnc_lib.instance_ip_create(ip_obj)
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
self._vnc_lib.logical_router_create(lr)
# Add Router Interface
lr.add_virtual_machine_interface(vm_port_obj)
with ExpectedException(RefsExistError):
self._vnc_lib.logical_router_update(lr)
lr.del_virtual_machine_interface(vm_port_obj)
# Create Port
port_obj = self.create_port(project, net_obj)
lr.add_virtual_machine_interface(port_obj)
self._vnc_lib.logical_router_update(lr)
with ExpectedException(BadRequest):
port_obj.add_virtual_machine(vm_inst_obj)
self._vnc_lib.virtual_machine_interface_update(port_obj)
self._vnc_lib.logical_router_delete(id=lr.uuid)
def create_port(self, project, vn):
# Create v4 Ip object
ip_obj = InstanceIp(name=str(uuid.uuid4()), instance_ip_family='v4')
ip_obj.uuid = ip_obj.name
# Create Port
id_perms = IdPermsType(enable=True)
port_obj = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
port_obj.uuid = port_obj.name
port_obj.set_virtual_network(vn)
ip_obj.set_virtual_machine_interface(port_obj)
ip_obj.set_virtual_network(vn)
self._vnc_lib.virtual_machine_interface_create(port_obj)
self._vnc_lib.instance_ip_create(ip_obj)
return port_obj
def test_same_network_not_attached_to_lr(self):
project = self._vnc_lib.project_read(
['default-domain', 'default-project'])
ipam = self._vnc_lib.network_ipam_read(
['default-domain', 'default-project', 'default-network-ipam'])
# Create subnets
ipam_sn_v4_vn = IpamSubnetType(subnet=SubnetType('11.1.1.0', 24))
# Create VN my-vn
vn = VirtualNetwork('%s-vn' % self.id(), project)
vn.add_network_ipam(ipam, VnSubnetsType([ipam_sn_v4_vn]))
self._vnc_lib.virtual_network_create(vn)
net_obj = self._vnc_lib.virtual_network_read(id=vn.uuid)
# Create v4 Ip object
ip_obj = InstanceIp(name=str(uuid.uuid4()), instance_ip_family='v4')
ip_obj.uuid = ip_obj.name
# Create Port
port_obj = self.create_port(project, net_obj)
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr.set_logical_router_type('snat-routing')
self._vnc_lib.logical_router_create(lr)
# Add Router Interface
lr.add_virtual_machine_interface(port_obj)
self._vnc_lib.logical_router_update(lr)
# set router_external
net_obj.set_router_external(True)
self._vnc_lib.virtual_network_update(net_obj)
with ExpectedException(BadRequest):
lr.add_virtual_network(net_obj)
self._vnc_lib.logical_router_update(lr)
lr.del_virtual_network(net_obj)
lr.del_virtual_machine_interface(port_obj)
self._vnc_lib.logical_router_update(lr)
lr.add_virtual_network(net_obj)
self._vnc_lib.logical_router_update(lr)
# Create Port
port_obj = self.create_port(project, net_obj)
with ExpectedException(BadRequest):
lr.add_virtual_machine_interface(port_obj)
self._vnc_lib.logical_router_update(lr)
self._vnc_lib.logical_router_delete(id=lr.uuid)
def test_vxlan_routing_for_internal_vn(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
project.set_vxlan_routing(True)
self._vnc_lib.project_update(project)
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
# Check to see whether internal VN for VxLAN Routing is created.
int_vn_name = get_lr_internal_vn_name(lr_uuid)
int_vn_fqname = ['default-domain', 'default-project', int_vn_name]
try:
self._vnc_lib.virtual_network_read(fq_name=int_vn_fqname)
except NoIdError as e:
# Invisible objects do not come up in read
# calls; instead the read raises an exception saying the
# object is invisible and cannot be read, which confirms
# the presence of the object. Hack!
if "This object is not visible" not in str(e):
assert False
# Check to see if deleting the VN deletes the internal VN
# that was created.
self._vnc_lib.logical_router_delete(id=lr_uuid)
try:
self._vnc_lib.virtual_network_read(fq_name=int_vn_fqname)
self.fail("Logical router internal virtual network was not "
"removed")
except NoIdError:
pass
# Check to see if we are able to disable VxLAN Routing
# after LR is deleted in the project.
project.set_vxlan_routing(False)
self._vnc_lib.project_update(project)
def test_vxlan_create_for_lr(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
project.set_vxlan_routing(True)
self._vnc_lib.project_update(project)
mock_zk = self._api_server._db_conn._zk_db
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr.set_vxlan_network_identifier('5000')
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
vxlan_id = lr.get_vxlan_network_identifier()
self.assertEqual(vxlan_id, '5000')
int_vn_name = get_lr_internal_vn_name(lr_uuid)
int_vn_fqname = ['default-domain', 'default-project', int_vn_name]
self.assertEqual(':'.join(int_vn_fqname) + "_vxlan",
mock_zk.get_vn_from_id(int(vxlan_id)))
self._vnc_lib.logical_router_delete(id=lr_uuid)
def test_vxlan_update_for_lr(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
project.set_vxlan_routing(True)
self._vnc_lib.project_update(project)
mock_zk = self._api_server._db_conn._zk_db
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr.set_vxlan_network_identifier('5001')
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr_read = self._vnc_lib.logical_router_read(id=lr_uuid)
vxlan_id = lr_read.get_vxlan_network_identifier()
self.assertEqual(vxlan_id, '5001')
lr.set_vxlan_network_identifier('5002')
self._vnc_lib.logical_router_update(lr)
lr_read = self._vnc_lib.logical_router_read(id=lr_uuid)
vxlan_id = lr_read.get_vxlan_network_identifier()
self.assertEqual(vxlan_id, '5002')
int_vn_name = get_lr_internal_vn_name(lr_uuid)
int_vn_fqname = ['default-domain', 'default-project', int_vn_name]
self.assertEqual(':'.join(int_vn_fqname) + "_vxlan",
mock_zk.get_vn_from_id(int(vxlan_id)))
self._vnc_lib.logical_router_delete(id=lr_uuid)
def test_vxlan_update_failure_for_lr(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
project.set_vxlan_routing(True)
self._vnc_lib.project_update(project)
mock_zk = self._api_server._db_conn._zk_db
# Create Logical Router
lr1 = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr1.set_vxlan_network_identifier('5003')
lr1.set_logical_router_type('vxlan-routing')
lr1_uuid = self._vnc_lib.logical_router_create(lr1)
lr1_read = self._vnc_lib.logical_router_read(id=lr1_uuid)
vxlan_id1 = lr1_read.get_vxlan_network_identifier()
self.assertEqual(vxlan_id1, '5003')
lr2 = LogicalRouter('router-test-v4-%s-2' % self.id(), project)
lr2.set_vxlan_network_identifier('5004')
lr2.set_logical_router_type('vxlan-routing')
lr2_uuid = self._vnc_lib.logical_router_create(lr2)
lr2_read = self._vnc_lib.logical_router_read(id=lr2_uuid)
vxlan_id2 = lr2_read.get_vxlan_network_identifier()
self.assertEqual(vxlan_id2, '5004')
lr2.set_vxlan_network_identifier('5003')
with ExpectedException(BadRequest):
self._vnc_lib.logical_router_update(lr2)
lr_read = self._vnc_lib.logical_router_read(id=lr2_uuid)
vxlan_id = lr_read.get_vxlan_network_identifier()
self.assertEqual(vxlan_id, '5004')
int_vn_name = get_lr_internal_vn_name(lr2_uuid)
int_vn_fqname = ['default-domain', 'default-project', int_vn_name]
self.assertEqual(':'.join(int_vn_fqname) + "_vxlan",
mock_zk.get_vn_from_id(int(vxlan_id)))
self._vnc_lib.logical_router_delete(id=lr1_uuid)
self._vnc_lib.logical_router_delete(id=lr2_uuid)
def test_vxlan_deallocate_for_lr(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
project.set_vxlan_routing(True)
self._vnc_lib.project_update(project)
mock_zk = self._api_server._db_conn._zk_db
# Create Logical Router
lr = LogicalRouter('router-test-v4-%s' % self.id(), project)
lr.set_vxlan_network_identifier('5005')
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
vxlan_id = lr.get_vxlan_network_identifier()
self.assertEqual(vxlan_id, '5005')
int_vn_name = get_lr_internal_vn_name(lr_uuid)
int_vn_fqname = ['default-domain', 'default-project', int_vn_name]
self._vnc_lib.logical_router_delete(id=lr_uuid)
self.assertNotEqual(':'.join(int_vn_fqname) + "_vxlan",
mock_zk.get_vn_from_id(int(vxlan_id)))
def test_change_from_vxlan_to_snat(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
# Create Logical Router enabled for VxLAN.
lr = LogicalRouter('router-test-vxlan-to-snat-%s' %
(self.id()), project)
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
logger.debug('Created Logical Router ')
lr.set_logical_router_type('snat-routing')
with ExpectedException(BadRequest):
self._vnc_lib.logical_router_update(lr)
logger.debug('PASS - Could not update LR from VXLAN to SNAT')
def test_change_from_snat_to_vxlan(self):
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
# Create Logical Router enabled for SNAT.
lr = LogicalRouter(
'router-test-snat-to-vxlan-%s'
% (self.id()), project)
lr.set_logical_router_type('snat-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
logger.debug('Created Logical Router ')
lr.set_logical_router_type('vxlan-routing')
with ExpectedException(BadRequest):
self._vnc_lib.logical_router_update(lr)
logger.debug('PASS: Could not update LR from SNAT to VXLAN')
def test_delete_lr_missing_vn_refs(self):
# Get Project Ref
project = self._vnc_lib.project_read(fq_name=['default-domain',
'default-project'])
lr = LogicalRouter(
'router-test-missing_vn_refs-%s'
% (self.id()), project)
lr.set_logical_router_type('vxlan-routing')
lr_uuid = self._vnc_lib.logical_router_create(lr)
lr = self._vnc_lib.logical_router_read(id=lr_uuid)
logger.debug('Created Logical Router ')
# Create a VN Object
vn = VirtualNetwork('%s-vn' % self.id(), project)
self._vnc_lib.virtual_network_create(vn)
# Create a Virtual Machine Interface that does not have VN Ref
id_perms = IdPermsType(enable=True)
vmi_no_vn_ref = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
vmi_no_vn_ref.set_virtual_network(vn)
vmi_uuid = self._vnc_lib.virtual_machine_interface_create(
vmi_no_vn_ref)
# Do not associate VN Ref to VMI and create
lr.add_virtual_machine_interface(vmi_no_vn_ref)
self._vnc_lib.logical_router_update(lr)
vmi_obj = self._vnc_lib.virtual_machine_interface_read(id=vmi_uuid)
vn_refs = vmi_obj.get_virtual_network_refs()
vn_uuid = vn_refs[0]['uuid']
self._vnc_lib.ref_update(
'virtual_machine_interface', vmi_uuid,
'virtual_network', vn_uuid, None, 'DELETE')
vmi_obj = self._vnc_lib.virtual_machine_interface_read(id=vmi_uuid)
vn_refs = vmi_obj.get_virtual_network_refs()
self.assertIsNone(vn_refs)
# Create a VN Object
vn2 = VirtualNetwork('%s-vn2' % self.id(), project)
self._vnc_lib.virtual_network_create(vn2)
# Create a Virtual Machine Interface that does not have VN Ref
vmi_no_vn_ref_2 = VirtualMachineInterface(
str(uuid.uuid4()), parent_obj=project, id_perms=id_perms)
vmi_no_vn_ref_2.set_virtual_network(vn2)
self._vnc_lib.virtual_machine_interface_create(
vmi_no_vn_ref_2)
# Do not associate VN Ref to VMI and create
lr.add_virtual_machine_interface(vmi_no_vn_ref_2)
self._vnc_lib.logical_router_update(lr)
# Deleting directly from api-server as api server
# will not allow deletion
self._vnc_lib.logical_router_delete(id=lr.uuid)
| 42.938028 | 78 | 0.656924 | 4,069 | 30,486 | 4.54608 | 0.068567 | 0.052979 | 0.075684 | 0.049627 | 0.861066 | 0.83728 | 0.818305 | 0.772083 | 0.750568 | 0.73608 | 0 | 0.018281 | 0.249984 | 30,486 | 709 | 79 | 42.99859 | 0.790728 | 0.080594 | 0 | 0.711765 | 0 | 0 | 0.067976 | 0.007417 | 0 | 0 | 0 | 0.00141 | 0.02549 | 1 | 0.031373 | false | 0.009804 | 0.05098 | 0 | 0.086275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
480b6df3c69abb328331184ae06fe625de8502d1 | 7,648 | py | Python | TEST/code.tst.py | ihgazni2/ltdict | d79df01ae6b24e5893af8876d8d4e9a26a5dc3b5 | [
"MIT"
] | null | null | null | TEST/code.tst.py | ihgazni2/ltdict | d79df01ae6b24e5893af8876d8d4e9a26a5dc3b5 | [
"MIT"
] | null | null | null | TEST/code.tst.py | ihgazni2/ltdict | d79df01ae6b24e5893af8876d8d4e9a26a5dc3b5 | [
"MIT"
] | null | null | null |
#is_ltdict(obj)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:1,1:2,2:3}
is_ltdict(ltd) == True
ltd = {0:1,2:2,3:3}
is_ltdict(ltd) == False
ltd = {0:1,'1':2,2:3}
is_ltdict(ltd) == False
#json2ltdict(obj,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
json_dict = {'0':'a','1':'b','2':'c'}
json2ltdict(json_dict) == {0:'a',1:'b',2:'c'}
#to_json(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c'}
to_json(ltd) == {'1': 'b', '2': 'c', '0': 'a'}
#list2ltdict(array,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
array = ['a','b','c']
list2ltdict(array) == {0: 'a', 1: 'b', 2: 'c'}
#tuple2ltdict(fixed_array,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
t = ('a','b','c')
tuple2ltdict(t) == {0: 'a', 1: 'b', 2: 'c'}
#set2ltdict(this_set,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
s = {'a','b','c'}
set2ltdict(s)
#to_list(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
to_list(ltd) == ['a', 'b', 'c']
#to_tuple(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
to_tuple(ltd) == ('a', 'b', 'c')
#to_set(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
to_set(ltd) == {'b', 'c', 'a'}
#extend(ltd1,ltd2,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd1 = {0: 'a', 1: 'b', 2: 'c'}
ltd2 = {0: 'd', 1: 'e', 2: 'f'}
extend(ltd1,ltd2) == {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e', 5: 'f'}
#prextend(ltd1,ltd2,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd1 = {0: 'a', 1: 'b', 2: 'c'}
ltd2 = {0: 'd', 1: 'e', 2: 'f'}
prextend(ltd1,ltd2) == {0: 'd', 1: 'e', 2: 'f', 3: 'a', 4: 'b', 5: 'c'}
#concat(ltd1,ltd2,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd1 = {0: 'a', 1: 'b', 2: 'c'}
ltd2 = {0: 'd', 1: 'e', 2: 'f'}
concat(ltd1,ltd2) == {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e', 5: 'f'}
#first_continuous_indexes_slice(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
first_continuous_indexes_slice(ltd,'c') == [2, 3]
#last_continuous_indexes_slice(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
last_continuous_indexes_slice(ltd,'c') == [8, 9]
#all_continuous_indexes_slices_array(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
all_continuous_indexes_slices_array(ltd,'c') == [[2, 3], [5], [8, 9]]
#indexes_array(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
indexes_array(ltd,'c') == [2, 3, 5, 8, 9]
#first_index(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
first_index(ltd,'c') == 2
#last_index(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
last_index(ltd,'c') == 9
#append(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
append(ltd,'d') == {0: 'a', 1: 'b', 2: 'c', 3: 'd'}
#prepend(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
prepend(ltd,'d') == {0: 'd', 1: 'a', 2: 'b', 3: 'c'}
#clear(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
clear(ltd) == {}
#insert(ltd,index,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
insert(ltd,1,'d') == {0: 'a', 1: 'd', 2: 'b', 3: 'c'}
#insert_ltdict(ltd1,index,ltd2,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd1 = {0: 'a', 1: 'b', 2: 'c'}
ltd2 = {0: 'd', 1: 'e', 2: 'f'}
insert_ltdict(ltd1,1,ltd2) == {0: 'a', 1: 'd', 2: 'e', 3: 'f', 4: 'b', 5: 'c'}
#include(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
include(ltd,'c') == True
#count(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
count(ltd,'c') == 5
#pop(ltd,index,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0: 'a', 1: 'b', 2: 'c'}
pop(ltd,1) == 'b'
ltd == {0: 'a', 1: 'c'}
#pop_range(ltd,start,end,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
pop_range(ltd,1,7) == {0: 'b', 1: 'c', 2: 'c', 3: 'd', 4: 'c', 5: 'e', 6: 'f'}
ltd == {0: 'a', 1: 'c', 2: 'c'}
#pop_seqs(ltd,seqs_set,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
seqs_set = {2,5,8}
pop_seqs(ltd,seqs_set) == {0: 'c', 1: 'c', 2: 'c'}
ltd == {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e', 5: 'f', 6: 'c'}
#remove_first(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
remove_first(ltd,'c') == {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'c', 5: 'e', 6: 'f', 7: 'c', 8: 'c'}
ltd == {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'c', 5: 'e', 6: 'f', 7: 'c', 8: 'c'}
#remove_last(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
remove_last(ltd,'c') == {0: 'a', 1: 'b', 2: 'c', 3: 'c', 4: 'd', 5: 'c', 6: 'e', 7: 'f', 8: 'c'}
ltd == {0: 'a', 1: 'b', 2: 'c', 3: 'c', 4: 'd', 5: 'c', 6: 'e', 7: 'f', 8: 'c'}
#remove_all(ltd,value,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
remove_all(ltd,'c') == {0: 'a', 1: 'b', 2: 'd', 3: 'e', 4: 'f'}
ltd == {0: 'a', 1: 'b', 2: 'd', 3: 'e', 4: 'f'}
#reverse(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
reverse(ltd) == {0: 'c', 1: 'c', 2: 'f', 3: 'e', 4: 'c', 5: 'd', 6: 'c', 7: 'c', 8: 'b', 9: 'a'}
ltd == {0: 'c', 1: 'c', 2: 'f', 3: 'e', 4: 'c', 5: 'd', 6: 'c', 7: 'c', 8: 'b', 9: 'a'}
#sort(ltd,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd = {0:'a',1:'b',2:'c',3:'c',4:'d',5:'c',6:'e',7:'f',8:'c',9:'c'}
sort(ltd) == {0: 'a', 1: 'b', 2: 'c', 3: 'c', 4: 'c', 5: 'c', 6: 'c', 7: 'd', 8: 'e', 9: 'f'}
ltd == {0: 'a', 1: 'b', 2: 'c', 3: 'c', 4: 'c', 5: 'c', 6: 'c', 7: 'd', 8: 'e', 9: 'f'}
#comprise(ltd1,ltd2,**kwargs)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ltd1 = {0:'a',1:'b',2:'c',3:'d',4:'e',5:'f'}
ltd2 = {0:'a',1:'b',2:'c'}
comprise(ltd1,ltd2) == True
ltd3 = {0:'c',1:'d',2:'e'}
comprise(ltd1,ltd3) == False
comprise(ltd1,ltd3,strict=False) == True
#naturalize_intkeydict(ikd)
from ltdict.ltdict import *
from xdict.jprint import pobj,pdir
ikd = {3:'a',1:'b',2:'c',11:'d',0:'e',50:'f'}
naturalize_intkeydict(ikd) == {0: 'e', 1: 'b', 2: 'c', 3: 'a', 4: 'd', 5: 'f'}
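The snippets above are REPL-style demonstrations: each `==` line is a bare expression whose truth is checked by eye, not an assertion. The simplest conversions are easy to sketch directly; below is a minimal, hedged re-implementation of a few of them (consistent with the behavior the checks above document — the real `ltdict` package may differ), with the bare comparisons turned into real assertions:

```python
def is_ltdict(obj):
    # A "list-typed dict": keys are exactly the ints 0..n-1.
    return isinstance(obj, dict) and set(obj.keys()) == set(range(len(obj)))

def list2ltdict(array):
    # Convert a list to a list-typed dict keyed by position.
    return {i: v for i, v in enumerate(array)}

def to_list(ltd):
    # Inverse conversion; relies on keys being 0..n-1.
    return [ltd[i] for i in range(len(ltd))]

assert is_ltdict({0: 1, 1: 2, 2: 3}) is True
assert is_ltdict({0: 1, 2: 2, 3: 3}) is False
assert list2ltdict(['a', 'b', 'c']) == {0: 'a', 1: 'b', 2: 'c'}
assert to_list({0: 'a', 1: 'b', 2: 'c'}) == ['a', 'b', 'c']
```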
| 24.831169 | 97 | 0.551647 | 1,514 | 7,648 | 2.747688 | 0.05284 | 0.024519 | 0.036058 | 0.045192 | 0.822596 | 0.781971 | 0.766587 | 0.760096 | 0.746154 | 0.739663 | 0 | 0.070906 | 0.142521 | 7,648 | 307 | 98 | 24.912052 | 0.563434 | 0.13193 | 0 | 0.645963 | 0 | 0 | 0.063163 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.434783 | 0 | 0.434783 | 0.217391 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
48310ebef71a733bee3433624b60d6fb82b0865d | 148 | py | Python | mmdet/utils/deployment/operations_domain.py | asenina/mmdetection | 951b23a7ecee7fa79caf7f80d71491b7f555a261 | [
"Apache-2.0"
] | 4 | 2020-01-19T08:00:31.000Z | 2020-02-14T03:25:45.000Z | mmdet/utils/deployment/operations_domain.py | asenina/mmdetection | 951b23a7ecee7fa79caf7f80d71491b7f555a261 | [
"Apache-2.0"
] | 3 | 2021-03-12T12:06:37.000Z | 2021-07-28T11:21:33.000Z | mmdet/utils/deployment/operations_domain.py | asenina/mmdetection | 951b23a7ecee7fa79caf7f80d71491b7f555a261 | [
"Apache-2.0"
] | 1 | 2020-04-21T01:44:04.000Z | 2020-04-21T01:44:04.000Z | DOMAIN_CUSTOM_OPS_NAME = 'org.openvinotoolkit'
def add_domain(name_operator: str) -> str:
    return DOMAIN_CUSTOM_OPS_NAME + '::' + name_operator
| 29.6 | 56 | 0.763514 | 20 | 148 | 5.2 | 0.55 | 0.230769 | 0.288462 | 0.365385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128378 | 148 | 4 | 57 | 37 | 0.806202 | 0 | 0 | 0 | 0 | 0 | 0.141892 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
48489d18636eecda2bfd1abef4325f11dd0e5195 | 120 | py | Python | gym_gazebo/envs/articulated_arm/__init__.py | abstractguy/gym_gazebo_kinetic | 61a55966cf66de493b149ad314c1d9b987d1deb9 | [
"MIT"
] | 31 | 2019-04-12T04:48:34.000Z | 2022-02-21T02:29:07.000Z | gym_gazebo/envs/articulated_arm/__init__.py | abstractguy/gym_gazebo_kinetic | 61a55966cf66de493b149ad314c1d9b987d1deb9 | [
"MIT"
] | 2 | 2020-03-23T01:17:45.000Z | 2020-07-02T07:01:06.000Z | gym_gazebo/envs/articulated_arm/__init__.py | abstractguy/gym_gazebo_kinetic | 61a55966cf66de493b149ad314c1d9b987d1deb9 | [
"MIT"
] | 9 | 2019-05-07T06:24:00.000Z | 2021-06-30T14:05:07.000Z | from gym_gazebo.envs.articulated_arm.gazebo_modular_articulated_arm_4dof_v1 import GazeboModularArticulatedArm4DOFv1Env
| 60 | 119 | 0.941667 | 14 | 120 | 7.571429 | 0.785714 | 0.264151 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0.033333 | 120 | 1 | 120 | 120 | 0.87931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
485265d4247f22a45144e6aee3edae55bd62265d | 17,837 | py | Python | tests/test_archetypal_analysis.py | azedarach/matrix-factorization-case-studies | b689c8af677c378bad75f68e56671a5c6f6589ec | [
"MIT"
] | null | null | null | tests/test_archetypal_analysis.py | azedarach/matrix-factorization-case-studies | b689c8af677c378bad75f68e56671a5c6f6589ec | [
"MIT"
] | null | null | null | tests/test_archetypal_analysis.py | azedarach/matrix-factorization-case-studies | b689c8af677c378bad75f68e56671a5c6f6589ec | [
"MIT"
] | null | null | null | """
Provides unit tests for archetypal analysis routines.
"""
# License: MIT
from __future__ import absolute_import, division, print_function
import numpy as np
from sklearn.utils import check_random_state
from convex_dim_red.archetypal_analysis import (
    _iterate_kernel_aa,
    _kernel_aa_cost,
    _update_kernel_aa_dictionary,
    _update_kernel_aa_weights)
from convex_dim_red import KernelAA, right_stochastic_matrix
def test_single_dictionary_update_reduces_cost_with_zero_delta():
    """Test single update step reduces cost function."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 10
    n_components = 5
    n_samples = 400

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = np.ones(n_components)

    trace_K = np.trace(K)
    KZ = K.dot(Z)
    ZtZ = Z.T.dot(Z)

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_C = _update_kernel_aa_dictionary(
        K, C, alpha, trace_K, KZ, ZtZ)
    final_cost = _kernel_aa_cost(K, Z, updated_C, alpha)

    assert final_cost <= initial_cost
    assert np.allclose(updated_C.sum(axis=1), 1, 1e-12)

def test_single_dictionary_update_reduces_cost_with_nonzero_delta():
    """Test single update step reduces cost function with non-zero delta."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 10
    n_components = 5
    n_samples = 400
    delta = 0.1

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = random_state.uniform(low=(1 - delta), high=(1 + delta),
                                 size=(n_components,))

    trace_K = np.trace(K)
    KZ = K.dot(Z)
    ZtZ = Z.T.dot(Z)

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_C = _update_kernel_aa_dictionary(
        K, C, alpha, trace_K, KZ, ZtZ)
    final_cost = _kernel_aa_cost(K, Z, updated_C, alpha)

    assert final_cost <= initial_cost
    assert np.allclose(updated_C.sum(axis=1), 1, 1e-12)

def test_exact_solution_with_zero_delta_is_dictionary_update_fixed_point():
    """Test exact solution is a fixed point of the dictionary update."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 10
    n_components = 6
    n_samples = 100
    tolerance = 1e-12

    basis = random_state.uniform(size=(n_components, n_features))
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    archetype_indices = np.zeros(n_components, dtype='i8')
    for i in range(n_components):
        new_index = False
        current_index = 0
        while not new_index:
            new_index = True
            current_index = random_state.randint(
                low=0, high=n_samples)
            for index in archetype_indices:
                if current_index == index:
                    new_index = False
        archetype_indices[i] = current_index

    C = np.zeros((n_components, n_samples))
    component = 0
    for index in archetype_indices:
        C[component, index] = 1.0
        for i in range(n_components):
            if i == component:
                Z[index, i] = 1.0
            else:
                Z[index, i] = 0.0
        component += 1

    X = Z.dot(basis)
    basis_projection = C.dot(X)

    assert np.allclose(basis_projection, basis, tolerance)
    assert np.linalg.norm(X - Z.dot(C.dot(X))) < tolerance

    K = X.dot(X.T)
    alpha = np.ones(n_components)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    trace_K = np.trace(K)
    KZ = K.dot(Z)
    ZtZ = Z.T.dot(Z)
    updated_C = _update_kernel_aa_dictionary(
        K, C, alpha, trace_K, KZ, ZtZ)
    final_cost = _kernel_aa_cost(K, Z, updated_C, alpha)

    assert abs(final_cost - initial_cost) < tolerance
    assert np.allclose(updated_C.sum(axis=1), 1, 1e-12)
    assert np.allclose(updated_C, C, tolerance)

def test_repeated_dictionary_updates_converge_with_zero_delta():
    """Test repeated updates converge to a fixed point with delta = 0."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 20
    n_components = 15
    n_samples = 600
    max_iterations = 1000
    tolerance = 1e-6

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assert np.allclose(C.sum(axis=1), 1, tolerance)
    assert np.allclose(Z.sum(axis=1), 1, tolerance)

    delta = 0
    alpha = np.ones(n_components)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z, updated_C, updated_alpha, _, n_iter = _iterate_kernel_aa(
        K, Z, C, alpha, delta=delta,
        update_weights=False, update_dictionary=True,
        update_scale_factors=False,
        tolerance=tolerance, max_iterations=max_iterations,
        require_monotonic_cost_decrease=True)[:5]
    final_cost = _kernel_aa_cost(K, updated_Z, updated_C, updated_alpha)

    assert final_cost <= initial_cost
    assert n_iter < max_iterations
    assert np.allclose(updated_Z, Z, 1e-12)
    assert np.allclose(updated_alpha, alpha, 1e-12)
    assert np.allclose(updated_C.sum(axis=1), 1, 1e-12)

def test_repeated_dictionary_updates_converge_with_nonzero_delta():
    """Test repeated updates converge to a fixed point with non-zero delta."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 20
    n_components = 15
    n_samples = 600
    max_iterations = 1000
    tolerance = 1e-6

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assert np.allclose(C.sum(axis=1), 1, tolerance)
    assert np.allclose(Z.sum(axis=1), 1, tolerance)

    delta = 0.2
    alpha = random_state.uniform(low=(1 - delta), high=(1 + delta),
                                 size=(n_components,))

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z, updated_C, updated_alpha, _, n_iter = _iterate_kernel_aa(
        K, Z, C, alpha, delta=delta,
        update_weights=False, update_dictionary=True,
        update_scale_factors=False,
        tolerance=tolerance, max_iterations=max_iterations,
        require_monotonic_cost_decrease=True)[:5]
    final_cost = _kernel_aa_cost(K, updated_Z, updated_C, updated_alpha)

    assert final_cost <= initial_cost
    assert n_iter < max_iterations
    assert np.allclose(updated_Z, Z, 1e-12)
    assert np.allclose(updated_alpha, alpha, 1e-12)
    assert np.allclose(updated_C.sum(axis=1), 1, 1e-12)

def test_single_weights_update_reduces_cost_with_zero_delta():
    """Test single weights update reduces cost function with delta = 0."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 13
    n_components = 7
    n_samples = 100

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = np.ones(n_components)

    CK = C.dot(K)
    CKCt = C.dot(K.dot(C.T))

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z = _update_kernel_aa_weights(
        Z, alpha, CK, CKCt)
    final_cost = _kernel_aa_cost(K, updated_Z, C, alpha)

    assert final_cost <= initial_cost
    assert np.allclose(updated_Z.sum(axis=1), 1, 1e-12)

def test_single_weights_update_reduces_cost_with_nonzero_delta():
    """Test single weights update reduces cost function with non-zero delta."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 50
    n_components = 5
    n_samples = 400
    delta = 0.5

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = random_state.uniform(low=(1 - delta), high=(1 + delta),
                                 size=(n_components,))

    CK = C.dot(K)
    CKCt = C.dot(K.dot(C.T))

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z = _update_kernel_aa_weights(
        Z, alpha, CK, CKCt)
    final_cost = _kernel_aa_cost(K, updated_Z, C, alpha)

    assert final_cost <= initial_cost
    assert np.allclose(updated_Z.sum(axis=1), 1, 1e-12)

def test_exact_solution_with_zero_delta_is_weights_update_fixed_point():
    """Test exact solution for weights is fixed point of update step."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 30
    n_components = 10
    n_samples = 130
    tolerance = 1e-12

    basis = random_state.uniform(size=(n_components, n_features))
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    archetype_indices = np.zeros(n_components, dtype='i8')
    for i in range(n_components):
        new_index = False
        current_index = 0
        while not new_index:
            new_index = True
            current_index = random_state.randint(
                low=0, high=n_samples)
            for index in archetype_indices:
                if current_index == index:
                    new_index = False
        archetype_indices[i] = current_index

    C = np.zeros((n_components, n_samples))
    component = 0
    for index in archetype_indices:
        C[component, index] = 1.0
        for i in range(n_components):
            if i == component:
                Z[index, i] = 1.0
            else:
                Z[index, i] = 0.0
        component += 1

    X = Z.dot(basis)
    basis_projection = C.dot(X)

    assert np.allclose(basis_projection, basis, tolerance)
    assert np.linalg.norm(X - Z.dot(C.dot(X))) < tolerance

    K = X.dot(X.T)
    alpha = np.ones((n_components,))

    CK = C.dot(K)
    CKCt = C.dot(K.dot(C.T))

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z = _update_kernel_aa_weights(
        Z, alpha, CK, CKCt)
    final_cost = _kernel_aa_cost(K, updated_Z, C, alpha)

    assert abs(final_cost - initial_cost) < tolerance
    assert np.allclose(updated_Z.sum(axis=1), 1, 1e-12)
    assert np.allclose(updated_Z, Z, tolerance)

def test_repeated_weights_updates_converge_with_zero_delta():
    """Test repeated updates converge to a fixed point with delta = 0."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 10
    n_components = 3
    n_samples = 600
    max_iterations = 100
    tolerance = 1e-6

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assert np.allclose(C.sum(axis=1), 1, tolerance)
    assert np.allclose(Z.sum(axis=1), 1, tolerance)

    delta = 0
    alpha = np.ones((n_components,))

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z, updated_C, updated_alpha, _, n_iter = _iterate_kernel_aa(
        K, Z, C, alpha, delta=delta,
        update_weights=True, update_dictionary=False,
        update_scale_factors=False,
        tolerance=tolerance, max_iterations=max_iterations,
        require_monotonic_cost_decrease=True)[:5]
    final_cost = _kernel_aa_cost(K, updated_Z, updated_C, updated_alpha)

    assert final_cost <= initial_cost
    assert n_iter < max_iterations
    assert np.allclose(updated_C, C, 1e-12)
    assert np.allclose(updated_alpha, alpha, 1e-12)
    assert np.allclose(updated_Z.sum(axis=1), 1, 1e-12)

def test_repeated_weights_updates_converge_with_nonzero_delta():
    """Test repeated updates converge to a fixed point with non-zero delta."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_features = 30
    n_components = 11
    n_samples = 320
    max_iterations = 100
    tolerance = 1e-6

    X = random_state.uniform(size=(n_samples, n_features))
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assert np.allclose(C.sum(axis=1), 1, tolerance)
    assert np.allclose(Z.sum(axis=1), 1, tolerance)

    delta = 0.3
    alpha = random_state.uniform(low=(1 - delta), high=(1 + delta),
                                 size=(n_components,))

    initial_cost = _kernel_aa_cost(K, Z, C, alpha)
    updated_Z, updated_C, updated_alpha, _, n_iter = _iterate_kernel_aa(
        K, Z, C, alpha, delta=delta,
        update_weights=True, update_dictionary=False,
        update_scale_factors=False,
        tolerance=tolerance, max_iterations=max_iterations,
        require_monotonic_cost_decrease=True)[:5]
    final_cost = _kernel_aa_cost(K, updated_Z, updated_C, updated_alpha)

    assert final_cost <= initial_cost
    assert n_iter < max_iterations
    assert np.allclose(updated_C, C, 1e-12)
    assert np.allclose(updated_alpha, alpha, 1e-12)
    assert np.allclose(updated_Z.sum(axis=1), 1, 1e-12)

def test_finds_elements_of_3_point_convex_hull():
    """Test finds archetypes in convex hull for 2D example."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_samples = 50
    n_components = 3
    max_iterations = 500
    tolerance = 1e-6

    basis = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    expected_Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assignments = np.array([5, 27, 32])
    for i in range(n_components):
        expected_Z[assignments[i]] = np.zeros(n_components)
        expected_Z[assignments[i], i] = 1

    X = expected_Z.dot(basis)
    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = np.ones((n_components,))

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    delta = 0
    aa = KernelAA(n_components=n_components, delta=delta, init='custom',
                  max_iterations=max_iterations, tolerance=tolerance)
    solution_Z = aa.fit_transform(K, dictionary=C, weights=Z, alpha=alpha)
    solution_C = aa.dictionary

    assert aa.n_iter < max_iterations
    assert np.allclose(solution_C.sum(axis=1), 1, 1e-12)
    assert np.allclose(solution_Z.sum(axis=1), 1, 1e-12)

    main_components = solution_C.argmax(axis=1)
    main_components = sorted(main_components)
    for i in range(n_components):
        assert main_components[i] == assignments[i]

def test_finds_elements_of_4_point_convex_hull():
    """Test finds archetypes in convex hull for 3D example."""
    random_seed = 0
    random_state = check_random_state(random_seed)

    n_samples = 123
    n_components = 4
    max_iter = 500
    tolerance = 1e-12

    basis = np.array([[0, 0, 0],
                      [1, 0, 0],
                      [0, 1, 0],
                      [0, 0, 1]])
    expected_Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)

    assignments = np.array([8, 9, 56, 90])
    for i in range(n_components):
        expected_Z[assignments[i]] = np.zeros(n_components)
        expected_Z[assignments[i], i] = 1

    expected_C = np.zeros((n_components, n_samples), dtype='f8')
    for i in range(n_components):
        expected_C[i, assignments[i]] = 1

    X = expected_Z.dot(basis)

    assert np.linalg.norm(X - expected_Z.dot(expected_C.dot(X))) < tolerance

    K = X.dot(X.T)
    C = right_stochastic_matrix(
        (n_components, n_samples), random_state=random_state)
    Z = right_stochastic_matrix(
        (n_samples, n_components), random_state=random_state)
    alpha = np.ones((n_components,))

    assert np.allclose(C.sum(axis=1), 1, 1e-12)
    assert np.allclose(Z.sum(axis=1), 1, 1e-12)

    delta = 0
    aa = KernelAA(n_components=n_components, delta=delta, init='custom',
                  max_iterations=max_iter, tolerance=tolerance)
    solution_Z = aa.fit_transform(K, dictionary=C, weights=Z, alpha=alpha)
    solution_C = aa.dictionary

    assert aa.n_iter < max_iter
    assert np.allclose(solution_C.sum(axis=1), 1, 1e-12)
    assert np.allclose(solution_Z.sum(axis=1), 1, 1e-12)

    main_components = solution_C.argmax(axis=1)
    main_components = sorted(main_components)
    for i in range(n_components):
        assert main_components[i] == assignments[i]
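These tests lean on a `right_stochastic_matrix` helper from `convex_dim_red` whose implementation is not shown here. A plausible minimal sketch, consistent with the property the assertions check (each row sums to one) — the real helper may differ, e.g. in how it consumes the random state:

```python
import numpy as np

def right_stochastic_matrix(shape, random_state=None):
    """Draw a matrix with non-negative entries whose rows each sum to one."""
    rng = np.random.default_rng(random_state)
    M = rng.uniform(size=shape)
    # Normalizing each row by its sum makes the matrix right-stochastic.
    return M / M.sum(axis=1, keepdims=True)

Z = right_stochastic_matrix((5, 3), random_state=0)
assert Z.shape == (5, 3)
assert np.allclose(Z.sum(axis=1), 1.0)
```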
| 29.385502 | 79 | 0.667096 | 2,624 | 17,837 | 4.246189 | 0.065168 | 0.086878 | 0.068928 | 0.029079 | 0.940316 | 0.934303 | 0.925597 | 0.909621 | 0.896697 | 0.885568 | 0 | 0.028774 | 0.226495 | 17,837 | 606 | 80 | 29.433993 | 0.778792 | 0.044234 | 0 | 0.864865 | 0 | 0 | 0.00106 | 0 | 0 | 0 | 0 | 0 | 0.169533 | 1 | 0.029484 | false | 0 | 0.012285 | 0 | 0.041769 | 0.002457 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f921df7a4548368a6922d23ec3f5276d11dad92 | 244 | py | Python | bnpy/ioutil/__init__.py | raphael-group/bnpy | b11dc6f5689b06fc967bab6dffe7e01551d84667 | [
"BSD-3-Clause"
] | 184 | 2016-12-13T21:05:48.000Z | 2022-02-28T11:47:23.000Z | bnpy/ioutil/__init__.py | raphael-group/bnpy | b11dc6f5689b06fc967bab6dffe7e01551d84667 | [
"BSD-3-Clause"
] | 37 | 2016-12-18T14:07:53.000Z | 2022-03-13T10:58:14.000Z | bnpy/ioutil/__init__.py | raphael-group/bnpy | b11dc6f5689b06fc967bab6dffe7e01551d84667 | [
"BSD-3-Clause"
] | 50 | 2017-01-25T19:44:34.000Z | 2022-03-15T10:22:01.000Z | from bnpy.ioutil.ModelWriter import makePrefixForLap
import bnpy.ioutil.ModelReader
import bnpy.ioutil.ModelWriter
import bnpy.ioutil.DataReader
import bnpy.ioutil.SuffStatBagIO
import bnpy.ioutil.CountReader
import bnpy.ioutil.BNPYArgParser
| 24.4 | 52 | 0.868852 | 30 | 244 | 7.066667 | 0.366667 | 0.330189 | 0.45283 | 0.254717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07377 | 244 | 9 | 53 | 27.111111 | 0.938053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b51676d005ae393f3873b6b20985bd70ea9ab89b | 20,929 | py | Python | astroquery/gaia/tests/test_gaiatap.py | jmilou/astroquery | 06d6e6740865d0461570390726e1831ea139b558 | [
"BSD-3-Clause"
] | null | null | null | astroquery/gaia/tests/test_gaiatap.py | jmilou/astroquery | 06d6e6740865d0461570390726e1831ea139b558 | [
"BSD-3-Clause"
] | 2 | 2020-10-29T19:55:25.000Z | 2021-05-14T19:17:44.000Z | astroquery/gaia/tests/test_gaiatap.py | mevtorres/localAstroquery | 06d6e6740865d0461570390726e1831ea139b558 | [
"BSD-3-Clause"
] | null | null | null | # Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
=============
Gaia TAP plus
=============
@author: Juan Carlos Segovia
@contact: juan.carlos.segovia@sciops.esa.int
European Space Astronomy Centre (ESAC)
European Space Agency (ESA)
Created on 30 jun. 2016
"""
import unittest
import os
import pytest
from astroquery.gaia.core import GaiaClass
from astroquery.gaia.tests.DummyTapHandler import DummyTapHandler
from astroquery.utils.tap.conn.tests.DummyConnHandler import DummyConnHandler
from astroquery.utils.tap.conn.tests.DummyResponse import DummyResponse
import astropy.units as u
from astropy.coordinates.sky_coordinate import SkyCoord
from astropy.units import Quantity
import numpy as np
from astroquery.utils.tap.xmlparser import utils
from astroquery.utils.tap.core import TapPlus
def data_path(filename):
    data_dir = os.path.join(os.path.dirname(__file__), 'data')
    return os.path.join(data_dir, filename)

class TestTap(unittest.TestCase):

    def test_load_tables(self):
        dummyTapHandler = DummyTapHandler()
        tap = GaiaClass(dummyTapHandler)
        # default parameters
        parameters = {}
        parameters['only_names'] = False
        parameters['include_shared_tables'] = False
        parameters['verbose'] = False
        tap.load_tables()
        dummyTapHandler.check_call('load_tables', parameters)
        # test with parameters
        dummyTapHandler.reset()
        parameters = {}
        parameters['only_names'] = True
        parameters['include_shared_tables'] = True
        parameters['verbose'] = True
        tap.load_tables(True, True, True)
        dummyTapHandler.check_call('load_tables', parameters)

    def test_load_table(self):
        dummyTapHandler = DummyTapHandler()
        tap = GaiaClass(dummyTapHandler)
        # default parameters
        parameters = {}
        parameters['table'] = 'table'
        parameters['verbose'] = False
        tap.load_table('table')
        dummyTapHandler.check_call('load_table', parameters)
        # test with parameters
        dummyTapHandler.reset()
        parameters = {}
        parameters['table'] = 'table'
        parameters['verbose'] = True
        tap.load_table('table', verbose=True)
        dummyTapHandler.check_call('load_table', parameters)

    def test_launch_sync_job(self):
        dummyTapHandler = DummyTapHandler()
        tap = GaiaClass(dummyTapHandler)
        query = "query"
        # default parameters
        parameters = {}
        parameters['query'] = query
        parameters['name'] = None
        parameters['output_file'] = None
        parameters['output_format'] = 'votable'
        parameters['verbose'] = False
        parameters['dump_to_file'] = False
        parameters['upload_resource'] = None
        parameters['upload_table_name'] = None
        tap.launch_job(query)
        dummyTapHandler.check_call('launch_job', parameters)
        # test with parameters
        dummyTapHandler.reset()
        name = 'name'
        output_file = 'output'
        output_format = 'format'
        verbose = True
        dump_to_file = True
        upload_resource = 'upload_res'
        upload_table_name = 'upload_table'
        parameters['query'] = query
        parameters['name'] = name
        parameters['output_file'] = output_file
        parameters['output_format'] = output_format
        parameters['verbose'] = verbose
        parameters['dump_to_file'] = dump_to_file
        parameters['upload_resource'] = upload_resource
        parameters['upload_table_name'] = upload_table_name
        tap.launch_job(query,
                       name=name,
                       output_file=output_file,
                       output_format=output_format,
                       verbose=verbose,
                       dump_to_file=dump_to_file,
                       upload_resource=upload_resource,
                       upload_table_name=upload_table_name)
        dummyTapHandler.check_call('launch_job', parameters)

    def test_launch_async_job(self):
        dummyTapHandler = DummyTapHandler()
        tap = GaiaClass(dummyTapHandler)
        query = "query"
        # default parameters
        parameters = {}
        parameters['query'] = query
        parameters['name'] = None
        parameters['output_file'] = None
        parameters['output_format'] = 'votable'
        parameters['verbose'] = False
        parameters['dump_to_file'] = False
        parameters['background'] = False
        parameters['upload_resource'] = None
        parameters['upload_table_name'] = None
        tap.launch_job_async(query)
        dummyTapHandler.check_call('launch_job_async', parameters)
        # test with parameters
        dummyTapHandler.reset()
        name = 'name'
        output_file = 'output'
        output_format = 'format'
        verbose = True
        dump_to_file = True
        background = True
        upload_resource = 'upload_res'
        upload_table_name = 'upload_table'
        parameters['query'] = query
        parameters['name'] = name
        parameters['output_file'] = output_file
        parameters['output_format'] = output_format
        parameters['verbose'] = verbose
        parameters['dump_to_file'] = dump_to_file
        parameters['background'] = background
        parameters['upload_resource'] = upload_resource
        parameters['upload_table_name'] = upload_table_name
        tap.launch_job_async(query,
                             name=name,
                             output_file=output_file,
                             output_format=output_format,
                             verbose=verbose,
                             dump_to_file=dump_to_file,
                             background=background,
                             upload_resource=upload_resource,
                             upload_table_name=upload_table_name)
        dummyTapHandler.check_call('launch_job_async', parameters)

def test_list_async_jobs(self):
dummyTapHandler = DummyTapHandler()
tap = GaiaClass(dummyTapHandler)
# default parameters
parameters = {}
parameters['verbose'] = False
tap.list_async_jobs()
dummyTapHandler.check_call('list_async_jobs', parameters)
# test with parameters
dummyTapHandler.reset()
parameters['verbose'] = True
tap.list_async_jobs(verbose=True)
dummyTapHandler.check_call('list_async_jobs', parameters)
def test_query_object(self):
connHandler = DummyConnHandler()
tapplus = TapPlus("http://test:1111/tap", connhandler=connHandler)
tap = GaiaClass(tapplus)
# Launch response: use the default response because the query contains decimals
responseLaunchJob = DummyResponse()
responseLaunchJob.set_status_code(200)
responseLaunchJob.set_message("OK")
jobDataFile = data_path('job_1.vot')
jobData = utils.read_file_content(jobDataFile)
responseLaunchJob.set_data(method='POST',
context=None,
body=jobData,
headers=None)
# The query contains decimals: force default response
connHandler.set_default_response(responseLaunchJob)
sc = SkyCoord(ra=29.0, dec=15.0, unit=(u.degree, u.degree), frame='icrs')
with pytest.raises(ValueError) as err:
tap.query_object(sc)
assert "Missing required argument: 'width'" in err.value.args[0]
width = Quantity(12, u.deg)
with pytest.raises(ValueError) as err:
tap.query_object(sc, width=width)
assert "Missing required argument: 'height'" in err.value.args[0]
height = Quantity(10, u.deg)
table = tap.query_object(sc, width=width, height=height)
assert len(table) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(table))
self.__check_results_column(table,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(table,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(table,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(table,
'table1_oid',
'table1_oid',
None,
np.int32)
# by radius
radius = Quantity(1, u.deg)
table = tap.query_object(sc, radius=radius)
assert len(table) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(table))
self.__check_results_column(table,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(table,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(table,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(table,
'table1_oid',
'table1_oid',
None,
np.int32)
def test_query_object_async(self):
connHandler = DummyConnHandler()
tapplus = TapPlus("http://test:1111/tap", connhandler=connHandler)
tap = GaiaClass(tapplus)
jobid = '12345'
# Launch response
responseLaunchJob = DummyResponse()
responseLaunchJob.set_status_code(303)
responseLaunchJob.set_message("OK")
# list of lists (httplib-style representation of response headers)
launchResponseHeaders = [
['location', 'http://test:1111/tap/async/' + jobid]
]
responseLaunchJob.set_data(method='POST',
context=None,
body=None,
headers=launchResponseHeaders)
connHandler.set_default_response(responseLaunchJob)
# Phase response
responsePhase = DummyResponse()
responsePhase.set_status_code(200)
responsePhase.set_message("OK")
responsePhase.set_data(method='GET',
context=None,
body="COMPLETED",
headers=None)
req = "async/" + jobid + "/phase"
connHandler.set_response(req, responsePhase)
# Results response
responseResultsJob = DummyResponse()
responseResultsJob.set_status_code(200)
responseResultsJob.set_message("OK")
jobDataFile = data_path('job_1.vot')
jobData = utils.read_file_content(jobDataFile)
responseResultsJob.set_data(method='GET',
context=None,
body=jobData,
headers=None)
req = "async/" + jobid + "/results/result"
connHandler.set_response(req, responseResultsJob)
sc = SkyCoord(ra=29.0, dec=15.0, unit=(u.degree, u.degree), frame='icrs')
width = Quantity(12, u.deg)
height = Quantity(10, u.deg)
table = tap.query_object_async(sc, width=width, height=height)
assert len(table) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(table))
self.__check_results_column(table,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(table,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(table,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(table,
'table1_oid',
'table1_oid',
None,
np.int32)
# by radius
radius = Quantity(1, u.deg)
table = tap.query_object_async(sc, radius=radius)
assert len(table) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(table))
self.__check_results_column(table,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(table,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(table,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(table,
'table1_oid',
'table1_oid',
None,
np.int32)
def test_cone_search_sync(self):
connHandler = DummyConnHandler()
tapplus = TapPlus("http://test:1111/tap", connhandler=connHandler)
tap = GaiaClass(tapplus)
# Launch response: use the default response because the query contains decimals
responseLaunchJob = DummyResponse()
responseLaunchJob.set_status_code(200)
responseLaunchJob.set_message("OK")
jobDataFile = data_path('job_1.vot')
jobData = utils.read_file_content(jobDataFile)
responseLaunchJob.set_data(method='POST',
context=None,
body=jobData,
headers=None)
ra = 19.0
dec = 20.0
sc = SkyCoord(ra=ra, dec=dec, unit=(u.degree, u.degree), frame='icrs')
radius = Quantity(1.0, u.deg)
connHandler.set_default_response(responseLaunchJob)
job = tap.cone_search(sc, radius)
assert job is not None, "Expected a valid job"
assert job.async_ is False, "Expected a synchronous job"
assert job.get_phase() == 'COMPLETED', \
"Wrong job phase. Expected: %s, found %s" % \
('COMPLETED', job.get_phase())
assert job.failed is False, "Wrong job status (failed is set to True)"
# results
results = job.get_results()
assert len(results) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(results))
self.__check_results_column(results,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(results,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(results,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(results,
'table1_oid',
'table1_oid',
None,
np.int32)
def test_cone_search_async(self):
connHandler = DummyConnHandler()
tapplus = TapPlus("http://test:1111/tap", connhandler=connHandler)
tap = GaiaClass(tapplus)
jobid = '12345'
# Launch response
responseLaunchJob = DummyResponse()
responseLaunchJob.set_status_code(303)
responseLaunchJob.set_message("OK")
# list of lists (httplib-style representation of response headers)
launchResponseHeaders = [
['location', 'http://test:1111/tap/async/' + jobid]
]
responseLaunchJob.set_data(method='POST',
context=None,
body=None,
headers=launchResponseHeaders)
ra = 19
dec = 20
sc = SkyCoord(ra=ra, dec=dec, unit=(u.degree, u.degree), frame='icrs')
radius = Quantity(1.0, u.deg)
connHandler.set_default_response(responseLaunchJob)
# Phase response
responsePhase = DummyResponse()
responsePhase.set_status_code(200)
responsePhase.set_message("OK")
responsePhase.set_data(method='GET',
context=None,
body="COMPLETED",
headers=None)
req = "async/" + jobid + "/phase"
connHandler.set_response(req, responsePhase)
# Results response
responseResultsJob = DummyResponse()
responseResultsJob.set_status_code(200)
responseResultsJob.set_message("OK")
jobDataFile = data_path('job_1.vot')
jobData = utils.read_file_content(jobDataFile)
responseResultsJob.set_data(method='GET',
context=None,
body=jobData,
headers=None)
req = "async/" + jobid + "/results/result"
connHandler.set_response(req, responseResultsJob)
job = tap.cone_search_async(sc, radius)
assert job is not None, "Expected a valid job"
assert job.async_ is True, "Expected an asynchronous job"
assert job.get_phase() == 'COMPLETED', \
"Wrong job phase. Expected: %s, found %s" % \
('COMPLETED', job.get_phase())
assert job.failed is False, "Wrong job status (failed is set to True)"
# results
results = job.get_results()
assert len(results) == 3, \
"Wrong job results (num rows). Expected: %d, found %d" % \
(3, len(results))
self.__check_results_column(results,
'alpha',
'alpha',
None,
np.float64)
self.__check_results_column(results,
'delta',
'delta',
None,
np.float64)
self.__check_results_column(results,
'source_id',
'source_id',
None,
np.object)
self.__check_results_column(results,
'table1_oid',
'table1_oid',
None,
np.int32)
def __check_results_column(self, results, columnName, description, unit,
dataType):
c = results[columnName]
assert c.description == description, \
"Wrong description for results column '%s'. Expected: '%s', found '%s'" % \
(columnName, description, c.description)
assert c.unit == unit, \
"Wrong unit for results column '%s'. Expected: '%s', found '%s'" % \
(columnName, unit, c.unit)
assert c.dtype == dataType, \
"Wrong dataType for results column '%s'. Expected: '%s', found '%s'" % \
(columnName, dataType, c.dtype)
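The asynchronous tests above depend on a connection handler that maps specific request paths (e.g. `"async/12345/phase"`) to canned responses, falling back to a default response for anything unregistered. As a minimal sketch of that dispatch pattern — `CannedConnHandler` and its method names are illustrative assumptions, not the real `DummyConnHandler` API:

```python
# Hypothetical stand-in illustrating the canned-response dispatch pattern used
# by DummyConnHandler in the tests above; names and API here are assumptions.
class CannedConnHandler(object):
    def __init__(self):
        self._responses = {}
        self._default = None

    def set_response(self, request, response):
        # Register a canned response for one specific request path.
        self._responses[request] = response

    def set_default_response(self, response):
        # Fallback returned when no specific response has been registered.
        self._default = response

    def execute(self, request):
        return self._responses.get(request, self._default)
```

With this shape, a test can register the phase and results responses per job id while leaving the POST that launches the job to the default response, which mirrors how the cone-search and query-object tests are wired.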
if __name__ == "__main__":
# import sys;sys.argv = ['', 'Test.testName']
unittest.main()
] | 31 | 2015-03-04T01:01:09.000Z | 2020-10-28T14:42:12.000Z | #!/usr/bin/env python
"""
@package mi.dataset.parser.test.test_vel3d_l_wfp
@file marine-integrations/mi/dataset/parser/test/test_vel3d_l_wfp.py
@author Steve Myerson (Raytheon)
@brief Test code for a vel3d_l_wfp parser for recovered and telemetered data
"""
from StringIO import StringIO
import os
from nose.plugins.attrib import attr
from mi.core.exceptions import SampleException
from mi.core.instrument.data_particle import DataParticleKey
from mi.core.log import get_logger
from mi.dataset.dataset_parser import DataSetDriverConfigKeys
from mi.dataset.driver.vel3d_l.wfp.resource import RESOURCE_PATH
from mi.dataset.parser.vel3d_l_wfp import \
Vel3dLWfpParser, \
Vel3dLWfpSioParser, \
Vel3dLWfpInstrumentParticle, \
Vel3dLWfpInstrumentRecoveredParticle, \
Vel3dLWfpMetadataRecoveredParticle, \
Vel3dLWfpSioMuleMetadataParticle
from mi.dataset.test.test_parser import ParserUnitTestCase
log = get_logger()
# Recovered Record #1 has 1 instrument record.
REC_RECORD_1 = \
'\x00\x00\x01\x46\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01\x02' \
'\xD4\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00\x40' \
'\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00\x42' \
'\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x52\xE6\x3C\x32\x52\xE6' \
'\x54\xDF'
# Expected results for Recovered record #1.
REC_EXPECTED_FIELDS_RECORD_1_1 = (1, 2, 3, 1, 2, 2004, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
REC_EXPECTED_FIELDS_RECORD_1_META = (1390826719, 1390820402, 1390826719, 65535, 1)
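The expected tuple above can be reproduced directly from the raw bytes at the end of REC_RECORD_1: the instrument record appears to be five unsigned bytes, a little-endian uint16 year, then ten little-endian float32 values. A decoding sketch under that assumption (the layout is inferred from the constants here, not taken from the parser):

```python
import struct

# Instrument-record payload from REC_RECORD_1: 5 uint8 fields, a little-endian
# uint16 (2004), then ten little-endian float32 values (layout inferred here).
raw = (b'\x01\x02\x03\x01\x02\xD4\x07'
       b'\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43'
       b'\x00\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45'
       b'\x00\x44\x1C\x46\x00\x42\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47')
fields = struct.unpack('<5BH10f', raw)  # 5 + 2 + 40 = 47 bytes
```

Unpacked, `fields` matches REC_EXPECTED_FIELDS_RECORD_1_1; the same `<5BH10f` slice repeats once per instrument record in the multi-record constants below.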
# Recovered Record #2 has 2 instrument records.
REC_RECORD_2 = \
'\x00\x00\x01\x75\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x02\x03\x04\x02\x03' \
'\xD4\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80' \
'\x7A\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44' \
'\x9C\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04\xD4' \
'\x07\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0\x7A' \
'\x44\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46\x9C' \
'\x46\x00\x66\xEA\x46\x00\x43\x1C\x47\x52\xE6\x3C\x33\x52\xE6\x54' \
'\xE0'
# Expected results for Recovered record #2.
REC_EXPECTED_FIELDS_RECORD_2_1 = (2, 3, 4, 2, 3, 2004, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
REC_EXPECTED_FIELDS_RECORD_2_2 = (3, 4, 5, 3, 4, 2004, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
REC_EXPECTED_FIELDS_RECORD_2_META = (1390826720, 1390820403, 1390826720, 65535, 2)
# Recovered Record #3 has 3 instrument records.
REC_RECORD_3 = \
'\x00\x00\x01\xA4\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x04\x05\x06\x04\x05' \
'\xD4\x07\x00\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00' \
'\x7B\x44\x00\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48' \
'\x9C\x46\x00\x68\xEA\x46\x00\x44\x1C\x47\x05\x06\x07\x05\x06\xD4' \
'\x07\x00\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40\x7B' \
'\x44\x00\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A\x9C' \
'\x46\x00\x6A\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07\xD4\x07' \
'\x00\x00\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80\x7B\x44' \
'\x00\xC0\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C\x9C\x46' \
'\x00\x6C\xEA\x46\x00\x46\x1C\x47\x52\xE6\x3C\x34\x52\xE6\x54\xE1'
# Expected results for Recovered record #3.
REC_EXPECTED_FIELDS_RECORD_3_1 = (4, 5, 6, 4, 5, 2004, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
REC_EXPECTED_FIELDS_RECORD_3_2 = (5, 6, 7, 5, 6, 2004, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
REC_EXPECTED_FIELDS_RECORD_3_3 = (6, 7, 8, 6, 7, 2004, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
REC_EXPECTED_FIELDS_RECORD_3_META = (1390826721, 1390820404, 1390826721, 65535, 3)
# Recovered Record #4 has 4 instrument records.
REC_RECORD_4 = \
'\x00\x00\x01\xD3\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x07\x08\x09\x07\x08' \
'\xD4\x07\x00\x00\xD6\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0' \
'\x7B\x44\x00\xE0\xFA\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E' \
'\x9C\x46\x00\x6E\xEA\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xD4' \
'\x07\x00\x00\xD8\x42\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C' \
'\x44\x00\x00\xFB\x44\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C' \
'\x46\x00\x70\xEA\x46\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xD4\x07' \
'\x00\x00\xDA\x42\x00\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44' \
'\x00\x20\xFB\x44\x00\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46' \
'\x00\x72\xEA\x46\x00\x49\x1C\x47\x0A\x0B\x0C\x0A\x0B\xD4\x07\x00' \
'\x00\xDC\x42\x00\x00\x52\x43\x00\x00\x9B\x43\x00\x80\x7C\x44\x00' \
'\x40\xFB\x44\x00\x20\x3C\x45\x00\x68\x1C\x46\x00\x54\x9C\x46\x00' \
'\x74\xEA\x46\x00\x4A\x1C\x47\x52\xE6\x3C\x35\x52\xE6\x54\xE2'
# Expected results for Recovered record #4.
REC_EXPECTED_FIELDS_RECORD_4_1 = (7, 8, 9, 7, 8, 2004, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
REC_EXPECTED_FIELDS_RECORD_4_2 = (8, 9, 10, 8, 9, 2004, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
REC_EXPECTED_FIELDS_RECORD_4_3 = (9, 10, 11, 9, 10, 2004, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
REC_EXPECTED_FIELDS_RECORD_4_4 = (10, 11, 12, 10, 11, 2004, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
REC_EXPECTED_FIELDS_RECORD_4_META = (1390826722, 1390820405, 1390826722, 65535, 4)
# Recovered Record #10 has 10 instrument records.
REC_RECORD_10 = \
'\x00\x00\x02\xED\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01\x02' \
'\xD1\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00\x40' \
'\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00\x42' \
'\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x02\x03\x04\x02\x03\xD1' \
'\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80\x7A' \
'\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44\x9C' \
'\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04\xD1\x07' \
'\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0\x7A\x44' \
'\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46\x9C\x46' \
'\x00\x66\xEA\x46\x00\x43\x1C\x47\x04\x05\x06\x04\x05\xD1\x07\x00' \
'\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00\x7B\x44\x00' \
'\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48\x9C\x46\x00' \
'\x68\xEA\x46\x00\x44\x1C\x47\x05\x06\x07\x05\x06\xD1\x07\x00\x00' \
'\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40\x7B\x44\x00\xA0' \
'\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A\x9C\x46\x00\x6A' \
'\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07\xD1\x07\x00\x00\xD4' \
'\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80\x7B\x44\x00\xC0\xFA' \
'\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C\x9C\x46\x00\x6C\xEA' \
'\x46\x00\x46\x1C\x47\x07\x08\x09\x07\x08\xD1\x07\x00\x00\xD6\x42' \
'\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0\x7B\x44\x00\xE0\xFA\x44' \
'\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E\x9C\x46\x00\x6E\xEA\x46' \
'\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xD1\x07\x00\x00\xD8\x42\x00' \
'\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C\x44\x00\x00\xFB\x44\x00' \
'\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C\x46\x00\x70\xEA\x46\x00' \
'\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xD1\x07\x00\x00\xDA\x42\x00\x00' \
'\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44\x00\x20\xFB\x44\x00\x10' \
'\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46\x00\x72\xEA\x46\x00\x49' \
'\x1C\x47\x0A\x0B\x0C\x0A\x0B\xD1\x07\x00\x00\xDC\x42\x00\x00\x52' \
'\x43\x00\x00\x9B\x43\x00\x80\x7C\x44\x00\x40\xFB\x44\x00\x20\x3C' \
'\x45\x00\x68\x1C\x46\x00\x54\x9C\x46\x00\x74\xEA\x46\x00\x4A\x1C' \
'\x47\x52\xE6\x3C\x32\x52\xE6\x54\xDF'
# Expected results for Recovered record #10.
REC_EXPECTED_FIELDS_RECORD_10_1 = (1, 2, 3, 1, 2, 2001, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
REC_EXPECTED_FIELDS_RECORD_10_2 = (2, 3, 4, 2, 3, 2001, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
REC_EXPECTED_FIELDS_RECORD_10_3 = (3, 4, 5, 3, 4, 2001, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
REC_EXPECTED_FIELDS_RECORD_10_4 = (4, 5, 6, 4, 5, 2001, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
REC_EXPECTED_FIELDS_RECORD_10_5 = (5, 6, 7, 5, 6, 2001, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
REC_EXPECTED_FIELDS_RECORD_10_6 = (6, 7, 8, 6, 7, 2001, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
REC_EXPECTED_FIELDS_RECORD_10_7 = (7, 8, 9, 7, 8, 2001, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
REC_EXPECTED_FIELDS_RECORD_10_8 = (8, 9, 10, 8, 9, 2001, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
REC_EXPECTED_FIELDS_RECORD_10_9 = (9, 10, 11, 9, 10, 2001, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
REC_EXPECTED_FIELDS_RECORD_10_10 = (10, 11, 12, 10, 11, 2001, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
REC_EXPECTED_FIELDS_RECORD_10_META = (1390826719, 1390820402, 1390826719, 65535, 10)
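The timestamps in the metadata tuples (e.g. 1390820402 and 1390826719 above) correspond to the eight trailing bytes of each data block, read as two big-endian uint32 epoch values. A sketch of that decoding — the big-endian layout is inferred from the constants, not taken from the parser:

```python
import struct

# Trailing footer bytes of REC_RECORD_10, read as two big-endian uint32
# epoch timestamps (layout inferred from the expected metadata tuples).
footer = b'\x52\xE6\x3C\x32\x52\xE6\x54\xDF'
first_time, second_time = struct.unpack('>II', footer)
```

Both decoded values appear in REC_EXPECTED_FIELDS_RECORD_10_META above.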
# Recovered Record 2_10 has 2 blocks, with 4 instrument records
# in the first block and 6 instrument records in the second block.
REC_RECORD_2_10 = \
'\x00\x00\x01\xD3\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01\x02' \
'\xD2\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00\x40' \
'\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00\x42' \
'\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x02\x03\x04\x02\x03\xD2' \
'\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80\x7A' \
'\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44\x9C' \
'\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04\xD2\x07' \
'\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0\x7A\x44' \
'\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46\x9C\x46' \
'\x00\x66\xEA\x46\x00\x43\x1C\x47\x04\x05\x06\x04\x05\xD2\x07\x00' \
'\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00\x7B\x44\x00' \
'\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48\x9C\x46\x00' \
'\x68\xEA\x46\x00\x44\x1C\x47\x52\xE6\x3C\x32\x52\xE6\x54\xDF' \
'\x00\x00\x02\x31\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x05\x06\x07\x05\x06' \
'\xD2\x07\x00\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40' \
'\x7B\x44\x00\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A' \
'\x9C\x46\x00\x6A\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07\xD2' \
'\x07\x00\x00\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80\x7B' \
'\x44\x00\xC0\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C\x9C' \
'\x46\x00\x6C\xEA\x46\x00\x46\x1C\x47\x07\x08\x09\x07\x08\xD2\x07' \
'\x00\x00\xD6\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0\x7B\x44' \
'\x00\xE0\xFA\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E\x9C\x46' \
'\x00\x6E\xEA\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xD2\x07\x00' \
'\x00\xD8\x42\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C\x44\x00' \
'\x00\xFB\x44\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C\x46\x00' \
'\x70\xEA\x46\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xD2\x07\x00\x00' \
'\xDA\x42\x00\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44\x00\x20' \
'\xFB\x44\x00\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46\x00\x72' \
'\xEA\x46\x00\x49\x1C\x47\x0A\x0B\x0C\x0A\x0B\xD2\x07\x00\x00\xDC' \
'\x42\x00\x00\x52\x43\x00\x00\x9B\x43\x00\x80\x7C\x44\x00\x40\xFB' \
'\x44\x00\x20\x3C\x45\x00\x68\x1C\x46\x00\x54\x9C\x46\x00\x74\xEA' \
'\x46\x00\x4A\x1C\x47\x52\xE6\x3C\x33\x52\xE6\x54\xE0'
# Expected results for Recovered record #2-10.
REC_EXPECTED_FIELDS_RECORD_2_10_1_1 = (1, 2, 3, 1, 2, 2002, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
REC_EXPECTED_FIELDS_RECORD_2_10_1_2 = (2, 3, 4, 2, 3, 2002, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
REC_EXPECTED_FIELDS_RECORD_2_10_1_3 = (3, 4, 5, 3, 4, 2002, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
REC_EXPECTED_FIELDS_RECORD_2_10_1_4 = (4, 5, 6, 4, 5, 2002, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
REC_EXPECTED_FIELDS_RECORD_2_10_1_META = (1390826719, 1390820402, 1390826719, 65535, 4)
REC_EXPECTED_FIELDS_RECORD_2_10_2_1 = (5, 6, 7, 5, 6, 2002, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_2 = (6, 7, 8, 6, 7, 2002, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_3 = (7, 8, 9, 7, 8, 2002, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_4 = (8, 9, 10, 8, 9, 2002, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_5 = (9, 10, 11, 9, 10, 2002, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_6 = (10, 11, 12, 10, 11, 2002, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
REC_EXPECTED_FIELDS_RECORD_2_10_2_META = (1390826720, 1390820403, 1390826720, 65535, 6)
# Recovered file with excess bytes at the end of the metadata record.
# Used to test exception processing. No expected results.
REC_EXCESS_METADATA = \
'\x00\x00\x01\x51\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01\x02' \
'\xE4\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00\x40' \
'\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00\x42' \
'\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x52\xE6\x3C\x32\x52\xE6' \
'\x54\xDF\x45\x78\x63\x65\x73\x73\x20\x64\x61\x74\x61'
# Telemetered Record #1 has 1 SIO block with 1 instrument record.
TEL_RECORD_1 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x35\x34\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x31\x5F\x30\x39\x44\x34' \
'\x02\x00\x00\x01\x46\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01' \
'\x02\xDE\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00' \
'\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00' \
'\x42\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x52\xE6\x3C\x32\x52' \
'\xE6\x54\xDF\x00\x0A\x03'
# Expected results for Telemetered record #1.
TEL_EXPECTED_FIELDS_RECORD_1_1 = (1, 2, 3, 1, 2, 2014, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
TEL_EXPECTED_FIELDS_RECORD_1_META = (1374902331, 1390820402, 1390826719, 65535,
1, 10, 1374902331)
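The expected-field tuples above can be checked directly against the raw payloads. A minimal sketch, assuming each instrument record is 47 bytes laid out as 5 unsigned bytes, a little-endian uint16 year, and 10 little-endian float32 values, with each block ending in two big-endian uint32 POSIX timestamps — both layouts are inferred from this test data, not from a documented format:

```python
import struct

# The 47 payload bytes of TEL_RECORD_1's single instrument record
# (assumed layout: 5 x uint8, uint16 year, 10 x float32, little-endian).
record = (b'\x01\x02\x03\x01\x02\xDE\x07'
          b'\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43'
          b'\x00\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45'
          b'\x00\x44\x1C\x46\x00\x42\x9C\x46\x00\x62\xEA\x46'
          b'\x00\x41\x1C\x47')
fields = struct.unpack('<5BH10f', record)
# fields matches TEL_EXPECTED_FIELDS_RECORD_1_1:
# (1, 2, 3, 1, 2, 2014, 101.0, ..., 40001.0)

# The two big-endian uint32 values near the end of the block decode to
# the POSIX timestamps that appear in the _META tuples.
times = struct.unpack('>II', b'\x52\xE6\x3C\x32\x52\xE6\x54\xDF')
# times == (1390820402, 1390826719)
```

All of the float values here are small integers, so the float32 round trip is exact and the unpacked tuple compares equal to the expected tuple.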
# Telemetered Record #2 has 1 SIO block with 2 instrument records.
TEL_RECORD_2 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x38\x31\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x32\x5F\x37\x41\x43\x41' \
'\x02\x00\x00\x01\x75\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x02\x03\x04\x02' \
'\x03\xDE\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00' \
'\x80\x7A\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00' \
'\x44\x9C\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04' \
'\xDE\x07\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0' \
'\x7A\x44\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46' \
'\x9C\x46\x00\x66\xEA\x46\x00\x43\x1C\x47\x52\xE6\x3C\x33\x52\xE6' \
'\x54\xE0\x03'
# Expected results for Telemetered record #2.
TEL_EXPECTED_FIELDS_RECORD_2_1 = (2, 3, 4, 2, 3, 2014, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
TEL_EXPECTED_FIELDS_RECORD_2_2 = (3, 4, 5, 3, 4, 2014, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
TEL_EXPECTED_FIELDS_RECORD_2_META = (1374902331, 1390820403, 1390826720, 65535,
2, None, 1374902331)
# Telemetered Record #3 has 1 SIO block with 3 instrument records.
TEL_RECORD_3 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x42\x32\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x33\x5F\x37\x32\x42\x37' \
'\x02\x00\x00\x01\xA4\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x04\x05\x06\x04' \
'\x05\xDE\x07\x00\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00' \
'\x00\x7B\x44\x00\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00' \
'\x48\x9C\x46\x00\x68\xEA\x46\x00\x44\x1C\x47\x05\x06\x07\x05\x06' \
'\xDE\x07\x00\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40' \
'\x7B\x44\x00\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A' \
'\x9C\x46\x00\x6A\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07\xDE' \
'\x07\x00\x00\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80\x7B' \
'\x44\x00\xC0\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C\x9C' \
'\x46\x00\x6C\xEA\x46\x00\x46\x1C\x47\x52\xE6\x3C\x34\x52\xE6\x54' \
'\xE1\x00\x0C\x03'
# Expected results for Telemetered record #3.
TEL_EXPECTED_FIELDS_RECORD_3_1 = (4, 5, 6, 4, 5, 2014, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
TEL_EXPECTED_FIELDS_RECORD_3_2 = (5, 6, 7, 5, 6, 2014, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
TEL_EXPECTED_FIELDS_RECORD_3_3 = (6, 7, 8, 6, 7, 2014, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
TEL_EXPECTED_FIELDS_RECORD_3_META = (1374902331, 1390820404, 1390826721, 65535,
3, 12, 1374902331)
# Telemetered Record #4 has 1 SIO block with 4 instrument records.
TEL_RECORD_4 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x44\x46\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x34\x5F\x45\x42\x30\x42' \
'\x02\x00\x00\x01\xD3\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x07\x08\x09\x07' \
'\x08\xDE\x07\x00\x00\xD6\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00' \
'\xC0\x7B\x44\x00\xE0\xFA\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00' \
'\x4E\x9C\x46\x00\x6E\xEA\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09' \
'\xDE\x07\x00\x00\xD8\x42\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00' \
'\x7C\x44\x00\x00\xFB\x44\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50' \
'\x9C\x46\x00\x70\xEA\x46\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xDE' \
'\x07\x00\x00\xDA\x42\x00\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C' \
'\x44\x00\x20\xFB\x44\x00\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C' \
'\x46\x00\x72\xEA\x46\x00\x49\x1C\x47\x0A\x0B\x0C\x0A\x0B\xDE\x07' \
'\x00\x00\xDC\x42\x00\x00\x52\x43\x00\x00\x9B\x43\x00\x80\x7C\x44' \
'\x00\x40\xFB\x44\x00\x20\x3C\x45\x00\x68\x1C\x46\x00\x54\x9C\x46' \
'\x00\x74\xEA\x46\x00\x4A\x1C\x47\x52\xE6\x3C\x35\x52\xE6\x54\xE2' \
'\x03'
# Expected results for Telemetered record #4.
TEL_EXPECTED_FIELDS_RECORD_4_1 = (7, 8, 9, 7, 8, 2014, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
TEL_EXPECTED_FIELDS_RECORD_4_2 = (8, 9, 10, 8, 9, 2014, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
TEL_EXPECTED_FIELDS_RECORD_4_3 = (9, 10, 11, 9, 10, 2014, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
TEL_EXPECTED_FIELDS_RECORD_4_4 = (10, 11, 12, 10, 11, 2014, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
TEL_EXPECTED_FIELDS_RECORD_4_META = (1374902331, 1390820405, 1390826722, 65535,
4, None, 1374902331)
# Telemetered Record #10 has 1 SIO block with 10 instrument records.
TEL_RECORD_10 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x32\x46\x42\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x31\x5F\x45\x38\x33\x41' \
'\x02\x00\x00\x02\xED\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01' \
'\x02\xDB\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00' \
'\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00' \
'\x42\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x02\x03\x04\x02\x03' \
'\xDB\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80' \
'\x7A\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44' \
'\x9C\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04\xDB' \
'\x07\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0\x7A' \
'\x44\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46\x9C' \
'\x46\x00\x66\xEA\x46\x00\x43\x1C\x47\x04\x05\x06\x04\x05\xDB\x07' \
'\x00\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00\x7B\x44' \
'\x00\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48\x9C\x46' \
'\x00\x68\xEA\x46\x00\x44\x1C\x47\x05\x06\x07\x05\x06\xDB\x07\x00' \
'\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40\x7B\x44\x00' \
'\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A\x9C\x46\x00' \
'\x6A\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07\xDB\x07\x00\x00' \
'\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80\x7B\x44\x00\xC0' \
'\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C\x9C\x46\x00\x6C' \
'\xEA\x46\x00\x46\x1C\x47\x07\x08\x09\x07\x08\xDB\x07\x00\x00\xD6' \
'\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0\x7B\x44\x00\xE0\xFA' \
'\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E\x9C\x46\x00\x6E\xEA' \
'\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xDB\x07\x00\x00\xD8\x42' \
'\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C\x44\x00\x00\xFB\x44' \
'\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C\x46\x00\x70\xEA\x46' \
'\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xDB\x07\x00\x00\xDA\x42\x00' \
'\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44\x00\x20\xFB\x44\x00' \
'\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46\x00\x72\xEA\x46\x00' \
'\x49\x1C\x47\x0A\x0B\x0C\x0A\x0B\xDB\x07\x00\x00\xDC\x42\x00\x00' \
'\x52\x43\x00\x00\x9B\x43\x00\x80\x7C\x44\x00\x40\xFB\x44\x00\x20' \
'\x3C\x45\x00\x68\x1C\x46\x00\x54\x9C\x46\x00\x74\xEA\x46\x00\x4A' \
'\x1C\x47\x52\xE6\x3C\x32\x52\xE6\x54\xDF\x00\x0A\x03'
# Expected results for Telemetered record #10.
TEL_EXPECTED_FIELDS_RECORD_10_1 = (1, 2, 3, 1, 2, 2011, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
TEL_EXPECTED_FIELDS_RECORD_10_2 = (2, 3, 4, 2, 3, 2011, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
TEL_EXPECTED_FIELDS_RECORD_10_3 = (3, 4, 5, 3, 4, 2011, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
TEL_EXPECTED_FIELDS_RECORD_10_4 = (4, 5, 6, 4, 5, 2011, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
TEL_EXPECTED_FIELDS_RECORD_10_5 = (5, 6, 7, 5, 6, 2011, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
TEL_EXPECTED_FIELDS_RECORD_10_6 = (6, 7, 8, 6, 7, 2011, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
TEL_EXPECTED_FIELDS_RECORD_10_7 = (7, 8, 9, 7, 8, 2011, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
TEL_EXPECTED_FIELDS_RECORD_10_8 = (8, 9, 10, 8, 9, 2011, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
TEL_EXPECTED_FIELDS_RECORD_10_9 = (9, 10, 11, 9, 10, 2011, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
TEL_EXPECTED_FIELDS_RECORD_10_10 = (10, 11, 12, 10, 11, 2011, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
TEL_EXPECTED_FIELDS_RECORD_10_META = (1374902331, 1390820402, 1390826719, 65535,
10, 10, 1374902331)
# Telemetered Record 2_10 has 2 SIO blocks, with 4 instrument records
# in the first block and 6 instrument records in the second block.
TEL_RECORD_2_10 = \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x45\x31\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x31\x5F\x30\x46\x42\x41' \
'\x02\x00\x00\x01\xD3\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01' \
'\x02\xDC\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00' \
'\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00' \
'\x42\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x02\x03\x04\x02\x03' \
'\xDC\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80' \
'\x7A\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44' \
'\x9C\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x03\x04\x05\x03\x04\xDC' \
'\x07\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00\xC0\x7A' \
'\x44\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00\x46\x9C' \
'\x46\x00\x66\xEA\x46\x00\x43\x1C\x47\x04\x05\x06\x04\x05\xDC\x07' \
'\x00\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00\x7B\x44' \
'\x00\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48\x9C\x46' \
'\x00\x68\xEA\x46\x00\x44\x1C\x47\x52\xE6\x3C\x32\x52\xE6\x54\xDF' \
'\x00\x0A\x03' \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x32\x33\x44\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x32\x5F\x33\x44\x32\x33' \
'\x02\x00\x00\x02\x31\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x05\x06\x07\x05' \
'\x06\xDC\x07\x00\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00' \
'\x40\x7B\x44\x00\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00' \
'\x4A\x9C\x46\x00\x6A\xEA\x46\x00\x45\x1C\x47\x06\x07\x08\x06\x07' \
'\xDC\x07\x00\x00\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00\x80' \
'\x7B\x44\x00\xC0\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00\x4C' \
'\x9C\x46\x00\x6C\xEA\x46\x00\x46\x1C\x47\x07\x08\x09\x07\x08\xDC' \
'\x07\x00\x00\xD6\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0\x7B' \
'\x44\x00\xE0\xFA\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E\x9C' \
'\x46\x00\x6E\xEA\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xDC\x07' \
'\x00\x00\xD8\x42\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C\x44' \
'\x00\x00\xFB\x44\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C\x46' \
'\x00\x70\xEA\x46\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xDC\x07\x00' \
'\x00\xDA\x42\x00\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44\x00' \
'\x20\xFB\x44\x00\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46\x00' \
'\x72\xEA\x46\x00\x49\x1C\x47\x0A\x0B\x0C\x0A\x0B\xDC\x07\x00\x00' \
'\xDC\x42\x00\x00\x52\x43\x00\x00\x9B\x43\x00\x80\x7C\x44\x00\x40' \
'\xFB\x44\x00\x20\x3C\x45\x00\x68\x1C\x46\x00\x54\x9C\x46\x00\x74' \
'\xEA\x46\x00\x4A\x1C\x47\x52\xE6\x3C\x33\x52\xE6\x54\xE0\x03'
# Expected results for Telemetered record #2-10.
TEL_EXPECTED_FIELDS_RECORD_2_10_1_1 = (1, 2, 3, 1, 2, 2012, 101.0, 201.0, 301.0,
1001.0, 2001.0, 3001.0,
10001.0, 20001.0, 30001.0, 40001.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_1_2 = (2, 3, 4, 2, 3, 2012, 102.0, 202.0, 302.0,
1002.0, 2002.0, 3002.0,
10002.0, 20002.0, 30002.0, 40002.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_1_3 = (3, 4, 5, 3, 4, 2012, 103.0, 203.0, 303.0,
1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_1_4 = (4, 5, 6, 4, 5, 2012, 104.0, 204.0, 304.0,
1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_1_META = (1374902331, 1390820402, 1390826719, 65535,
4, 10, 1374902331)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_1 = (5, 6, 7, 5, 6, 2012, 105.0, 205.0, 305.0,
1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_2 = (6, 7, 8, 6, 7, 2012, 106.0, 206.0, 306.0,
1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_3 = (7, 8, 9, 7, 8, 2012, 107.0, 207.0, 307.0,
1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_4 = (8, 9, 10, 8, 9, 2012, 108.0, 208.0, 308.0,
1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_5 = (9, 10, 11, 9, 10, 2012, 109.0, 209.0, 309.0,
1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_6 = (10, 11, 12, 10, 11, 2012, 110.0, 210.0, 310.0,
1010.0, 2010.0, 3010.0,
10010.0, 20010.0, 30010.0, 40010.0)
TEL_EXPECTED_FIELDS_RECORD_2_10_2_META = (1374902331, 1390820403, 1390826720, 65535,
6, None, 1374902331)
# Telemetered data with 3 SIO blocks.
# The first block (instrument ID 'PS') is not a vel3d_l block and is
# skipped; it is followed by 2 'WA' blocks that we do want.
# The 2 vel3d_l blocks have 3 and 4 instrument records, respectively,
# plus 1 metadata record per block.
TEL_SIO_PS_WA_WA = \
'\x01\x50\x53\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x38\x33\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x31\x5F\x42\x37\x31\x34' \
'\x02\x00\x00\x01\x75\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x01\x02\x03\x01' \
'\x02\xDD\x07\x00\x00\xCA\x42\x00\x00\x49\x43\x00\x80\x96\x43\x00' \
'\x40\x7A\x44\x00\x20\xFA\x44\x00\x90\x3B\x45\x00\x44\x1C\x46\x00' \
'\x42\x9C\x46\x00\x62\xEA\x46\x00\x41\x1C\x47\x02\x03\x04\x02\x03' \
'\xDD\x07\x00\x00\xCC\x42\x00\x00\x4A\x43\x00\x00\x97\x43\x00\x80' \
'\x7A\x44\x00\x40\xFA\x44\x00\xA0\x3B\x45\x00\x48\x1C\x46\x00\x44' \
'\x9C\x46\x00\x64\xEA\x46\x00\x42\x1C\x47\x52\xE6\x3C\x32\x52\xE6' \
'\x54\xDF\x00\x0A\x03' \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x42\x30\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x32\x5F\x31\x32\x43\x36' \
'\x02\x00\x00\x01\xA4\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x03\x04\x05\x03' \
'\x04\xDD\x07\x00\x00\xCE\x42\x00\x00\x4B\x43\x00\x80\x97\x43\x00' \
'\xC0\x7A\x44\x00\x60\xFA\x44\x00\xB0\x3B\x45\x00\x4C\x1C\x46\x00' \
'\x46\x9C\x46\x00\x66\xEA\x46\x00\x43\x1C\x47\x04\x05\x06\x04\x05' \
'\xDD\x07\x00\x00\xD0\x42\x00\x00\x4C\x43\x00\x00\x98\x43\x00\x00' \
'\x7B\x44\x00\x80\xFA\x44\x00\xC0\x3B\x45\x00\x50\x1C\x46\x00\x48' \
'\x9C\x46\x00\x68\xEA\x46\x00\x44\x1C\x47\x05\x06\x07\x05\x06\xDD' \
'\x07\x00\x00\xD2\x42\x00\x00\x4D\x43\x00\x80\x98\x43\x00\x40\x7B' \
'\x44\x00\xA0\xFA\x44\x00\xD0\x3B\x45\x00\x54\x1C\x46\x00\x4A\x9C' \
'\x46\x00\x6A\xEA\x46\x00\x45\x1C\x47\x52\xE6\x3C\x33\x52\xE6\x54' \
'\xE0\x03' \
'\x01\x57\x41\x31\x32\x33\x34\x35\x36\x39\x5F\x30\x31\x45\x31\x48' \
'\x35\x31\x46\x33\x35\x38\x33\x42\x5F\x30\x33\x5F\x37\x39\x38\x36' \
'\x02\x00\x00\x01\xD3\x31\x32\x33\xFF\xFF\x00\x00\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58' \
'\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x58\x06\x07\x08\x06' \
'\x07\xDD\x07\x00\x00\xD4\x42\x00\x00\x4E\x43\x00\x00\x99\x43\x00' \
'\x80\x7B\x44\x00\xC0\xFA\x44\x00\xE0\x3B\x45\x00\x58\x1C\x46\x00' \
'\x4C\x9C\x46\x00\x6C\xEA\x46\x00\x46\x1C\x47\x07\x08\x09\x07\x08' \
'\xDD\x07\x00\x00\xD6\x42\x00\x00\x4F\x43\x00\x80\x99\x43\x00\xC0' \
'\x7B\x44\x00\xE0\xFA\x44\x00\xF0\x3B\x45\x00\x5C\x1C\x46\x00\x4E' \
'\x9C\x46\x00\x6E\xEA\x46\x00\x47\x1C\x47\x08\x09\x0A\x08\x09\xDD' \
'\x07\x00\x00\xD8\x42\x00\x00\x50\x43\x00\x00\x9A\x43\x00\x00\x7C' \
'\x44\x00\x00\xFB\x44\x00\x00\x3C\x45\x00\x60\x1C\x46\x00\x50\x9C' \
'\x46\x00\x70\xEA\x46\x00\x48\x1C\x47\x09\x0A\x0B\x09\x0A\xDD\x07' \
'\x00\x00\xDA\x42\x00\x00\x51\x43\x00\x80\x9A\x43\x00\x40\x7C\x44' \
'\x00\x20\xFB\x44\x00\x10\x3C\x45\x00\x64\x1C\x46\x00\x52\x9C\x46' \
'\x00\x72\xEA\x46\x00\x49\x1C\x47\x52\xE6\x3C\x34\x52\xE6\x54\xE1' \
'\x00\x0C\x03'
# Expected results for SIO_PS_WA_WA.
# First block will produce no vel3d_l particles.
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_1 = (3, 4, 5, 3, 4, 2013,
103.0, 203.0, 303.0, 1003.0, 2003.0, 3003.0,
10003.0, 20003.0, 30003.0, 40003.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_2 = (4, 5, 6, 4, 5, 2013,
104.0, 204.0, 304.0, 1004.0, 2004.0, 3004.0,
10004.0, 20004.0, 30004.0, 40004.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_3 = (5, 6, 7, 5, 6, 2013,
105.0, 205.0, 305.0, 1005.0, 2005.0, 3005.0,
10005.0, 20005.0, 30005.0, 40005.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_META = (1374902331, 1390820403, 1390826720,
65535, 3, None, 1374902331)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_1 = (6, 7, 8, 6, 7, 2013,
106.0, 206.0, 306.0, 1006.0, 2006.0, 3006.0,
10006.0, 20006.0, 30006.0, 40006.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_2 = (7, 8, 9, 7, 8, 2013,
107.0, 207.0, 307.0, 1007.0, 2007.0, 3007.0,
10007.0, 20007.0, 30007.0, 40007.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_3 = (8, 9, 10, 8, 9, 2013,
108.0, 208.0, 308.0, 1008.0, 2008.0, 3008.0,
10008.0, 20008.0, 30008.0, 40008.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_4 = (9, 10, 11, 9, 10, 2013,
109.0, 209.0, 309.0, 1009.0, 2009.0, 3009.0,
10009.0, 20009.0, 30009.0, 40009.0)
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_META = (1374902331, 1390820404, 1390826721,
65535, 4, 12, 1374902331)
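Each telemetered block opens with an ASCII SIO header: '\x01', a 2-character instrument ID ('WA' for vel3d_l, 'PS' for the skipped block above), then ASCII fields including a 4-hex-digit payload length and an 8-hex-digit POSIX timestamp, terminated by '\x02'. A hedged sketch, with field offsets inferred from this test data rather than from a documented format:

```python
# SIO header taken from the start of TEL_RECORD_1 (offsets inferred
# from the test data, not from a documented format).
header = b'\x01WA1234569_0154H51F3583B_01_09D4\x02'

instrument_id = header[1:3]            # b'WA' (b'PS' blocks are skipped)
data_length = int(header[11:15], 16)   # 0x0154 = 340 payload bytes
posix_time = int(header[16:24], 16)    # 0x51F3583B = 1374902331
# posix_time is the first element of each TEL ..._META tuple above.
```

This explains why every TEL _META tuple in this section begins with 1374902331: all the telemetered headers carry the same '51F3583B' timestamp.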
# Definitions of input data files and expected results files.
SIMPLE_LOG_FILE = 'tel_vel3d_l_1.dat'
YAML_FILE = 'tel_vel3d_l_1.yml'
NUM_REC_SIMPLE_LOG_FILE = 11
LARGE_LOG_FILE = 'tel_node58p1_0_wa_wfp.dat'
LARGE_YAML_FILE = 'tel_node58p1_0_wa_wfp.yml'
DEC_LOG_FILE = 'tel_node15p1.dat'
DEC_YAML_FILE = 'tel_node15p1.yml'
MIX_LOG_FILE = 'tel_vel3d_l_4.dat'
MIX_YAML_FILE = 'tel_vel3d_l_4.yml'
REC_LOG_FILE_SIMPLE = 'rec_vel3d_l_2.dat'
REC_YAML_FILE_SIMPLE = 'rec_vel3d_l_2.yml'
REC_LOG_FILE_1 = 'A0000001.DAT'
REC_JSON_FILE_1 = 'A0000001.json'
REC_LOG_FILE_2 = 'A0000001_PAPA14.dat'
REC_JSON_FILE_2 = 'A0000001_PAPA14.json'
# End of file-based test definitions.
# The list of generated tests is the suggested set; additional tests may
# be needed to fully exercise the parser.
@attr('UNIT', group='mi')
class Vel3dLWfpParserUnitTestCase(ParserUnitTestCase):
"""
vel3d_l_wfp Parser unit test suite
"""
def create_expected_results(self):
self.create_rec_expected_results()
self.create_tel_expected_results()
def create_rec_expected_results(self):
"""
This function creates the recovered data expected particle results.
"""
        self.rec_expected_particle_1_1 = Vel3dLWfpInstrumentRecoveredParticle(
            REC_EXPECTED_FIELDS_RECORD_1_1, internal_timestamp=3281994123.0,
            preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_1, internal_timestamp=3284762584.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_2, internal_timestamp=3287358245.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_3_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_3_1, internal_timestamp=3290126706.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_3_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_3_2, internal_timestamp=3292808767.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_3_3 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_3_3, internal_timestamp=3295577228.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_4_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_4_1, internal_timestamp=3298259289.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_4_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_4_2, internal_timestamp=3301027750.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_4_3 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_4_3, internal_timestamp=3303796211.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_4_4 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_4_4, internal_timestamp=3306478272.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_1_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_1_META, internal_timestamp=3599815519.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_3_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_3_META, internal_timestamp=3599815521.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_4_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_4_META, internal_timestamp=3599815522.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
#
self.rec_expected_particle_10_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_1, internal_timestamp=3187386123.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_2, internal_timestamp=3190154584.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_3 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_3, internal_timestamp=3192663845.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_4 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_4, internal_timestamp=3195432306.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_5 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_5, internal_timestamp=3198114367.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_6 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_6, internal_timestamp=3200882828.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_7 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_7, internal_timestamp=3203564889.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_8 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_8, internal_timestamp=3206333350.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_9 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_9, internal_timestamp=3209101811.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_10 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_10, internal_timestamp=3211783872.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_10_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_10_META, internal_timestamp=3599815519.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
# The following are for the multiple block file.
self.rec_expected_particle_2_10_1_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_1_1, internal_timestamp=3218922123.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_1_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_1_2, internal_timestamp=3221690584.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_1_3 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_1_3, internal_timestamp=3224199845.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_1_4 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_1_4, internal_timestamp=3226968306.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_1 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_1, internal_timestamp=3229650367.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_2 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_2, internal_timestamp=3232418828.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_3 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_3, internal_timestamp=3235100889.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_4 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_4, internal_timestamp=3237869350.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_5 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_5, internal_timestamp=3240637811.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_6 = Vel3dLWfpInstrumentRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_6, internal_timestamp=3243319872.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_1_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_1_META, internal_timestamp=3599815519.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.rec_expected_particle_2_10_2_meta = Vel3dLWfpMetadataRecoveredParticle(
REC_EXPECTED_FIELDS_RECORD_2_10_2_META, internal_timestamp=3599815520.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
def create_tel_expected_results(self):
"""
This function creates the telemetered data expected particle results.
"""
# The first number refers to the SIO record number.
# The second number refers to the FSI record within the SIO block.
self.tel_expected_particle_1_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_1_1, internal_timestamp=3597613323.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_1, internal_timestamp=3600381784.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_2, internal_timestamp=3602891045.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_3_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_3_1, internal_timestamp=3605659506.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_3_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_3_2, internal_timestamp=3608341567.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_3_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_3_3, internal_timestamp=3611110028.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_4_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_4_1, internal_timestamp=3613792089.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_4_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_4_2, internal_timestamp=3616560550.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_4_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_4_3, internal_timestamp=3619329011.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_4_4 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_4_4, internal_timestamp=3622011072.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_1_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_1_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_2_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_3_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_3_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_4_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_4_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_1, internal_timestamp=3502918923.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_2, internal_timestamp=3505687384.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_3, internal_timestamp=3508196645.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_4 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_4, internal_timestamp=3510965106.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_5 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_5, internal_timestamp=3513647167.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_6 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_6, internal_timestamp=3516415628.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_7 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_7, internal_timestamp=3519097689.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_8 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_8, internal_timestamp=3521866150.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_9 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_9, internal_timestamp=3524634611.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_10 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_10_10, internal_timestamp=3527316672.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_10_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_10_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
# The following are for the multiple SIO block file.
self.tel_expected_particle_2_10_1_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_1_1, internal_timestamp=3534454923.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_1_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_1_2, internal_timestamp=3537223384.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_1_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_1_3, internal_timestamp=3539819045.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_1_4 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_1_4, internal_timestamp=3542587506.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_1, internal_timestamp=3545269567.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_2, internal_timestamp=3548038028.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_3, internal_timestamp=3550720089.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_4 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_4, internal_timestamp=3553488550.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_5 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_5, internal_timestamp=3556257011.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_6 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_6, internal_timestamp=3558939072.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_1_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_1_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_2_10_2_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_RECORD_2_10_2_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_2_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_1, internal_timestamp=3571355045.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_2_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_2, internal_timestamp=3574123506.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_2_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_3, internal_timestamp=3576805567.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_2_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_2_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_3_1 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_1, internal_timestamp=3579574028.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_3_2 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_2, internal_timestamp=3582256089.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_3_3 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_3, internal_timestamp=3585024550.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_3_4 = Vel3dLWfpInstrumentParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_4, internal_timestamp=3587793011.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
self.tel_expected_particle_sio_ps_wa_wa_3_meta = Vel3dLWfpSioMuleMetadataParticle(
TEL_EXPECTED_FIELDS_SIO_PS_WA_WA_3_META, internal_timestamp=3583891131.0,
preferred_timestamp=DataParticleKey.INTERNAL_TIMESTAMP)
def create_rec_parser(self, file_handle):
"""
This function creates a Vel3d_l_Wfp parser for recovered data.
"""
parser = Vel3dLWfpParser(self.rec_config, file_handle, self.exception_callback)
return parser
def create_tel_parser(self, file_handle):
"""
This function creates a Vel3d_l_Wfp_Sio_Mule parser for telemetered data.
"""
parser = Vel3dLWfpSioParser(self.tel_config, file_handle, self.exception_callback)
return parser
def open_file(self, filename):
fid = open(os.path.join(RESOURCE_PATH, filename), mode='rb')
return fid
def open_file_write(self, filename):
fid = open(os.path.join(RESOURCE_PATH, filename), mode='w')
return fid
def verify_contents(self, actual_particle, expected_particle):
self.assertEqual(actual_particle, [expected_particle])
def setUp(self):
ParserUnitTestCase.setUp(self)
self.rec_config = {
DataSetDriverConfigKeys.PARTICLE_MODULE:
'mi.dataset.parser.vel3d_l_wfp',
DataSetDriverConfigKeys.PARTICLE_CLASS:
['Vel3dLWfpInstrumentRecoveredParticle',
'Vel3dLWfpMetadataRecoveredParticle']
}
self.tel_config = {
DataSetDriverConfigKeys.PARTICLE_MODULE:
'mi.dataset.parser.vel3d_l_wfp',
DataSetDriverConfigKeys.PARTICLE_CLASS:
['Vel3dLWfpInstrumentParticle',
'Vel3dLWfpSioMuleMetadataParticle']
}
# Define test data particles and their associated timestamps which will be
# compared with returned results
self.rec_file_ingested_value = None
self.exception_callback_value = []
self.maxDiff = None
self.create_expected_results()
def test_rec_excess_data(self):
"""
This test verifies that excess bytes at the end of the metadata record
are detected and a sample exception is raised.
"""
log.debug("============== START RECOVERED EXCESS DATA ==================")
log.debug("Recovered Excess Data length %d", len(REC_EXCESS_METADATA))
input_file = StringIO(REC_EXCESS_METADATA)
self.parser = self.create_rec_parser(input_file)
result = self.parser.get_records(1)
        self.assertEqual(len(result), 1)
        self.assertEqual(len(self.exception_callback_value), 1)
        self.assertTrue(isinstance(self.exception_callback_value[0], SampleException))
log.debug("============== END RECOVERED EXCESS DATA ==================")
def test_rec_get_many(self):
"""
Read test data and pull out multiple data particles at one time.
Assert that the results are those we expected.
"""
log.debug("=================== START RECOVERED MANY ======================")
log.debug("Recovered Many length %d", len(REC_RECORD_4))
input_file = StringIO(REC_RECORD_4)
self.parser = self.create_rec_parser(input_file)
log.debug("REC MANY VERIFY RECORDS 1-4")
result = self.parser.get_records(4)
self.assertEqual(result, [self.rec_expected_particle_4_1,
self.rec_expected_particle_4_2,
self.rec_expected_particle_4_3,
self.rec_expected_particle_4_4])
log.debug("REC MANY VERIFY METADATA RECORD")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_4_meta)
log.debug("=================== END RECOVERED MANY ======================")
def test_rec_long_stream(self):
"""
Test a long stream
"""
log.debug("============== START RECOVERED LONG STREAM ==================")
log.debug("Recovered Long Stream length %d", len(REC_RECORD_10))
input_file = StringIO(REC_RECORD_10)
self.parser = self.create_rec_parser(input_file)
result = self.parser.get_records(10)
self.assertEqual(result, [self.rec_expected_particle_10_1,
self.rec_expected_particle_10_2,
self.rec_expected_particle_10_3,
self.rec_expected_particle_10_4,
self.rec_expected_particle_10_5,
self.rec_expected_particle_10_6,
self.rec_expected_particle_10_7,
self.rec_expected_particle_10_8,
self.rec_expected_particle_10_9,
self.rec_expected_particle_10_10])
log.debug("REC LONG STREAM VERIFY METADATA RECORD")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_10_meta)
self.assertEqual(self.exception_callback_value, [])
log.debug("============== END RECOVERED LONG STREAM ==================")
def test_rec_multiple_blocks(self):
"""
This function verifies that multiple blocks can be read.
"""
log.debug("============ START RECOVERED MULTIPLE BLOCKS ================")
log.debug("Recovered Multiple Blocks length %d", len(REC_RECORD_2_10))
input_file = StringIO(REC_RECORD_2_10)
self.parser = self.create_rec_parser(input_file)
result = self.parser.get_records(12)
self.assertEqual(result, [self.rec_expected_particle_2_10_1_1,
self.rec_expected_particle_2_10_1_2,
self.rec_expected_particle_2_10_1_3,
self.rec_expected_particle_2_10_1_4,
self.rec_expected_particle_2_10_1_meta,
self.rec_expected_particle_2_10_2_1,
self.rec_expected_particle_2_10_2_2,
self.rec_expected_particle_2_10_2_3,
self.rec_expected_particle_2_10_2_4,
self.rec_expected_particle_2_10_2_5,
self.rec_expected_particle_2_10_2_6,
self.rec_expected_particle_2_10_2_meta])
log.debug("============ END RECOVERED MULTIPLE BLOCKS ================")
def test_rec_simple_no_decimation(self):
"""
Read test data and pull out data particles one at a time.
Assert that the results are those we expected.
"""
log.debug("========== START RECOVERED SIMPLE WITHOUT DECIMATION ==========")
log.debug("Recovered Simple length %d", len(REC_RECORD_3))
input_file = StringIO(REC_RECORD_3)
self.parser = self.create_rec_parser(input_file)
log.debug("REC SIMPLE WITHOUT DECIMATION VERIFY RECORD 1")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_3_1)
log.debug("REC SIMPLE WITHOUT DECIMATION VERIFY RECORD 2")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_3_2)
log.debug("REC SIMPLE WITHOUT DECIMATION VERIFY RECORD 3")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_3_3)
log.debug("REC SIMPLE WITHOUT DECIMATION VERIFY METADATA RECORD")
result = self.parser.get_records(1)
self.verify_contents(result, self.rec_expected_particle_3_meta)
log.debug("========== END RECOVERED SIMPLE WITHOUT DECIMATION ==========")
def test_tel_multiple_sio_blocks(self):
"""
This function verifies that multiple SIO blocks can be read.
"""
log.debug("========== START TELEMETERED MULTIPLE SIO BLOCKS ==============")
log.debug("Telemetered Multiple SIO Blocks length %d", len(TEL_RECORD_2_10))
input_file = StringIO(TEL_RECORD_2_10)
self.parser = self.create_tel_parser(input_file)
result = self.parser.get_records(12)
self.assertEqual(result, [self.tel_expected_particle_2_10_1_1,
self.tel_expected_particle_2_10_1_2,
self.tel_expected_particle_2_10_1_3,
self.tel_expected_particle_2_10_1_4,
self.tel_expected_particle_2_10_1_meta,
self.tel_expected_particle_2_10_2_1,
self.tel_expected_particle_2_10_2_2,
self.tel_expected_particle_2_10_2_3,
self.tel_expected_particle_2_10_2_4,
self.tel_expected_particle_2_10_2_5,
self.tel_expected_particle_2_10_2_6,
self.tel_expected_particle_2_10_2_meta])
log.debug("========== END TELEMETERED MULTIPLE SIO BLOCKS ==============")
def test_tel_not_my_sio_block(self):
"""
This function verifies that non-WA SIO blocks are successfully ignored.
"""
log.debug("========== START TELEMETERED NOT MY SIO BLOCK ==============")
log.debug("Not my SIO Block length %d", len(TEL_SIO_PS_WA_WA))
input_file = StringIO(TEL_SIO_PS_WA_WA)
self.parser = self.create_tel_parser(input_file)
result = self.parser.get_records(9)
self.assertEqual(result, [self.tel_expected_particle_sio_ps_wa_wa_2_1,
self.tel_expected_particle_sio_ps_wa_wa_2_2,
self.tel_expected_particle_sio_ps_wa_wa_2_3,
self.tel_expected_particle_sio_ps_wa_wa_2_meta,
self.tel_expected_particle_sio_ps_wa_wa_3_1,
self.tel_expected_particle_sio_ps_wa_wa_3_2,
self.tel_expected_particle_sio_ps_wa_wa_3_3,
self.tel_expected_particle_sio_ps_wa_wa_3_4,
self.tel_expected_particle_sio_ps_wa_wa_3_meta])
log.debug("========== END TELEMETERED NOT MY SIO BLOCK ==============")
def test_tel_simple_no_decimation(self):
"""
Read test data and pull out data particles one at a time.
Assert that the results are those we expected.
This test verifies that a missing decimation factor is handled correctly.
"""
log.debug("========= START TELEMETERED SIMPLE NO DECIMATION =============")
log.debug("Telemetered Simple length %d", len(TEL_RECORD_1))
input_file = StringIO(TEL_RECORD_1)
self.parser = self.create_tel_parser(input_file)
log.debug("TEL SIMPLE NO DECIMATION VERIFY RECORD 1")
result = self.parser.get_records(1)
self.verify_contents(result, self.tel_expected_particle_1_1)
log.debug("TEL SIMPLE NO DECIMATION VERIFY METADATA RECORD")
result = self.parser.get_records(1)
self.verify_contents(result, self.tel_expected_particle_1_meta)
log.debug("========= END TELEMETERED SIMPLE NO DECIMATION ==============")
def test_tel_simple_with_decimation(self):
"""
Read test data and pull out data particles one at a time.
Assert that the results are those we expected.
This test verifies that a decimation factor is handled correctly.
"""
log.debug("========= START TELEMETERED SIMPLE WITH DECIMATION ===========")
log.debug("Simple length %d", len(TEL_RECORD_2))
input_file = StringIO(TEL_RECORD_2)
self.parser = self.create_tel_parser(input_file)
log.debug("SIMPLE WITH DECIMATION VERIFY RECORD 1")
result = self.parser.get_records(1)
self.verify_contents(result, self.tel_expected_particle_2_1)
log.debug("SIMPLE WITH DECIMATION VERIFY RECORD 2")
result = self.parser.get_records(1)
self.verify_contents(result, self.tel_expected_particle_2_2)
log.debug("SIMPLE WITH DECIMATION VERIFY METADATA RECORD")
result = self.parser.get_records(1)
self.verify_contents(result, self.tel_expected_particle_2_meta)
log.debug("========= END TELEMETERED SIMPLE WITH DECIMATION =============")
# ************* END OLDER TESTS -- START NEW TESTS ***********************
def test_rec_parser_simple(self):
"""
        Read data from a file and, in a single read, pull out all data
        particles. Verify that the results are those we expected.
"""
        log.debug('===== START REC SIMPLE YAML TEST =====')
in_file = self.open_file(REC_LOG_FILE_SIMPLE)
parser = self.create_rec_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(20)
self.assert_particles(result, REC_YAML_FILE_SIMPLE, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
        log.debug('===== END REC SIMPLE YAML TEST =====')
def test_rec_parser_yaml(self):
"""
        Read data from a file and, in a single read, pull out all data
        particles. Verify that the results are those we expected.
"""
        log.debug('===== START REC YAML TEST =====')
in_file = self.open_file(REC_LOG_FILE_1)
parser = self.create_rec_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(15000)
self.assert_particles(result, REC_JSON_FILE_1, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
        log.debug('===== END REC YAML TEST =====')
def test_rec_parser_large_yaml(self):
"""
        Read data from a file and, in a single read, pull out all data
        particles. Verify that the results are those we expected.
"""
        log.debug('===== START REC LARGE YAML TEST =====')
in_file = self.open_file(REC_LOG_FILE_2)
parser = self.create_rec_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(25000)
self.assert_particles(result, REC_JSON_FILE_2, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
        log.debug('===== END REC LARGE YAML TEST =====')
def test_tel_parser_yaml_simple(self):
"""
        Read data from a file and, in a single read, pull out all data
        particles. Verify that the results are those we expected.
"""
        log.debug('===== START TEL SIMPLE YAML TEST =====')
in_file = self.open_file(SIMPLE_LOG_FILE)
parser = self.create_tel_parser(in_file)
# In a single read, get all particles in this file.
number_expected_results = NUM_REC_SIMPLE_LOG_FILE
result = parser.get_records(number_expected_results)
self.assert_particles(result, YAML_FILE, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
        log.debug('===== END TEL SIMPLE YAML TEST =====')
def test_tel_parser_yaml_no_decimation(self):
"""
Read data from a log file with no decimation.
Verify that the results are those we expected.
"""
log.debug('===== START NO DEC TEST =====')
in_file = self.open_file(LARGE_LOG_FILE)
parser = self.create_tel_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(500)
self.assert_particles(result, LARGE_YAML_FILE, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
log.debug('===== END NO DEC TEST =====')
def test_tel_parser_yaml_with_decimation(self):
"""
Read data from a log file with a decimation factor.
Verify that the results are those we expected.
"""
log.debug('===== START DEC TEST =====')
in_file = self.open_file(DEC_LOG_FILE)
parser = self.create_tel_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(500)
self.assert_particles(result, DEC_YAML_FILE, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
log.debug('===== END DEC TEST =====')
def test_tel_parser_yaml_decimation_mix(self):
"""
Some records have a decimation factor, others don't.
Verify that the results are those we expected.
"""
log.debug('===== START MIX TEST =====')
in_file = self.open_file(MIX_LOG_FILE)
parser = self.create_tel_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(15)
self.assert_particles(result, MIX_YAML_FILE, RESOURCE_PATH)
in_file.close()
self.assertListEqual(self.exception_callback_value, [])
log.debug('===== END MIX TEST =====')
def create_rec_yml_file(self):
"""
Create a yml file corresponding to an actual recovered dataset. This is not an actual test - it allows
us to create what we need for integration testing, i.e. a yml file.
"""
in_file = self.open_file(REC_LOG_FILE_2)
parser = self.create_rec_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(100000)
self.particle_to_yml(result, REC_JSON_FILE_2)
def create_tel_yml_file(self):
"""
        Create a yml file corresponding to an actual telemetered dataset. This is not an actual test - it allows
us to create what we need for integration testing, i.e. a yml file.
"""
in_file = self.open_file(LARGE_LOG_FILE)
parser = self.create_tel_parser(in_file)
# In a single read, get all particles in this file.
result = parser.get_records(500)
self.particle_to_yml(result, LARGE_YAML_FILE)
def particle_to_yml(self, particles, filename):
"""
        This is added as a testing helper, not actually as part of the parser tests. Since the same particles
        will be used for the driver test, it is helpful to write them to a .yml file in the same form needed
        by the expected-results .yml files here.
"""
        # Open the output file for writing; mode 'w' truncates any existing file.
fid = self.open_file_write(filename)
fid.write('header:\n')
fid.write(" particle_object: 'MULTIPLE'\n")
fid.write(" particle_type: 'MULTIPLE'\n")
fid.write('data:\n')
for i, particle in enumerate(particles):
    particle_dict = particle.generate_dict()
    fid.write(' - _index: %d\n' % (i + 1))
    fid.write(' particle_object: %s\n' % particle.__class__.__name__)
    fid.write(' particle_type: %s\n' % particle_dict.get('stream_name'))
    fid.write(' internal_timestamp: %f\n' % particle_dict.get('internal_timestamp'))
    for val in particle_dict.get('values'):
        value_id = val.get('value_id')
        value = val.get('value')
        if value is None:
            fid.write(' %s: %s\n' % (value_id, '!!null'))
        elif isinstance(value, float):
            fid.write(' %s: %16.12f\n' % (value_id, value))
        elif isinstance(value, str):
            fid.write(" %s: '%s'\n" % (value_id, value))
        else:
            fid.write(' %s: %s\n' % (value_id, value))
fid.close()
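The per-type scalar formatting that `particle_to_yml` applies (None becomes `!!null`, floats get a fixed 12-decimal width, strings are quoted) can be isolated for illustration. This is a sketch; `format_yml_value` is a hypothetical helper name, not part of the parser code:

```python
def format_yml_value(value_id, value, indent=' '):
    """Render one particle value the way particle_to_yml does:
    None -> !!null, floats -> %16.12f, strings quoted, else raw repr."""
    if value is None:
        return "%s%s: !!null\n" % (indent, value_id)
    elif isinstance(value, float):
        return "%s%s: %16.12f\n" % (indent, value_id, value)
    elif isinstance(value, str):
        return "%s%s: '%s'\n" % (indent, value_id, value)
    return "%s%s: %s\n" % (indent, value_id, value)


# One formatted line per value; print without adding a second newline.
print(format_yml_value('temperature', 12.5), end='')
```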
| 56.263736 | 111 | 0.62916 | 16,437 | 102,400 | 3.767415 | 0.036016 | 0.472636 | 0.706338 | 0.938296 | 0.893775 | 0.876189 | 0.832249 | 0.79086 | 0.756254 | 0.724861 | 0 | 0.291687 | 0.215703 | 102,400 | 1,819 | 112 | 56.294667 | 0.479368 | 0.054307 | 0 | 0.474415 | 0 | 0.404971 | 0.409015 | 0.379316 | 0 | 0 | 0 | 0 | 0.017544 | 1 | 0.020468 | false | 0 | 0.00731 | 0 | 0.031433 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
d25d58e054ed86788e90bbd6d8ea1bcb7322aae8 | 24,554 | py | Python | src/ImportObject.py | Felicia35/CS4182-course-project | 4bb957d38751f5e5a33709aa44cc669ab0f946f5 | [
"MIT"
] | null | null | null | src/ImportObject.py | Felicia35/CS4182-course-project | 4bb957d38751f5e5a33709aa44cc669ab0f946f5 | [
"MIT"
] | null | null | null | src/ImportObject.py | Felicia35/CS4182-course-project | 4bb957d38751f5e5a33709aa44cc669ab0f946f5 | [
"MIT"
] | null | null | null | import OpenGL.GL as GL
import OpenGL.GLUT as GLUT
import OpenGL.GLU as GLU
## Avoid conflict with Python open
from PIL.Image import open as imageOpen
## This class is used to create an object from geometry and materials
## saved to a file in WaveFront object format. The object exported
## from Blender must have the normals included.
class ImportedObject:
## Constructor that includes storage for geometry and materials
## for an object.
def __init__(self, fileName, setAmbient=False, verbose=False):
    self.faces = []
    self.verts = []
    self.norms = []
    self.texCoords = []
    self.materials = []
    self.fileName = fileName
    self.setAmbient = setAmbient
self.hasTex = False
## Set this value to False before loading if the model is flat
self.isSmooth = True
self.verbose = verbose
## Load the material properties from the file
def loadMat(self, texturePath = ""):
## Open the material file
with open((self.fileName + ".mtl"), "r") as matFile:
## Load the material properties into tempMat
tempMat = []
for line in matFile:
## Break the line into its components
vals = line.split()
## Make sure there's something in the line (not blank)
if len(vals) > 0 :
## Record that a new material is being applied
if vals[0] == "newmtl":
n = vals[1]
tempMat.append(n)
## Load the specular exponent
elif vals[0] == "Ns":
n = vals[1]
tempMat.append(float(n))
## Load the diffuse values. list() is needed: in Python 3, map()
## returns a one-shot iterator that cannot be indexed later.
elif vals[0] == "Kd":
    n = list(map(float, vals[1:4]))
    tempMat.append(n)
    ## if self.setAmbient is True, ignore the file's ambient values
    ## and load the diffuse values twice to set the ambient
    ## equal to diffuse
    if self.setAmbient:
        tempMat.append(n)
## load the ambient values (if not overridden)
elif vals[0] == "Ka" and not self.setAmbient:
    n = list(map(float, vals[1:4]))
    tempMat.append(n)
## load the specular values
elif vals[0] == "Ks":
    n = list(map(float, vals[1:4]))
    tempMat.append(n)
tempMat.append(None)
## specular is the last line loaded for the material
self.materials.append(tempMat)
tempMat = []
## load texture file info
elif vals[0] == "map_Kd":
## record the texture file name
fileName = vals[1]
self.materials[-1][5] = self.loadTexture(texturePath + fileName)
self.hasTex = True
if self.verbose:
print("Loaded " + self.fileName + \
".mtl with " + str(len(self.materials)) + " materials")
## Load the object geometry.
def loadOBJ(self, texturePath = ""):
## parse the materials file first so we know when to apply materials
## and textures
self.loadMat(texturePath)
numFaces = 0
with open((self.fileName + ".obj"), "r") as objFile:
for line in objFile:
## Break the line into its components
vals = line.split()
if len(vals) > 0:
## Load vertices
if vals[0] == "v":
    v = list(map(float, vals[1:4]))
    self.verts.append(v)
## Load normals
elif vals[0] == "vn":
    n = list(map(float, vals[1:4]))
    self.norms.append(n)
## Load texture coordinates
elif vals[0] == "vt":
    t = list(map(float, vals[1:3]))
    self.texCoords.append(t)
## Load materials. Set index to -1!
elif vals[0] == "usemtl":
m = vals[1]
self.faces.append([-1, m, numFaces])
## Load the faces
elif vals[0] == "f":
tempFace = []
for f in vals[1:]:
## face entries have vertex/tex coord/normal
w = f.split("/")
## Vertex index is required; the texture and normal indices may
## be missing ('v', 'v/vt', or 'v//vn'), so guard both the list
## length and empty fields. OBJ indices are 1-based, hence the -1.
vert = int(w[0]) - 1
tex = int(w[1]) - 1 if len(w) > 1 and w[1] != '' else -1
norm = int(w[2]) - 1 if len(w) > 2 and w[2] != '' else -1
tempFace.append([vert, tex, norm])
self.faces.append(tempFace)
if self.verbose:
print("Loaded " + self.fileName + ".obj with " + \
str(len(self.verts)) + " vertices, " + \
str(len(self.norms)) + " normals, and " + \
str(len(self.faces)) + " faces")
## Draws the object
def drawObject(self):
if self.hasTex:
GL.glEnable(GL.GL_TEXTURE_2D)
## Use GL.GL_MODULATE instead of GL.GL_DECAL to retain lighting
GL.glTexEnvf(GL.GL_TEXTURE_ENV,
GL.GL_TEXTURE_ENV_MODE,
GL.GL_MODULATE)
## *****************************************************************
## Change GL.GL_FRONT to GL.GL_FRONT_AND_BACK if faces are missing
## (or fix the normals in the model so they point in the correct
## direction)
## *****************************************************************
GL.glPolygonMode(GL.GL_FRONT, GL.GL_FILL)
for face in self.faces:
## Check if a material
if face[0] == -1:
self.setModelColor(face[1])
else:
GL.glBegin(GL.GL_POLYGON)
## drawing normal, then texture, then vertex coords.
for f in face:
if f[2] != -1:
GL.glNormal3f(self.norms[f[2]][0],
self.norms[f[2]][1],
self.norms[f[2]][2])
if f[1] != -1:
GL.glTexCoord2f(self.texCoords[f[1]][0],
self.texCoords[f[1]][1])
GL.glVertex3f(self.verts[f[0]][0],
self.verts[f[0]][1],
self.verts[f[0]][2])
GL.glEnd()
## Turn off texturing (global state variable again)
GL.glDisable(GL.GL_TEXTURE_2D)
## Finds the matching material properties and sets them.
def setModelColor(self, material):
mat = None
for tempMat in self.materials:
    if tempMat[0] == material:
        mat = tempMat
        ## found it, break out.
        break
## If the material name was not found, bail out instead of
## indexing into an empty list below.
if mat is None:
    return
## Set the color for the case when lighting is turned off. Using
## the diffuse color, since the diffuse component best describes
## the object color.
GL.glColor3f(mat[3][0], mat[3][1], mat[3][2])
## Set the model to smooth or flat depending on the attribute setting
if self.isSmooth:
GL.glShadeModel(GL.GL_SMOOTH)
else:
GL.glShadeModel(GL.GL_FLAT)
## The RGBA values for the specular light intensity. The alpha value
## (1.0) is ignored unless blending is enabled.
mat_specular = [mat[4][0], mat[4][1], mat[4][2], 1.0]
## The RGBA values for the diffuse light intensity. The alpha value
## (1.0) is ignored unless blending is enabled.
mat_diffuse = [mat[3][0], mat[3][1], mat[3][2], 1.0]
## The RGBA values for the ambient light intensity. The alpha value
## (1.0) is ignored unless blending is enabled.
mat_ambient = [mat[2][0], mat[2][1], mat[2][2], 1.0]
## The value for the specular exponent. The higher the value, the
## "tighter" the specular highlight. Valid values are [0.0, 128.0]
mat_shininess = 0.128*mat[1]
## Set the material specular values for the polygon front faces.
GL.glMaterialfv(GL.GL_FRONT, GL.GL_SPECULAR, mat_specular)
## Set the material shininess value for the polygon front faces.
## GL_SHININESS takes a single float, so glMaterialf is used here.
GL.glMaterialf(GL.GL_FRONT, GL.GL_SHININESS, mat_shininess)
## Set the material diffuse values for the polygon front faces.
GL.glMaterialfv(GL.GL_FRONT, GL.GL_DIFFUSE, mat_diffuse)
## Set the material ambient values for the polygon front faces.
GL.glMaterialfv(GL.GL_FRONT, GL.GL_AMBIENT, mat_ambient)
## See if there is a texture and bind it if it's there
if mat[5] is not None:
GL.glBindTexture(GL.GL_TEXTURE_2D, mat[5])
## Load a texture from the provided image file name
def loadTexture(self, texFile):
if self.verbose:
print("Loading " + texFile)
## Open the image file
texImage = imageOpen(texFile)
try:
ix, iy, image = texImage.size[0], \
texImage.size[1], \
texImage.tobytes("raw", "RGBX", 0, -1)
except SystemError:
ix, iy, image = texImage.size[0], \
texImage.size[1], \
texImage.tobytes("raw", "RGBA", 0, -1)
## GL.glGenTextures() and GL.glBindTexture() name and create a texture
## object for a texture image
tempID = GL.glGenTextures(1)
GL.glBindTexture(GL.GL_TEXTURE_2D, tempID)
## The four calls to GL.glTexParameter*() specify how the texture is to
## be wrapped and how the colors are to be filtered if there isn't an
## exact match between pixels in the texture and pixels on the screen
## Values for GL.GL_TEXTURE_WRAP_S and GL.GL_TEXTURE_WRAP_T are
## GL.GL_REPEAT and GL.GL_CLAMP
GL.glTexParameteri(GL.GL_TEXTURE_2D,
GL.GL_TEXTURE_WRAP_S,
GL.GL_REPEAT)
GL.glTexParameteri(GL.GL_TEXTURE_2D,
GL.GL_TEXTURE_WRAP_T,
GL.GL_REPEAT)
## The MAG_FILTER has values of GL.GL_NEAREST and GL.GL_LINEAR. There
## are many choices for values for the MIN_FILTER. GL.GL_NEAREST has
## more pixelation, but is the fastest
GL.glTexParameteri(GL.GL_TEXTURE_2D,
GL.GL_TEXTURE_MAG_FILTER,
GL.GL_NEAREST)
GL.glTexParameteri(GL.GL_TEXTURE_2D,
GL.GL_TEXTURE_MIN_FILTER,
GL.GL_NEAREST)
## Store the pixel data
GL.glPixelStorei(GL.GL_UNPACK_ALIGNMENT,1)
GL.glTexImage2D(GL.GL_TEXTURE_2D, 0, 3, ix, iy, 0,
GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, image)
return tempID
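The face-token handling in `loadOBJ` can be exercised standalone. This sketch (with a hypothetical helper name) parses one OBJ face entry of the forms `v`, `v/vt`, `v//vn`, or `v/vt/vn` into zero-based indices, using `-1` to mark a missing texture or normal index, as the loader's face list expects:

```python
def parse_face_entry(entry):
    """Parse one OBJ face token into zero-based [vertex, texcoord, normal]
    indices; -1 marks a missing part. OBJ files index from 1, hence the -1.
    (Hypothetical standalone version of the parsing done in loadOBJ.)"""
    w = entry.split("/")
    vert = int(w[0]) - 1
    tex = int(w[1]) - 1 if len(w) > 1 and w[1] != "" else -1
    norm = int(w[2]) - 1 if len(w) > 2 and w[2] != "" else -1
    return [vert, tex, norm]


# A face line such as "f 1/1/1 2//3 4" yields one triplet per token.
print([parse_face_entry(tok) for tok in "1/1/1 2//3 4".split()])
```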
| 47.49323 | 111 | 0.465097 | 2,710 | 24,554 | 4.147601 | 0.113284 | 0.033452 | 0.033274 | 0.020819 | 0.996085 | 0.996085 | 0.996085 | 0.996085 | 0.996085 | 0.996085 | 0 | 0.023664 | 0.432068 | 24,554 | 516 | 112 | 47.585271 | 0.782359 | 0.283905 | 0 | 0.975904 | 0 | 0 | 0.016209 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036145 | false | 0 | 0.03012 | 0 | 0.078313 | 0.018072 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
963b32e771ffe1f90a5d892fca7c7340e79a35f7 | 355 | py | Python | tests/internal/instance_type/test_instance_type_c5a_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/instance_type/test_instance_type_c5a_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/instance_type/test_instance_type_c5a_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | 1 | 2021-12-15T11:58:22.000Z | 2021-12-15T11:58:22.000Z |
# Testing module instance_type.c5a
import pytest
import ec2_compare.internal.instance_type.c5a
def test_get_internal_data_instance_type_c5a_get_instances_list():
assert len(ec2_compare.internal.instance_type.c5a.get_instances_list()) > 0
def test_get_internal_data_instance_type_c5a_get():
assert len(ec2_compare.internal.instance_type.c5a.get) > 0
| 35.5 | 77 | 0.850704 | 56 | 355 | 4.946429 | 0.339286 | 0.259928 | 0.32491 | 0.259928 | 0.826715 | 0.826715 | 0.613718 | 0.613718 | 0.613718 | 0 | 0 | 0.033435 | 0.073239 | 355 | 9 | 78 | 39.444444 | 0.808511 | 0.090141 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
9641353039f8c42a105ee28af77495d02847a447 | 133 | py | Python | WeChatSecretary/utils/wechat_wiki.py | TitusWongCN/WeChatRelativeTools | 9307bae8c15e47b5bbf169a95a50be0d107a5bb1 | [
"MIT"
] | null | null | null | WeChatSecretary/utils/wechat_wiki.py | TitusWongCN/WeChatRelativeTools | 9307bae8c15e47b5bbf169a95a50be0d107a5bb1 | [
"MIT"
] | null | null | null | WeChatSecretary/utils/wechat_wiki.py | TitusWongCN/WeChatRelativeTools | 9307bae8c15e47b5bbf169a95a50be0d107a5bb1 | [
"MIT"
] | null | null | null | # -*- coding=utf-8 -*-
# python37
from utils.weather.main import weather
def get_weather(city):
return weather.get_weather(city) | 22.166667 | 38 | 0.729323 | 19 | 133 | 5 | 0.684211 | 0.210526 | 0.294737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026087 | 0.135338 | 133 | 6 | 39 | 22.166667 | 0.8 | 0.218045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
9669906bc01be0786c877bec55fce1355928eaaa | 1,239 | py | Python | tests/data/python37.py | rbenton/black | 2bae41f92ed125f687e0ddef3a5913cda755a64f | [
"MIT"
] | null | null | null | tests/data/python37.py | rbenton/black | 2bae41f92ed125f687e0ddef3a5913cda755a64f | [
"MIT"
] | null | null | null | tests/data/python37.py | rbenton/black | 2bae41f92ed125f687e0ddef3a5913cda755a64f | [
"MIT"
] | null | null | null | #!/usr/bin/env python3.7
def f():
return ( i * 2 async for i in arange( 42 ) )
def g():
return (
something_long * something_long
async for something_long in async_generator( with_an_argument )
)
async def func():
if test:
out_batched = [
i
async for i in aitertools._async_map(
self.async_inc, arange( 8 ), batch_size=3
)
]
def awaited_generator_value( n ):
return ( await awaitable for awaitable in awaitable_list )
def make_arange( n ):
return ( i * 2 for i in range( n ) if await wrap( i ) )
# output
#!/usr/bin/env python3.7
def f():
return ( i * 2 async for i in arange( 42 ) )
def g():
return (
something_long * something_long
async for something_long in async_generator( with_an_argument )
)
async def func():
if test:
out_batched = [
i
async for i in aitertools._async_map(
self.async_inc, arange( 8 ), batch_size=3
)
]
def awaited_generator_value( n ):
return ( await awaitable for awaitable in awaitable_list )
def make_arange( n ):
return ( i * 2 for i in range( n ) if await wrap( i ) )
| 18.772727 | 71 | 0.577885 | 171 | 1,239 | 4.011696 | 0.263158 | 0.069971 | 0.052478 | 0.06414 | 0.991254 | 0.991254 | 0.991254 | 0.991254 | 0.991254 | 0.991254 | 0 | 0.019441 | 0.335755 | 1,239 | 65 | 72 | 19.061538 | 0.814095 | 0.042776 | 0 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0 | 0.210526 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 |
9673d0f710278c0944917046bbb3992119b7e09c | 70 | py | Python | proyecto/flask/Lib/collections/abc.py | grupoprog3/proyecto_final | 56fa4d33852a347476e721bf02bb3bc53a7b7a70 | [
"Apache-2.0"
] | 207 | 2018-10-01T08:53:01.000Z | 2022-03-14T12:15:54.000Z | Thonny/Lib/collections/abc.py | Pydiderot/pydiderotIDE | a42fcde3ea837ae40c957469f5d87427e8ce46d3 | [
"MIT"
] | 30 | 2019-01-04T10:14:56.000Z | 2020-10-12T14:00:31.000Z | Thonny/Lib/collections/abc.py | Pydiderot/pydiderotIDE | a42fcde3ea837ae40c957469f5d87427e8ce46d3 | [
"MIT"
] | 76 | 2020-03-16T01:47:46.000Z | 2022-03-21T16:37:07.000Z | from _collections_abc import *
from _collections_abc import __all__
| 23.333333 | 37 | 0.842857 | 9 | 70 | 5.666667 | 0.555556 | 0.588235 | 0.705882 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 70 | 2 | 38 | 35 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
738736c542e2220cb2383879fc080b45248a7688 | 3,163 | py | Python | test_suites/calculator_suite_auto.py | AntoData/appium-framework | 936c264971a0755fc815b273961e3deb0f516f6b | [
"MIT"
] | null | null | null | test_suites/calculator_suite_auto.py | AntoData/appium-framework | 936c264971a0755fc815b273961e3deb0f516f6b | [
"MIT"
] | null | null | null | test_suites/calculator_suite_auto.py | AntoData/appium-framework | 936c264971a0755fc815b273961e3deb0f516f6b | [
"MIT"
] | null | null | null | import time
import unittest
import selenium
from app_window_objects import calculatormainapp as cmp
"""
To get to know how to use this project: https://github.com/AntoData/appium-framework/wiki
"""
class CalculatorTestSuite(unittest.TestCase):
def tearDown(self):
self.calculator.destroy()
time.sleep(60)
def setUp(self) -> None:
self.calculator = cmp.CalculatorMainApp()
def test_add_two_numbers(self):
n: str = str(8)
m: str = str(3)
try:
self.calculator.click_on_arrow()
except selenium.common.exceptions.NoSuchElementException:
pass
self.calculator.click_on_number(n)
self.calculator.click_on_add()
self.calculator.click_on_number(m)
self.calculator.click_on_equal()
time.sleep(5)
result = self.calculator.get_operation_int_result()
print("{0}+{1}={2}".format(n, m, result))
self.assertEqual((int(n) + int(m)), result)
def test_subtract_two_numbers(self):
n: str = str(8)
m: str = str(3)
try:
self.calculator.click_on_arrow()
except selenium.common.exceptions.NoSuchElementException:
pass
self.calculator.click_on_number(n)
self.calculator.click_on_minus()
self.calculator.click_on_number(m)
self.calculator.click_on_equal()
time.sleep(5)
result = self.calculator.get_operation_int_result()
print("{0}-{1}={2}".format(n, m, result))
self.assertEqual((int(n) - int(m)), result)
def test_multiply_two_numbers(self):
n: str = str(8)
m: str = str(5)
try:
self.calculator.click_on_arrow()
except selenium.common.exceptions.NoSuchElementException:
pass
self.calculator.click_on_number(n)
self.calculator.click_on_multiply()
self.calculator.click_on_number(m)
self.calculator.click_on_equal()
time.sleep(5)
result = self.calculator.get_operation_int_result()
print("{0}*{1}={2}".format(n, m, result))
self.assertEqual((int(n) * int(m)), result)
def test_divide_two_numbers(self):
n: str = str(9)
m: str = str(3)
try:
self.calculator.click_on_arrow()
except selenium.common.exceptions.NoSuchElementException:
pass
self.calculator.click_on_number(n)
self.calculator.click_on_divide()
self.calculator.click_on_number(m)
self.calculator.click_on_equal()
time.sleep(5)
result = self.calculator.get_operation_float_result()
print("{0}-{1}={2}".format(n, m, result))
self.assertEqual(int(n) / (int(m)), result)
def test_delete_number(self):
try:
self.calculator.click_on_arrow()
except selenium.common.exceptions.NoSuchElementException:
pass
self.calculator.click_on_number("9")
self.calculator.click_on_delete()
time.sleep(5)
result = self.calculator.get_preview_result()
self.assertEqual("", result)
if __name__ == '__main__':
unittest.main()
| 32.27551 | 89 | 0.628201 | 388 | 3,163 | 4.904639 | 0.201031 | 0.220704 | 0.229637 | 0.25381 | 0.760904 | 0.760904 | 0.749869 | 0.732528 | 0.732528 | 0.732528 | 0 | 0.011874 | 0.254505 | 3,163 | 97 | 90 | 32.608247 | 0.795165 | 0 | 0 | 0.585366 | 0 | 0 | 0.017286 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 1 | 0.085366 | false | 0.060976 | 0.04878 | 0 | 0.146341 | 0.04878 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
73ae6bd5f1d664e9df757e025b4f0ee608b356a8 | 4,415 | py | Python | Chapter 06/6_3_Errors_with_Respect_to_Python.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | Chapter 06/6_3_Errors_with_Respect_to_Python.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | Chapter 06/6_3_Errors_with_Respect_to_Python.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | x = int(input("Enter first number : "))
y = int(input("enter second number : "))
try:
divide = x/y
print("try block over")
print("divide = ",divide)
except ZeroDivisionError:
print("OOPS exception occured due to Zero division error")
print("****THE END****")
####################################
#Example 6.4
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
except ZeroDivisionError:
print("OOPS exception occurred due to Zero division error")
print("****THE END****")
##############################################
a = int(input(“Enter first number : “))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
except (ZeroDivisionError,TypeError) :
print("OOPS exception occurred due to Exception")
print("****THE END****")
###################################################
a = int(input(“Enter first number : “))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
except (ZeroDivisionError,TypeError) :
print("OOPS exception occurred due to Exception")
print("****THE END****")
################################################
#6.3.2 Catching Exception in general
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
except TypeError as e:
print(str(e))
####################################################
#Example 6.5
import sys
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
#Access more information with sysexc_info
except:
print("OOPS exception occurred due to ",sys.exc_info()[0])
print("****THE END****")
###########################################################
#Example 6.6
import sys
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
#Access more information with sysexc_info
except:
print("OOPS exception occurred due to ",sys.exc_info()[1])
print("****THE END****")
##################33
#Example 6.7
import sys
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
#Access more information with sysexc_info
except:
print("OOPS exception occurred due to ",sys.exc_info()[2])
print("****THE END****")
###########################33
#Exaple 6.8
import sys
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("try block over")
print("divide = ",divide)
#Access more information with sysexc_info
except:
print("OOPS exception occurred due to line no.",sys.exc_info()[2].tb_lineno)
print("****THE END****")
##############################################3
#16.3.3 Try … Except…else Statement
#Case 1 with exception, else block not executed
import sys
a = int(input("Enter first number : "))
b = input("enter second number : ")
try:
divide = a/b
print("divide = ",divide)
except:
print("OOPS exception occurred due to line no.",sys.exc_info()[2].tb_lineno)
else:
print("try over successfully")
print("****THE END****")
#Case2 With no exception, else block executed
import sys
a = int(input("Enter first number : "))
b = int(input("enter second number : "))
try:
divide = a/b
print("divide = ",divide)
except:
print("OOPS exception occurred due to line no.",sys.exc_info()[2].tb_lineno)
else:
print("try over successfully")
print("****THE END****")
###########################################
#6.3.4 Try…Except…Finally…..
def just_having_fun():
try:
print(4)
except TypeError as e:
print(str(e))
finally:
print("Ok Bye")
just_having_fun()
#######################################
try:
    voter_age = int(input("Enter the age of the voter : "))
    if voter_age < 18:
        raise ValueError
    else:
        print("You can vote")
except ValueError:
    print("You are not allowed to vote")
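`raise` can also carry a message that the handler reads back. `check_voter` below is a hypothetical wrapper of my own around the same idea:

```python
def check_voter(age):
    # hypothetical helper: raise with an argument instead of a bare ValueError
    if age < 18:
        raise ValueError("age %d is below the voting age" % age)
    return "You can vote"

try:
    print(check_voter(25))   # You can vote
    check_voter(15)
except ValueError as err:
    print("You are not allowed to vote:", err)
```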
| 28.668831 | 81 | 0.549264 | 549 | 4,415 | 4.409836 | 0.160291 | 0.095002 | 0.075176 | 0.081784 | 0.831475 | 0.808757 | 0.805452 | 0.770343 | 0.747625 | 0.747625 | 0 | 0.010315 | 0.209513 | 4,415 | 153 | 82 | 28.856209 | 0.679083 | 0.091733 | 0 | 0.836066 | 0 | 0 | 0.388099 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.04918 | null | null | 0.393443 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
73afc0991f63d23a3410e3235bcfe4169eb80e35 | 60,548 | py | Python | tests/test_coilwaterheatingairtowaterheatpumpvariablespeed.py | marcelosalles/pyidf | c2f744211572b5e14e29522aac1421ba88addb0e | [
"Apache-2.0"
] | 19 | 2015-12-08T23:33:51.000Z | 2022-01-31T04:41:10.000Z | tests/test_coilwaterheatingairtowaterheatpumpvariablespeed.py | marcelosalles/pyidf | c2f744211572b5e14e29522aac1421ba88addb0e | [
"Apache-2.0"
] | 2 | 2019-10-04T10:57:00.000Z | 2021-10-01T06:46:17.000Z | tests/test_coilwaterheatingairtowaterheatpumpvariablespeed.py | marcelosalles/pyidf | c2f744211572b5e14e29522aac1421ba88addb0e | [
"Apache-2.0"
] | 7 | 2015-11-04T02:25:01.000Z | 2021-12-08T03:14:28.000Z | import os
import tempfile
import unittest
import logging

from pyidf import ValidationLevel
import pyidf
from pyidf.idf import IDF
from pyidf.coils import CoilWaterHeatingAirToWaterHeatPumpVariableSpeed

log = logging.getLogger(__name__)


class TestCoilWaterHeatingAirToWaterHeatPumpVariableSpeed(unittest.TestCase):

    def setUp(self):
        self.fd, self.path = tempfile.mkstemp()

    def tearDown(self):
        os.remove(self.path)

    def test_create_coilwaterheatingairtowaterheatpumpvariablespeed(self):
        pyidf.validation_level = ValidationLevel.error

        obj = CoilWaterHeatingAirToWaterHeatPumpVariableSpeed()
        # alpha
        var_name = "Name"
        obj.name = var_name
        # integer
        var_number_of_speeds = 5
        obj.number_of_speeds = var_number_of_speeds
        # integer
        var_nominal_speed_level = 3
        obj.nominal_speed_level = var_nominal_speed_level
        # real
        var_rated_water_heating_capacity = 0.0001
        obj.rated_water_heating_capacity = var_rated_water_heating_capacity
        # real
        var_rated_evaporator_inlet_air_drybulb_temperature = 5.0001
        obj.rated_evaporator_inlet_air_drybulb_temperature = var_rated_evaporator_inlet_air_drybulb_temperature
        # real
        var_rated_evaporator_inlet_air_wetbulb_temperature = 5.0001
        obj.rated_evaporator_inlet_air_wetbulb_temperature = var_rated_evaporator_inlet_air_wetbulb_temperature
        # real
        var_rated_condenser_inlet_water_temperature = 25.0001
        obj.rated_condenser_inlet_water_temperature = var_rated_condenser_inlet_water_temperature
        # real
        var_rated_evaporator_air_flow_rate = 0.0001
        obj.rated_evaporator_air_flow_rate = var_rated_evaporator_air_flow_rate
        # real
        var_rated_condenser_water_flow_rate = 0.0001
        obj.rated_condenser_water_flow_rate = var_rated_condenser_water_flow_rate
        # alpha
        var_evaporator_fan_power_included_in_rated_cop = "Yes"
        obj.evaporator_fan_power_included_in_rated_cop = var_evaporator_fan_power_included_in_rated_cop
        # alpha
        var_condenser_pump_power_included_in_rated_cop = "Yes"
        obj.condenser_pump_power_included_in_rated_cop = var_condenser_pump_power_included_in_rated_cop
        # alpha
        var_condenser_pump_heat_included_in_rated_heating_capacity_and_rated_cop = "Yes"
        obj.condenser_pump_heat_included_in_rated_heating_capacity_and_rated_cop = var_condenser_pump_heat_included_in_rated_heating_capacity_and_rated_cop
        # real
        var_fraction_of_condenser_pump_heat_to_water = 0.5
        obj.fraction_of_condenser_pump_heat_to_water = var_fraction_of_condenser_pump_heat_to_water
        # node
        var_evaporator_air_inlet_node_name = "node|Evaporator Air Inlet Node Name"
        obj.evaporator_air_inlet_node_name = var_evaporator_air_inlet_node_name
        # node
        var_evaporator_air_outlet_node_name = "node|Evaporator Air Outlet Node Name"
        obj.evaporator_air_outlet_node_name = var_evaporator_air_outlet_node_name
        # node
        var_condenser_water_inlet_node_name = "node|Condenser Water Inlet Node Name"
        obj.condenser_water_inlet_node_name = var_condenser_water_inlet_node_name
        # node
        var_condenser_water_outlet_node_name = "node|Condenser Water Outlet Node Name"
        obj.condenser_water_outlet_node_name = var_condenser_water_outlet_node_name
        # real
        var_crankcase_heater_capacity = 0.0
        obj.crankcase_heater_capacity = var_crankcase_heater_capacity
        # real
        var_maximum_ambient_temperature_for_crankcase_heater_operation = 0.0
        obj.maximum_ambient_temperature_for_crankcase_heater_operation = var_maximum_ambient_temperature_for_crankcase_heater_operation
        # alpha
        var_evaporator_air_temperature_type_for_curve_objects = "DryBulbTemperature"
        obj.evaporator_air_temperature_type_for_curve_objects = var_evaporator_air_temperature_type_for_curve_objects
        # object-list
        var_part_load_fraction_correlation_curve_name = "object-list|Part Load Fraction Correlation Curve Name"
        obj.part_load_fraction_correlation_curve_name = var_part_load_fraction_correlation_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_1 = 0.0001
        obj.rated_water_heating_capacity_at_speed_1 = var_rated_water_heating_capacity_at_speed_1
        # real
        var_rated_water_heating_cop_at_speed_1 = 0.0001
        obj.rated_water_heating_cop_at_speed_1 = var_rated_water_heating_cop_at_speed_1
        # real
        var_rated_sensible_heat_ratio_at_speed_1 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_1 = var_rated_sensible_heat_ratio_at_speed_1
        # real
        var_speed_1_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_1_reference_unit_rated_air_flow_rate = var_speed_1_reference_unit_rated_air_flow_rate
        # real
        var_speed_1_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_1_reference_unit_rated_water_flow_rate = var_speed_1_reference_unit_rated_water_flow_rate
        # real
        var_speed_1_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_1_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_1_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_1_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 1 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_1_total_wh_capacity_function_of_temperature_curve_name = var_speed_1_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_1_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 1 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_1_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_1_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_1_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 1 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_1_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_1_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_1_cop_function_of_temperature_curve_name = "object-list|Speed 1 COP Function of Temperature Curve Name"
        obj.speed_1_cop_function_of_temperature_curve_name = var_speed_1_cop_function_of_temperature_curve_name
        # object-list
        var_speed_1_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 1 COP Function of Air Flow Fraction Curve Name"
        obj.speed_1_cop_function_of_air_flow_fraction_curve_name = var_speed_1_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_1_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 1 COP Function of Water Flow Fraction Curve Name"
        obj.speed_1_cop_function_of_water_flow_fraction_curve_name = var_speed_1_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_2 = 0.0001
        obj.rated_water_heating_capacity_at_speed_2 = var_rated_water_heating_capacity_at_speed_2
        # real
        var_rated_water_heating_cop_at_speed_2 = 0.0001
        obj.rated_water_heating_cop_at_speed_2 = var_rated_water_heating_cop_at_speed_2
        # real
        var_rated_sensible_heat_ratio_at_speed_2 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_2 = var_rated_sensible_heat_ratio_at_speed_2
        # real
        var_speed_2_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_2_reference_unit_rated_air_flow_rate = var_speed_2_reference_unit_rated_air_flow_rate
        # real
        var_speed_2_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_2_reference_unit_rated_water_flow_rate = var_speed_2_reference_unit_rated_water_flow_rate
        # real
        var_speed_2_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_2_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_2_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_2_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 2 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_2_total_wh_capacity_function_of_temperature_curve_name = var_speed_2_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_2_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 2 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_2_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_2_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_2_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 2 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_2_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_2_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_2_cop_function_of_temperature_curve_name = "object-list|Speed 2 COP Function of Temperature Curve Name"
        obj.speed_2_cop_function_of_temperature_curve_name = var_speed_2_cop_function_of_temperature_curve_name
        # object-list
        var_speed_2_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 2 COP Function of Air Flow Fraction Curve Name"
        obj.speed_2_cop_function_of_air_flow_fraction_curve_name = var_speed_2_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_2_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 2 COP Function of Water Flow Fraction Curve Name"
        obj.speed_2_cop_function_of_water_flow_fraction_curve_name = var_speed_2_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_3 = 0.0001
        obj.rated_water_heating_capacity_at_speed_3 = var_rated_water_heating_capacity_at_speed_3
        # real
        var_rated_water_heating_cop_at_speed_3 = 0.0001
        obj.rated_water_heating_cop_at_speed_3 = var_rated_water_heating_cop_at_speed_3
        # real
        var_rated_sensible_heat_ratio_at_speed_3 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_3 = var_rated_sensible_heat_ratio_at_speed_3
        # real
        var_speed_3_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_3_reference_unit_rated_air_flow_rate = var_speed_3_reference_unit_rated_air_flow_rate
        # real
        var_speed_3_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_3_reference_unit_rated_water_flow_rate = var_speed_3_reference_unit_rated_water_flow_rate
        # real
        var_speed_3_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_3_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_3_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_3_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 3 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_3_total_wh_capacity_function_of_temperature_curve_name = var_speed_3_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_3_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 3 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_3_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_3_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_3_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 3 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_3_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_3_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_3_cop_function_of_temperature_curve_name = "object-list|Speed 3 COP Function of Temperature Curve Name"
        obj.speed_3_cop_function_of_temperature_curve_name = var_speed_3_cop_function_of_temperature_curve_name
        # object-list
        var_speed_3_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 3 COP Function of Air Flow Fraction Curve Name"
        obj.speed_3_cop_function_of_air_flow_fraction_curve_name = var_speed_3_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_3_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 3 COP Function of Water Flow Fraction Curve Name"
        obj.speed_3_cop_function_of_water_flow_fraction_curve_name = var_speed_3_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_4 = 0.0001
        obj.rated_water_heating_capacity_at_speed_4 = var_rated_water_heating_capacity_at_speed_4
        # real
        var_rated_water_heating_cop_at_speed_4 = 0.0001
        obj.rated_water_heating_cop_at_speed_4 = var_rated_water_heating_cop_at_speed_4
        # real
        var_rated_sensible_heat_ratio_at_speed_4 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_4 = var_rated_sensible_heat_ratio_at_speed_4
        # real
        var_speed_4_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_4_reference_unit_rated_air_flow_rate = var_speed_4_reference_unit_rated_air_flow_rate
        # real
        var_speed_4_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_4_reference_unit_rated_water_flow_rate = var_speed_4_reference_unit_rated_water_flow_rate
        # real
        var_speed_4_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_4_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_4_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_4_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 4 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_4_total_wh_capacity_function_of_temperature_curve_name = var_speed_4_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_4_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 4 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_4_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_4_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_4_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 4 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_4_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_4_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_4_cop_function_of_temperature_curve_name = "object-list|Speed 4 COP Function of Temperature Curve Name"
        obj.speed_4_cop_function_of_temperature_curve_name = var_speed_4_cop_function_of_temperature_curve_name
        # object-list
        var_speed_4_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 4 COP Function of Air Flow Fraction Curve Name"
        obj.speed_4_cop_function_of_air_flow_fraction_curve_name = var_speed_4_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_4_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 4 COP Function of Water Flow Fraction Curve Name"
        obj.speed_4_cop_function_of_water_flow_fraction_curve_name = var_speed_4_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_5 = 0.0001
        obj.rated_water_heating_capacity_at_speed_5 = var_rated_water_heating_capacity_at_speed_5
        # real
        var_rated_water_heating_cop_at_speed_5 = 0.0001
        obj.rated_water_heating_cop_at_speed_5 = var_rated_water_heating_cop_at_speed_5
        # real
        var_rated_sensible_heat_ratio_at_speed_5 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_5 = var_rated_sensible_heat_ratio_at_speed_5
        # real
        var_speed_5_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_5_reference_unit_rated_air_flow_rate = var_speed_5_reference_unit_rated_air_flow_rate
        # real
        var_speed_5_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_5_reference_unit_rated_water_flow_rate = var_speed_5_reference_unit_rated_water_flow_rate
        # real
        var_speed_5_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_5_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_5_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_5_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 5 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_5_total_wh_capacity_function_of_temperature_curve_name = var_speed_5_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_5_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 5 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_5_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_5_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_5_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 5 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_5_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_5_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_5_cop_function_of_temperature_curve_name = "object-list|Speed 5 COP Function of Temperature Curve Name"
        obj.speed_5_cop_function_of_temperature_curve_name = var_speed_5_cop_function_of_temperature_curve_name
        # object-list
        var_speed_5_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 5 COP Function of Air Flow Fraction Curve Name"
        obj.speed_5_cop_function_of_air_flow_fraction_curve_name = var_speed_5_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_5_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 5 COP Function of Water Flow Fraction Curve Name"
        obj.speed_5_cop_function_of_water_flow_fraction_curve_name = var_speed_5_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_6 = 0.0001
        obj.rated_water_heating_capacity_at_speed_6 = var_rated_water_heating_capacity_at_speed_6
        # real
        var_rated_water_heating_cop_at_speed_6 = 0.0001
        obj.rated_water_heating_cop_at_speed_6 = var_rated_water_heating_cop_at_speed_6
        # real
        var_rated_sensible_heat_ratio_at_speed_6 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_6 = var_rated_sensible_heat_ratio_at_speed_6
        # real
        var_speed_6_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_6_reference_unit_rated_air_flow_rate = var_speed_6_reference_unit_rated_air_flow_rate
        # real
        var_speed_6_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_6_reference_unit_rated_water_flow_rate = var_speed_6_reference_unit_rated_water_flow_rate
        # real
        var_speed_6_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_6_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_6_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_6_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 6 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_6_total_wh_capacity_function_of_temperature_curve_name = var_speed_6_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_6_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 6 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_6_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_6_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_6_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 6 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_6_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_6_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_6_cop_function_of_temperature_curve_name = "object-list|Speed 6 COP Function of Temperature Curve Name"
        obj.speed_6_cop_function_of_temperature_curve_name = var_speed_6_cop_function_of_temperature_curve_name
        # object-list
        var_speed_6_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 6 COP Function of Air Flow Fraction Curve Name"
        obj.speed_6_cop_function_of_air_flow_fraction_curve_name = var_speed_6_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_6_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 6 COP Function of Water Flow Fraction Curve Name"
        obj.speed_6_cop_function_of_water_flow_fraction_curve_name = var_speed_6_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_7 = 0.0001
        obj.rated_water_heating_capacity_at_speed_7 = var_rated_water_heating_capacity_at_speed_7
        # real
        var_rated_water_heating_cop_at_speed_7 = 0.0001
        obj.rated_water_heating_cop_at_speed_7 = var_rated_water_heating_cop_at_speed_7
        # real
        var_rated_sensible_heat_ratio_at_speed_7 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_7 = var_rated_sensible_heat_ratio_at_speed_7
        # real
        var_speed_7_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_7_reference_unit_rated_air_flow_rate = var_speed_7_reference_unit_rated_air_flow_rate
        # real
        var_speed_7_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_7_reference_unit_rated_water_flow_rate = var_speed_7_reference_unit_rated_water_flow_rate
        # real
        var_speed_7_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_7_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_7_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_7_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 7 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_7_total_wh_capacity_function_of_temperature_curve_name = var_speed_7_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_7_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 7 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_7_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_7_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_7_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 7 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_7_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_7_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_7_cop_function_of_temperature_curve_name = "object-list|Speed 7 COP Function of Temperature Curve Name"
        obj.speed_7_cop_function_of_temperature_curve_name = var_speed_7_cop_function_of_temperature_curve_name
        # object-list
        var_speed_7_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 7 COP Function of Air Flow Fraction Curve Name"
        obj.speed_7_cop_function_of_air_flow_fraction_curve_name = var_speed_7_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_7_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 7 COP Function of Water Flow Fraction Curve Name"
        obj.speed_7_cop_function_of_water_flow_fraction_curve_name = var_speed_7_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_8 = 0.0001
        obj.rated_water_heating_capacity_at_speed_8 = var_rated_water_heating_capacity_at_speed_8
        # real
        var_rated_water_heating_cop_at_speed_8 = 0.0001
        obj.rated_water_heating_cop_at_speed_8 = var_rated_water_heating_cop_at_speed_8
        # real
        var_rated_sensible_heat_ratio_at_speed_8 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_8 = var_rated_sensible_heat_ratio_at_speed_8
        # real
        var_speed_8_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_8_reference_unit_rated_air_flow_rate = var_speed_8_reference_unit_rated_air_flow_rate
        # real
        var_speed_8_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_8_reference_unit_rated_water_flow_rate = var_speed_8_reference_unit_rated_water_flow_rate
        # real
        var_speed_8_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_8_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_8_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_8_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 8 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_8_total_wh_capacity_function_of_temperature_curve_name = var_speed_8_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_8_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 8 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_8_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_8_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_8_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 8 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_8_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_8_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_8_cop_function_of_temperature_curve_name = "object-list|Speed 8 COP Function of Temperature Curve Name"
        obj.speed_8_cop_function_of_temperature_curve_name = var_speed_8_cop_function_of_temperature_curve_name
        # object-list
        var_speed_8_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 8 COP Function of Air Flow Fraction Curve Name"
        obj.speed_8_cop_function_of_air_flow_fraction_curve_name = var_speed_8_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_8_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 8 COP Function of Water Flow Fraction Curve Name"
        obj.speed_8_cop_function_of_water_flow_fraction_curve_name = var_speed_8_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_9 = 0.0001
        obj.rated_water_heating_capacity_at_speed_9 = var_rated_water_heating_capacity_at_speed_9
        # real
        var_rated_water_heating_cop_at_speed_9 = 0.0001
        obj.rated_water_heating_cop_at_speed_9 = var_rated_water_heating_cop_at_speed_9
        # real
        var_rated_sensible_heat_ratio_at_speed_9 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_9 = var_rated_sensible_heat_ratio_at_speed_9
        # real
        var_speed_9_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_9_reference_unit_rated_air_flow_rate = var_speed_9_reference_unit_rated_air_flow_rate
        # real
        var_speed_9_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_9_reference_unit_rated_water_flow_rate = var_speed_9_reference_unit_rated_water_flow_rate
        # real
        var_speed_9_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_9_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_9_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_9_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 9 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_9_total_wh_capacity_function_of_temperature_curve_name = var_speed_9_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_9_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 9 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_9_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_9_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_9_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 9 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_9_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_9_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_9_cop_function_of_temperature_curve_name = "object-list|Speed 9 COP Function of Temperature Curve Name"
        obj.speed_9_cop_function_of_temperature_curve_name = var_speed_9_cop_function_of_temperature_curve_name
        # object-list
        var_speed_9_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 9 COP Function of Air Flow Fraction Curve Name"
        obj.speed_9_cop_function_of_air_flow_fraction_curve_name = var_speed_9_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_9_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 9 COP Function of Water Flow Fraction Curve Name"
        obj.speed_9_cop_function_of_water_flow_fraction_curve_name = var_speed_9_cop_function_of_water_flow_fraction_curve_name
        # real
        var_rated_water_heating_capacity_at_speed_10 = 0.0001
        obj.rated_water_heating_capacity_at_speed_10 = var_rated_water_heating_capacity_at_speed_10
        # real
        var_rated_water_heating_cop_at_speed_10 = 0.0001
        obj.rated_water_heating_cop_at_speed_10 = var_rated_water_heating_cop_at_speed_10
        # real
        var_rated_sensible_heat_ratio_at_speed_10 = 0.75
        obj.rated_sensible_heat_ratio_at_speed_10 = var_rated_sensible_heat_ratio_at_speed_10
        # real
        var_speed_10_reference_unit_rated_air_flow_rate = 0.0
        obj.speed_10_reference_unit_rated_air_flow_rate = var_speed_10_reference_unit_rated_air_flow_rate
        # real
        var_speed_10_reference_unit_rated_water_flow_rate = 0.0
        obj.speed_10_reference_unit_rated_water_flow_rate = var_speed_10_reference_unit_rated_water_flow_rate
        # real
        var_speed_10_reference_unit_water_pump_input_power_at_rated_conditions = 0.0
        obj.speed_10_reference_unit_water_pump_input_power_at_rated_conditions = var_speed_10_reference_unit_water_pump_input_power_at_rated_conditions
        # object-list
        var_speed_10_total_wh_capacity_function_of_temperature_curve_name = "object-list|Speed 10 Total WH Capacity Function of Temperature Curve Name"
        obj.speed_10_total_wh_capacity_function_of_temperature_curve_name = var_speed_10_total_wh_capacity_function_of_temperature_curve_name
        # object-list
        var_speed_10_total_wh_capacity_function_of_air_flow_fraction_curve_name = "object-list|Speed 10 Total WH Capacity Function of Air Flow Fraction Curve Name"
        obj.speed_10_total_wh_capacity_function_of_air_flow_fraction_curve_name = var_speed_10_total_wh_capacity_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_10_total_wh_capacity_function_of_water_flow_fraction_curve_name = "object-list|Speed 10 Total WH Capacity Function of Water Flow Fraction Curve Name"
        obj.speed_10_total_wh_capacity_function_of_water_flow_fraction_curve_name = var_speed_10_total_wh_capacity_function_of_water_flow_fraction_curve_name
        # object-list
        var_speed_10_cop_function_of_temperature_curve_name = "object-list|Speed 10 COP Function of Temperature Curve Name"
        obj.speed_10_cop_function_of_temperature_curve_name = var_speed_10_cop_function_of_temperature_curve_name
        # object-list
        var_speed_10_cop_function_of_air_flow_fraction_curve_name = "object-list|Speed 10 COP Function of Air Flow Fraction Curve Name"
        obj.speed_10_cop_function_of_air_flow_fraction_curve_name = var_speed_10_cop_function_of_air_flow_fraction_curve_name
        # object-list
        var_speed_10_cop_function_of_water_flow_fraction_curve_name = "object-list|Speed 10 COP Function of Water Flow Fraction Curve Name"
        obj.speed_10_cop_function_of_water_flow_fraction_curve_name = var_speed_10_cop_function_of_water_flow_fraction_curve_name
        idf = IDF()
        idf.add(obj)
        idf.save(self.path, check=False)

        with open(self.path, mode='r') as f:
            for line in f:
                log.debug(line.strip())

        idf2 = IDF(self.path)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].name, var_name)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].number_of_speeds, var_number_of_speeds)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].nominal_speed_level, var_nominal_speed_level)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity, var_rated_water_heating_capacity)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_evaporator_inlet_air_drybulb_temperature, var_rated_evaporator_inlet_air_drybulb_temperature)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_evaporator_inlet_air_wetbulb_temperature, var_rated_evaporator_inlet_air_wetbulb_temperature)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_condenser_inlet_water_temperature, var_rated_condenser_inlet_water_temperature)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_evaporator_air_flow_rate, var_rated_evaporator_air_flow_rate)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_condenser_water_flow_rate, var_rated_condenser_water_flow_rate)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].evaporator_fan_power_included_in_rated_cop, var_evaporator_fan_power_included_in_rated_cop)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].condenser_pump_power_included_in_rated_cop, var_condenser_pump_power_included_in_rated_cop)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].condenser_pump_heat_included_in_rated_heating_capacity_and_rated_cop, var_condenser_pump_heat_included_in_rated_heating_capacity_and_rated_cop)
        self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].fraction_of_condenser_pump_heat_to_water, var_fraction_of_condenser_pump_heat_to_water)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].evaporator_air_inlet_node_name, var_evaporator_air_inlet_node_name)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].evaporator_air_outlet_node_name, var_evaporator_air_outlet_node_name)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].condenser_water_inlet_node_name, var_condenser_water_inlet_node_name)
        self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].condenser_water_outlet_node_name, var_condenser_water_outlet_node_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].crankcase_heater_capacity, var_crankcase_heater_capacity)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].maximum_ambient_temperature_for_crankcase_heater_operation, var_maximum_ambient_temperature_for_crankcase_heater_operation)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].evaporator_air_temperature_type_for_curve_objects, var_evaporator_air_temperature_type_for_curve_objects)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].part_load_fraction_correlation_curve_name, var_part_load_fraction_correlation_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_1, var_rated_water_heating_capacity_at_speed_1)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_1, var_rated_water_heating_cop_at_speed_1)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_1, var_rated_sensible_heat_ratio_at_speed_1)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_reference_unit_rated_air_flow_rate, var_speed_1_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_reference_unit_rated_water_flow_rate, var_speed_1_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_1_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_total_wh_capacity_function_of_temperature_curve_name, var_speed_1_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_1_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_1_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_cop_function_of_temperature_curve_name, var_speed_1_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_cop_function_of_air_flow_fraction_curve_name, var_speed_1_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_1_cop_function_of_water_flow_fraction_curve_name, var_speed_1_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_2, var_rated_water_heating_capacity_at_speed_2)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_2, var_rated_water_heating_cop_at_speed_2)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_2, var_rated_sensible_heat_ratio_at_speed_2)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_reference_unit_rated_air_flow_rate, var_speed_2_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_reference_unit_rated_water_flow_rate, var_speed_2_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_2_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_total_wh_capacity_function_of_temperature_curve_name, var_speed_2_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_2_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_2_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_cop_function_of_temperature_curve_name, var_speed_2_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_cop_function_of_air_flow_fraction_curve_name, var_speed_2_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_2_cop_function_of_water_flow_fraction_curve_name, var_speed_2_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_3, var_rated_water_heating_capacity_at_speed_3)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_3, var_rated_water_heating_cop_at_speed_3)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_3, var_rated_sensible_heat_ratio_at_speed_3)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_reference_unit_rated_air_flow_rate, var_speed_3_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_reference_unit_rated_water_flow_rate, var_speed_3_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_3_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_total_wh_capacity_function_of_temperature_curve_name, var_speed_3_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_3_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_3_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_cop_function_of_temperature_curve_name, var_speed_3_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_cop_function_of_air_flow_fraction_curve_name, var_speed_3_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_3_cop_function_of_water_flow_fraction_curve_name, var_speed_3_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_4, var_rated_water_heating_capacity_at_speed_4)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_4, var_rated_water_heating_cop_at_speed_4)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_4, var_rated_sensible_heat_ratio_at_speed_4)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_reference_unit_rated_air_flow_rate, var_speed_4_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_reference_unit_rated_water_flow_rate, var_speed_4_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_4_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_total_wh_capacity_function_of_temperature_curve_name, var_speed_4_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_4_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_4_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_cop_function_of_temperature_curve_name, var_speed_4_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_cop_function_of_air_flow_fraction_curve_name, var_speed_4_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_4_cop_function_of_water_flow_fraction_curve_name, var_speed_4_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_5, var_rated_water_heating_capacity_at_speed_5)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_5, var_rated_water_heating_cop_at_speed_5)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_5, var_rated_sensible_heat_ratio_at_speed_5)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_reference_unit_rated_air_flow_rate, var_speed_5_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_reference_unit_rated_water_flow_rate, var_speed_5_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_5_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_total_wh_capacity_function_of_temperature_curve_name, var_speed_5_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_5_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_5_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_cop_function_of_temperature_curve_name, var_speed_5_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_cop_function_of_air_flow_fraction_curve_name, var_speed_5_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_5_cop_function_of_water_flow_fraction_curve_name, var_speed_5_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_6, var_rated_water_heating_capacity_at_speed_6)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_6, var_rated_water_heating_cop_at_speed_6)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_6, var_rated_sensible_heat_ratio_at_speed_6)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_reference_unit_rated_air_flow_rate, var_speed_6_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_reference_unit_rated_water_flow_rate, var_speed_6_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_6_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_total_wh_capacity_function_of_temperature_curve_name, var_speed_6_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_6_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_6_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_cop_function_of_temperature_curve_name, var_speed_6_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_cop_function_of_air_flow_fraction_curve_name, var_speed_6_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_6_cop_function_of_water_flow_fraction_curve_name, var_speed_6_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_7, var_rated_water_heating_capacity_at_speed_7)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_7, var_rated_water_heating_cop_at_speed_7)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_7, var_rated_sensible_heat_ratio_at_speed_7)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_reference_unit_rated_air_flow_rate, var_speed_7_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_reference_unit_rated_water_flow_rate, var_speed_7_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_7_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_total_wh_capacity_function_of_temperature_curve_name, var_speed_7_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_7_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_7_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_cop_function_of_temperature_curve_name, var_speed_7_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_cop_function_of_air_flow_fraction_curve_name, var_speed_7_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_7_cop_function_of_water_flow_fraction_curve_name, var_speed_7_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_8, var_rated_water_heating_capacity_at_speed_8)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_8, var_rated_water_heating_cop_at_speed_8)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_8, var_rated_sensible_heat_ratio_at_speed_8)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_reference_unit_rated_air_flow_rate, var_speed_8_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_reference_unit_rated_water_flow_rate, var_speed_8_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_8_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_total_wh_capacity_function_of_temperature_curve_name, var_speed_8_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_8_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_8_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_cop_function_of_temperature_curve_name, var_speed_8_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_cop_function_of_air_flow_fraction_curve_name, var_speed_8_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_8_cop_function_of_water_flow_fraction_curve_name, var_speed_8_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_9, var_rated_water_heating_capacity_at_speed_9)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_9, var_rated_water_heating_cop_at_speed_9)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_9, var_rated_sensible_heat_ratio_at_speed_9)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_reference_unit_rated_air_flow_rate, var_speed_9_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_reference_unit_rated_water_flow_rate, var_speed_9_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_9_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_total_wh_capacity_function_of_temperature_curve_name, var_speed_9_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_9_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_9_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_cop_function_of_temperature_curve_name, var_speed_9_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_cop_function_of_air_flow_fraction_curve_name, var_speed_9_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_9_cop_function_of_water_flow_fraction_curve_name, var_speed_9_cop_function_of_water_flow_fraction_curve_name)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_capacity_at_speed_10, var_rated_water_heating_capacity_at_speed_10)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_water_heating_cop_at_speed_10, var_rated_water_heating_cop_at_speed_10)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].rated_sensible_heat_ratio_at_speed_10, var_rated_sensible_heat_ratio_at_speed_10)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_reference_unit_rated_air_flow_rate, var_speed_10_reference_unit_rated_air_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_reference_unit_rated_water_flow_rate, var_speed_10_reference_unit_rated_water_flow_rate)
self.assertAlmostEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_reference_unit_water_pump_input_power_at_rated_conditions, var_speed_10_reference_unit_water_pump_input_power_at_rated_conditions)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_total_wh_capacity_function_of_temperature_curve_name, var_speed_10_total_wh_capacity_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_total_wh_capacity_function_of_air_flow_fraction_curve_name, var_speed_10_total_wh_capacity_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_total_wh_capacity_function_of_water_flow_fraction_curve_name, var_speed_10_total_wh_capacity_function_of_water_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_cop_function_of_temperature_curve_name, var_speed_10_cop_function_of_temperature_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_cop_function_of_air_flow_fraction_curve_name, var_speed_10_cop_function_of_air_flow_fraction_curve_name)
self.assertEqual(idf2.coilwaterheatingairtowaterheatpumpvariablespeeds[0].speed_10_cop_function_of_water_flow_fraction_curve_name, var_speed_10_cop_function_of_water_flow_fraction_curve_name)
import numpy as np
soap_output = np.array(
[
[
2.52389602e-02,
2.25741119e-03,
6.75635080e-03,
-2.91014612e-03,
-7.73761514e-04,
5.32436618e-03,
-1.06447594e-02,
2.06517657e-02,
2.01906308e-04,
6.04298346e-04,
-2.60287918e-04,
-6.92064129e-05,
4.76219451e-04,
-9.52083555e-04,
1.84712550e-03,
1.80864329e-03,
-7.79032413e-04,
-2.07132314e-04,
1.42530775e-03,
-2.84955196e-03,
5.52838042e-03,
3.35550687e-04,
8.92175846e-05,
-6.13919252e-04,
1.22738041e-03,
-2.38122550e-03,
2.37215351e-05,
-1.63231353e-04,
3.26340906e-04,
-6.33129944e-04,
1.12321882e-03,
-2.24559951e-03,
4.35665977e-03,
4.48952340e-03,
-8.71006866e-03,
1.68982962e-02,
8.43020477e-35,
-2.07490727e-34,
6.53727610e-34,
-1.16218781e-33,
1.57393666e-33,
-1.62791609e-33,
8.00893297e-34,
-1.59564210e-34,
5.11109434e-34,
-1.60631514e-33,
2.86104984e-33,
-3.87574454e-33,
4.00146363e-33,
-1.97324975e-33,
3.95320210e-34,
5.10931373e-33,
-9.05627250e-33,
1.22554695e-32,
-1.26649838e-32,
6.09464605e-33,
-1.15865249e-33,
1.61236801e-32,
-2.18323081e-32,
2.24499830e-32,
-1.08264605e-32,
2.07225862e-33,
2.95650572e-32,
-3.03895173e-32,
1.46786634e-32,
-2.81986226e-33,
3.15051730e-32,
-1.54077680e-32,
3.02905713e-33,
8.08716172e-33,
-1.81097441e-33,
4.88368282e-34,
5.72342913e-08,
-1.05386527e-07,
4.63962589e-07,
-1.17602718e-06,
1.78325057e-06,
-2.25389174e-06,
1.41008911e-06,
1.34284764e-07,
1.94050102e-07,
-8.54302635e-07,
2.16543993e-06,
-3.28353125e-06,
4.15013128e-06,
-2.59642236e-06,
-2.47260944e-07,
3.76105441e-06,
-9.53331649e-06,
1.44556966e-05,
-1.82708901e-05,
1.14307101e-05,
1.08856256e-06,
2.41645330e-05,
-3.66415148e-05,
4.63120601e-05,
-2.89739433e-05,
-2.75922927e-06,
5.55607926e-05,
-7.02245740e-05,
4.39341896e-05,
4.18391451e-06,
8.87584673e-05,
-5.55294409e-05,
-5.28814656e-06,
3.47405594e-05,
3.30839221e-06,
3.15062832e-07,
4.17628365e-33,
-6.14955530e-33,
7.29169653e-33,
-1.16076217e-32,
1.32232087e-32,
-1.33852411e-32,
7.10774314e-33,
-1.92189626e-33,
9.14078920e-33,
-1.04793866e-32,
1.65681564e-32,
-1.85762273e-32,
1.91032179e-32,
-1.02022004e-32,
2.80373112e-33,
1.36237429e-32,
-2.19201776e-32,
2.57780671e-32,
-2.53689467e-32,
1.32980436e-32,
-3.46272508e-33,
3.55250141e-32,
-4.22319139e-32,
4.10143937e-32,
-2.14148195e-32,
5.51465043e-33,
5.12852694e-32,
-4.89397844e-32,
2.54752917e-32,
-6.43301697e-33,
4.83739438e-32,
-2.56969604e-32,
6.70482073e-33,
1.38913430e-32,
-3.69392974e-33,
1.01286327e-33,
5.92939973e-06,
-5.29997618e-06,
1.40599521e-05,
-5.10588099e-05,
4.63188533e-05,
4.98942139e-05,
-2.54958713e-05,
1.48952724e-05,
6.59107548e-06,
-1.04976221e-05,
1.63351913e-05,
-1.04885616e-05,
-1.27723434e-05,
1.75198783e-05,
-1.17024821e-05,
3.56504727e-05,
-1.53791981e-04,
1.44349986e-04,
1.53846352e-04,
-6.63403787e-05,
3.71195743e-05,
9.02907310e-04,
-8.87539336e-04,
-9.32744884e-04,
3.02849359e-04,
-1.53741584e-04,
8.77358523e-04,
9.20497986e-04,
-2.87044253e-04,
1.43233905e-04,
9.66242085e-04,
-3.05010638e-04,
1.53008516e-04,
1.24609572e-04,
-6.86296226e-05,
3.88196176e-05,
1.32479361e-32,
-1.62689776e-32,
1.96135012e-32,
-3.07400589e-32,
3.58573321e-32,
-3.59494519e-32,
1.57302323e-32,
-4.06602829e-33,
2.18972735e-32,
-2.70813507e-32,
4.12688749e-32,
-5.02829057e-32,
4.80711798e-32,
-2.07551916e-32,
5.24652013e-33,
3.41986192e-32,
-5.12882259e-32,
6.40056084e-32,
-6.00451159e-32,
2.60636626e-32,
-6.48209639e-33,
7.80318893e-32,
-9.54690399e-32,
9.11287461e-32,
-3.94922219e-32,
9.94795750e-33,
1.21381038e-31,
-1.13913929e-31,
4.96048429e-32,
-1.23569390e-32,
1.10731814e-31,
-4.84752337e-32,
1.23693311e-32,
2.13995771e-32,
-5.45922915e-33,
1.41861861e-33,
7.75742332e-06,
-9.96016066e-06,
8.04032950e-06,
1.03021694e-05,
1.60835612e-06,
-6.39733065e-05,
-2.78809365e-05,
3.84081571e-05,
1.28020507e-05,
-1.03071763e-05,
-1.37103230e-05,
-1.06703553e-06,
8.36523748e-05,
3.23723893e-05,
-5.00190581e-05,
8.35278934e-06,
1.01053437e-05,
2.85045274e-06,
-6.45114107e-05,
-3.29595444e-05,
3.89730964e-05,
3.07222543e-05,
-3.30871272e-05,
-1.38382532e-04,
8.38659230e-05,
7.58828647e-05,
7.31401058e-05,
9.71633201e-05,
-2.55668048e-04,
-4.34543826e-05,
6.95056365e-04,
-1.49081943e-04,
-3.94727234e-04,
9.57872250e-04,
3.84331489e-05,
2.26476694e-04,
1.25057425e-02,
1.54898562e-03,
2.77964685e-03,
-3.68559755e-03,
3.71889214e-03,
2.52055253e-03,
-6.23563770e-03,
1.04200514e-02,
1.11853273e-03,
1.38543642e-04,
2.48615864e-04,
-3.29645480e-04,
3.32623399e-04,
2.25442072e-04,
-5.57724969e-04,
9.31985332e-04,
3.34772837e-03,
4.14656156e-04,
7.44098372e-04,
-9.86617109e-04,
9.95529913e-04,
6.74740043e-04,
-1.66925085e-03,
2.78939870e-03,
-1.44195869e-03,
-1.78603811e-04,
-3.20503635e-04,
4.24963125e-04,
-4.28802115e-04,
-2.90629095e-04,
7.18992252e-04,
-1.20147074e-03,
-3.83393854e-04,
-4.74879095e-05,
-8.52168131e-05,
1.12990928e-04,
-1.14011654e-04,
-7.72736486e-05,
1.91168591e-04,
-3.19451939e-04,
2.63818920e-03,
3.26771252e-04,
5.86389357e-04,
-7.77507106e-04,
7.84530873e-04,
5.31731280e-04,
-1.31545904e-03,
2.19819552e-03,
-5.27440981e-03,
-6.53298672e-04,
-1.17234116e-03,
1.55443405e-03,
-1.56847634e-03,
-1.06306579e-03,
2.62993651e-03,
-4.39475079e-03,
1.02328171e-02,
1.26745665e-03,
2.27444454e-03,
-3.01573822e-03,
3.04298150e-03,
2.06244076e-03,
-5.10230720e-03,
8.52620150e-03,
-4.01714222e-36,
2.61482828e-35,
-9.20621978e-35,
4.42720280e-34,
-4.91644204e-34,
5.43749421e-34,
-3.62573641e-34,
1.20701285e-34,
1.02917788e-35,
-6.49073857e-35,
2.27617204e-34,
-1.08369636e-33,
1.20911208e-33,
-1.33610843e-33,
8.88765345e-34,
-2.94645815e-34,
-3.07359850e-35,
2.03567655e-34,
-7.00263053e-34,
3.43079227e-33,
-3.68587225e-33,
4.13717916e-33,
-2.86344772e-33,
9.94272116e-34,
6.05814335e-35,
-3.70422816e-34,
1.25575740e-33,
-6.00884068e-33,
6.49564767e-33,
-7.29510102e-33,
5.05349711e-33,
-1.75063834e-33,
-8.28300468e-35,
5.02574908e-34,
-1.70404085e-33,
8.12705738e-33,
-8.80921011e-33,
9.88439559e-33,
-6.83152688e-33,
2.35997930e-33,
7.31304743e-35,
-4.99330611e-34,
1.76255670e-33,
-8.61199159e-33,
9.46454969e-33,
-1.04988227e-32,
7.05638887e-33,
-2.37497769e-33,
-3.01437761e-35,
2.31320119e-34,
-9.11597869e-34,
4.36261936e-33,
-5.27110085e-33,
5.58148167e-33,
-3.29734923e-33,
9.40841736e-34,
4.09213714e-36,
-4.09835906e-35,
1.99905327e-34,
-9.12896909e-34,
1.28933639e-33,
-1.27256739e-33,
5.85652717e-34,
-9.63984909e-35,
-2.72603473e-08,
2.28498601e-08,
5.74320734e-08,
1.45228860e-07,
-4.45011877e-07,
-1.50767181e-06,
2.32531230e-06,
1.77945474e-07,
5.01949664e-08,
-4.20738572e-08,
-1.05750707e-07,
-2.67412505e-07,
8.19408350e-07,
2.77610314e-06,
-4.28163924e-06,
-3.27654193e-07,
-2.20982579e-07,
1.85229519e-07,
4.65565884e-07,
1.17727950e-06,
-3.60743285e-06,
-1.22217520e-05,
1.88498519e-05,
1.44249262e-06,
5.60134642e-07,
-4.69509726e-07,
-1.18009112e-06,
-2.98410416e-06,
9.14392489e-06,
3.09790333e-05,
-4.77795810e-05,
-3.65635197e-06,
-8.49351477e-07,
7.11933792e-07,
1.78941287e-06,
4.52490005e-06,
-1.38652487e-05,
-4.69745767e-05,
7.24498266e-05,
5.54425260e-06,
1.07351503e-06,
-8.99829626e-07,
-2.26168042e-06,
-5.71912609e-06,
1.75246093e-05,
5.93722567e-05,
-9.15710156e-05,
-7.00750941e-06,
-6.71616928e-07,
5.62955147e-07,
1.41496189e-06,
3.57802341e-06,
-1.09638188e-05,
-3.71447178e-05,
5.72890391e-05,
4.38406714e-06,
-6.39590223e-08,
5.36110085e-08,
1.34748806e-07,
3.40740190e-07,
-1.04409985e-06,
-3.53734359e-06,
5.45571556e-06,
4.17500864e-07,
-5.08977780e-34,
1.95759841e-33,
-5.02796716e-34,
3.53424284e-33,
-1.86699465e-33,
1.66812523e-33,
-8.95880635e-35,
-7.03380012e-35,
8.67271482e-34,
-3.08182717e-33,
9.34245901e-34,
-5.78232273e-33,
3.32217141e-33,
-3.06087577e-33,
3.91543964e-34,
4.21690831e-35,
-8.40067590e-34,
3.93755448e-33,
-1.10564731e-33,
6.54833392e-33,
-3.99981930e-33,
2.90471714e-33,
-3.46005677e-34,
-8.66389793e-35,
8.82296537e-34,
-4.92693379e-33,
7.12983052e-34,
-7.59641146e-33,
3.18455247e-33,
-2.01293412e-33,
-7.71629255e-34,
4.44468147e-34,
-4.90901232e-34,
4.35105822e-33,
1.70618383e-34,
5.69716314e-33,
-5.91264551e-34,
-4.93983249e-34,
2.15028780e-33,
-7.99551056e-34,
1.76671321e-33,
-7.64361829e-33,
2.68045452e-33,
-1.30273441e-32,
8.90883265e-33,
-6.54444973e-33,
1.38465017e-33,
-2.57213939e-35,
-1.28259758e-33,
4.75177638e-33,
-2.18776813e-33,
8.64516126e-33,
-6.80122972e-33,
5.30738529e-33,
-1.67780966e-33,
2.39470606e-34,
4.65773268e-34,
-1.51417741e-33,
8.30572702e-34,
-2.93132190e-33,
2.49983990e-33,
-2.05158700e-33,
7.50302166e-34,
-1.35108042e-34,
-2.79598120e-06,
5.53793433e-06,
-4.68134784e-06,
-2.06433558e-05,
2.63719781e-05,
2.44492342e-05,
-5.06609245e-07,
3.16079084e-05,
1.29268239e-06,
-1.86992086e-06,
4.09622785e-06,
-4.32142564e-06,
-4.70409874e-06,
4.46533570e-06,
1.82023702e-05,
-2.21265207e-05,
-7.97706345e-06,
1.65709456e-05,
-1.11989994e-05,
-7.43785696e-05,
8.36021507e-05,
8.73623422e-05,
1.86176005e-05,
8.17898545e-05,
4.31489581e-05,
-9.63790564e-05,
4.17056488e-05,
5.37766880e-04,
-5.25365990e-04,
-6.26591829e-04,
-2.76223792e-04,
-3.69021830e-04,
-4.19616234e-05,
9.46269577e-05,
-3.80399634e-05,
-5.41041954e-04,
5.20670339e-04,
6.29903785e-04,
2.92043182e-04,
3.49074659e-04,
-4.42411775e-05,
9.94817329e-05,
-4.09061489e-05,
-5.64694428e-04,
5.45855801e-04,
6.57595724e-04,
3.00470662e-04,
3.71147980e-04,
1.54521670e-05,
-3.25685269e-05,
2.03800350e-05,
1.53502487e-04,
-1.67034265e-04,
-1.79947001e-04,
-4.82781955e-05,
-1.53325774e-04,
-8.07272519e-06,
1.65897470e-05,
-1.18366999e-05,
-7.16574759e-05,
8.26533584e-05,
8.43009424e-05,
1.41587849e-05,
8.47284124e-05,
-6.37226342e-33,
6.43269443e-33,
-8.13571957e-33,
1.38642041e-32,
-1.21218413e-32,
9.47972030e-33,
-1.98631510e-33,
9.67045254e-34,
9.34637317e-33,
-8.50677575e-33,
1.14048641e-32,
-1.95160075e-32,
1.53237062e-32,
-1.37025914e-32,
3.61060296e-33,
-1.40885135e-33,
-1.09711361e-32,
9.42358181e-33,
-1.28764343e-32,
2.26526759e-32,
-1.70493245e-32,
1.60960945e-32,
-4.57627146e-33,
1.71560157e-33,
1.71139664e-32,
-1.54682875e-32,
2.07169720e-32,
-3.57237440e-32,
2.79772383e-32,
-2.50799357e-32,
6.62578181e-33,
-2.60881571e-33,
-2.04405298e-32,
1.69202159e-32,
-2.32791971e-32,
4.20835420e-32,
-3.11031888e-32,
2.99961055e-32,
-8.73978207e-33,
3.25915986e-33,
1.97059735e-32,
-1.72802692e-32,
2.29052740e-32,
-4.16242282e-32,
3.30177686e-32,
-2.92137410e-32,
7.60640604e-33,
-3.14274820e-33,
-8.09430587e-33,
7.04997533e-33,
-9.25168417e-33,
1.72355365e-32,
-1.38716210e-32,
1.21024224e-32,
-3.11634788e-33,
1.33151344e-33,
2.14189771e-33,
-1.93881074e-33,
2.49747236e-33,
-4.62387924e-33,
3.86164101e-33,
-3.21581268e-33,
7.68388263e-34,
-3.47013071e-34,
1.44043313e-06,
-2.05296631e-06,
-2.69430911e-06,
2.55628155e-05,
-1.68192852e-05,
-4.64402931e-05,
-3.06077562e-05,
1.87753288e-05,
-1.83809439e-06,
2.62326220e-06,
3.47325911e-06,
-3.29388468e-05,
2.15276319e-05,
6.09293350e-05,
3.78057987e-05,
-2.34305154e-05,
1.50642658e-06,
-2.14283449e-06,
-2.77609149e-06,
2.63558394e-05,
-1.75127413e-05,
-4.65897661e-05,
-3.34944969e-05,
2.02617735e-05,
1.51227491e-06,
-2.28002103e-06,
-4.06854898e-06,
3.80923449e-05,
-1.99534779e-05,
-1.07633895e-04,
1.20476602e-05,
1.07241389e-06,
1.12685428e-06,
-1.34836064e-06,
4.55046246e-07,
-3.26553293e-06,
-8.41328848e-06,
8.53700735e-05,
-1.15269329e-04,
5.32157684e-05,
-1.06226949e-05,
1.55307417e-05,
2.37566509e-05,
-2.23800843e-04,
1.31232422e-04,
5.27066014e-04,
8.72078121e-05,
-8.00258425e-05,
-8.01963671e-06,
1.05455155e-05,
6.20451684e-06,
-6.24766761e-05,
7.73576338e-05,
-1.59143886e-04,
4.83854849e-04,
-2.36767295e-04,
6.54690109e-06,
-9.51289966e-06,
-1.40557939e-05,
1.32614442e-04,
-7.97958496e-05,
-2.97022821e-04,
-7.46195544e-05,
5.81264600e-05,
6.19651495e-03,
7.67512411e-04,
1.37729714e-03,
-1.82618988e-03,
1.84268713e-03,
1.24891757e-03,
-3.08971837e-03,
5.16306847e-03,
9.50655821e-05,
1.70594706e-04,
-2.26195435e-04,
2.28238818e-04,
1.54693362e-04,
-3.82698535e-04,
6.39507716e-04,
3.06131336e-04,
-4.05906564e-04,
4.09573402e-04,
2.77596455e-04,
-6.86750584e-04,
1.14759336e-03,
5.38200828e-04,
-5.43062774e-04,
-3.68071511e-04,
9.10578360e-04,
-1.52162037e-03,
5.47968641e-04,
3.71396559e-04,
-9.18804253e-04,
1.53536624e-03,
2.51721347e-04,
-6.22737713e-04,
1.04062476e-03,
1.54060140e-03,
-2.57441927e-03,
4.30197880e-03,
7.96451866e-37,
-2.19980820e-36,
4.70042916e-36,
-1.13600968e-35,
9.60072395e-36,
-1.46785387e-35,
1.65202548e-35,
-7.52767122e-36,
9.66739606e-36,
-2.85573052e-35,
1.21641328e-34,
-1.25707620e-34,
1.47757573e-34,
-1.13148741e-34,
4.24195315e-35,
1.05246500e-34,
-4.81478083e-34,
5.75838444e-34,
-6.17477146e-34,
3.78175375e-34,
-1.12579078e-34,
2.48371307e-33,
-2.83534280e-33,
3.05608271e-33,
-1.90461587e-33,
5.91778324e-34,
3.65062552e-33,
-3.72682732e-33,
1.95558965e-33,
-4.59015966e-34,
3.90701573e-33,
-2.23992072e-33,
6.14837755e-34,
1.62677364e-33,
-5.93932746e-34,
2.67346254e-34,
1.29839388e-08,
-1.08832504e-08,
-2.73545498e-08,
-6.91716292e-08,
2.11956470e-07,
7.18094978e-07,
-1.10753221e-06,
-8.47543550e-08,
9.12243508e-09,
2.29288214e-08,
5.79802609e-08,
-1.77663756e-07,
-6.01913453e-07,
9.28343124e-07,
7.10418371e-08,
5.76305389e-08,
1.45730721e-07,
-4.46549686e-07,
-1.51288180e-06,
2.33334778e-06,
1.78560393e-07,
3.68510231e-07,
-1.12919312e-06,
-3.82563414e-06,
5.90035183e-06,
4.51526836e-07,
3.46008601e-06,
1.17225503e-05,
-1.80799232e-05,
-1.38357352e-06,
3.97152516e-05,
-6.12536247e-05,
-4.68745867e-06,
9.44726871e-05,
7.22956100e-06,
5.53245112e-07,
1.37310653e-33,
-4.12855810e-33,
3.27665793e-33,
-8.44834418e-33,
9.58348122e-33,
-7.57678446e-33,
3.87626604e-33,
-8.86151432e-34,
1.35052378e-32,
-1.00903451e-32,
2.66903387e-32,
-3.01188291e-32,
2.32003910e-32,
-1.19169626e-32,
2.70067998e-33,
8.08741985e-33,
-2.04307252e-32,
2.36725381e-32,
-1.84356968e-32,
9.64538612e-33,
-2.21827578e-33,
5.35740654e-32,
-6.06415254e-32,
4.73142735e-32,
-2.42837613e-32,
5.53184409e-33,
7.00682988e-32,
-5.45157339e-32,
2.85128723e-32,
-6.55582617e-33,
4.29557741e-32,
-2.23485180e-32,
5.15125907e-33,
1.18365354e-32,
-2.74854256e-33,
6.41419336e-34,
2.10368811e-06,
-4.61611892e-06,
2.26486114e-06,
2.45565050e-05,
-2.47162091e-05,
-2.86589442e-05,
-1.13135074e-05,
-1.88917827e-05,
1.02903238e-05,
-4.51880272e-06,
-5.71210873e-05,
5.59828793e-05,
6.65674175e-05,
2.90197064e-05,
3.97003512e-05,
3.70018738e-06,
1.73815637e-05,
-2.17186333e-05,
-2.05550332e-05,
-4.44370350e-07,
-2.52463254e-05,
3.51649650e-04,
-3.23619591e-04,
-4.08461060e-04,
-2.16295336e-04,
-1.85305119e-04,
3.09350457e-04,
3.76638475e-04,
1.78414779e-04,
2.02937497e-04,
4.74497728e-04,
2.49921585e-04,
2.17311218e-04,
1.69997881e-04,
5.59579025e-05,
1.88738098e-04,
6.20449794e-33,
-5.90949928e-33,
8.09227073e-33,
-1.27459290e-32,
1.00461064e-32,
-9.05669120e-33,
2.56140773e-33,
-8.31947015e-34,
6.20787482e-33,
-8.21361545e-33,
1.24113773e-32,
-1.05983395e-32,
8.66143166e-33,
-2.14562316e-33,
7.52143470e-34,
1.10831842e-32,
-1.67388618e-32,
1.37388425e-32,
-1.17981899e-32,
3.15576377e-33,
-1.03079058e-33,
2.65159867e-32,
-2.15720519e-32,
1.87334601e-32,
-5.04827767e-33,
1.72214468e-33,
1.91649393e-32,
-1.50351416e-32,
3.54083929e-33,
-1.35623217e-33,
1.33647712e-32,
-3.78347927e-33,
1.24057204e-33,
1.36896528e-33,
-3.66432889e-34,
1.22685505e-34,
2.76887304e-07,
-3.91700292e-07,
-4.88760558e-07,
4.64918180e-06,
-3.17911710e-06,
-7.54259106e-06,
-6.92243663e-06,
4.04735861e-06,
5.55002136e-07,
7.00189831e-07,
-6.65652614e-06,
4.51357763e-06,
1.10862426e-05,
9.48064446e-06,
-5.59390104e-06,
9.49900567e-07,
-8.99773426e-06,
5.77309130e-06,
1.74522789e-05,
9.11418815e-06,
-5.83434211e-06,
8.52441236e-05,
-5.48445648e-05,
-1.64209697e-04,
-8.80463255e-05,
5.60670817e-05,
3.68000971e-05,
9.42620466e-05,
7.37321143e-05,
-4.40450134e-05,
4.01972484e-04,
4.11112565e-05,
-4.80426090e-05,
2.83722561e-04,
-1.47870539e-04,
7.88560554e-05,
]
]
)
soap_output_norm = np.array(
[
[
2.52389602e-02,
2.25741119e-03,
6.75635080e-03,
-2.91014612e-03,
-7.73761514e-04,
5.32436618e-03,
-1.06447594e-02,
2.06517657e-02,
2.01906308e-04,
6.04298346e-04,
-2.60287918e-04,
-6.92064129e-05,
4.76219451e-04,
-9.52083555e-04,
1.84712550e-03,
1.80864329e-03,
-7.79032413e-04,
-2.07132314e-04,
1.42530775e-03,
-2.84955196e-03,
5.52838042e-03,
3.35550687e-04,
8.92175846e-05,
-6.13919252e-04,
1.22738041e-03,
-2.38122550e-03,
2.37215351e-05,
-1.63231353e-04,
3.26340906e-04,
-6.33129944e-04,
1.12321882e-03,
-2.24559951e-03,
4.35665977e-03,
4.48952340e-03,
-8.71006866e-03,
1.68982962e-02,
8.43020477e-35,
-2.07490727e-34,
6.53727610e-34,
-1.16218781e-33,
1.57393666e-33,
-1.62791609e-33,
8.00893297e-34,
-1.59564210e-34,
5.11109434e-34,
-1.60631514e-33,
2.86104984e-33,
-3.87574454e-33,
4.00146363e-33,
-1.97324975e-33,
3.95320210e-34,
5.10931373e-33,
-9.05627250e-33,
1.22554695e-32,
-1.26649838e-32,
6.09464605e-33,
-1.15865249e-33,
1.61236801e-32,
-2.18323081e-32,
2.24499830e-32,
-1.08264605e-32,
2.07225862e-33,
2.95650572e-32,
-3.03895173e-32,
1.46786634e-32,
-2.81986226e-33,
3.15051730e-32,
-1.54077680e-32,
3.02905713e-33,
8.08716172e-33,
-1.81097441e-33,
4.88368282e-34,
5.72342913e-08,
-1.05386527e-07,
4.63962589e-07,
-1.17602718e-06,
1.78325057e-06,
-2.25389174e-06,
1.41008911e-06,
1.34284764e-07,
1.94050102e-07,
-8.54302635e-07,
2.16543993e-06,
-3.28353125e-06,
4.15013128e-06,
-2.59642236e-06,
-2.47260944e-07,
3.76105441e-06,
-9.53331649e-06,
1.44556966e-05,
-1.82708901e-05,
1.14307101e-05,
1.08856256e-06,
2.41645330e-05,
-3.66415148e-05,
4.63120601e-05,
-2.89739433e-05,
-2.75922927e-06,
5.55607926e-05,
-7.02245740e-05,
4.39341896e-05,
4.18391451e-06,
8.87584673e-05,
-5.55294409e-05,
-5.28814656e-06,
3.47405594e-05,
3.30839221e-06,
3.15062832e-07,
4.17628365e-33,
-6.14955530e-33,
7.29169653e-33,
-1.16076217e-32,
1.32232087e-32,
-1.33852411e-32,
7.10774314e-33,
-1.92189626e-33,
9.14078920e-33,
-1.04793866e-32,
1.65681564e-32,
-1.85762273e-32,
1.91032179e-32,
-1.02022004e-32,
2.80373112e-33,
1.36237429e-32,
-2.19201776e-32,
2.57780671e-32,
-2.53689467e-32,
1.32980436e-32,
-3.46272508e-33,
3.55250141e-32,
-4.22319139e-32,
4.10143937e-32,
-2.14148195e-32,
5.51465043e-33,
5.12852694e-32,
-4.89397844e-32,
2.54752917e-32,
-6.43301697e-33,
4.83739438e-32,
-2.56969604e-32,
6.70482073e-33,
1.38913430e-32,
-3.69392974e-33,
1.01286327e-33,
5.92939973e-06,
-5.29997618e-06,
1.40599521e-05,
-5.10588099e-05,
4.63188533e-05,
4.98942139e-05,
-2.54958713e-05,
1.48952724e-05,
6.59107548e-06,
-1.04976221e-05,
1.63351913e-05,
-1.04885616e-05,
-1.27723434e-05,
1.75198783e-05,
-1.17024821e-05,
3.56504727e-05,
-1.53791981e-04,
1.44349986e-04,
1.53846352e-04,
-6.63403787e-05,
3.71195743e-05,
9.02907310e-04,
-8.87539336e-04,
-9.32744884e-04,
3.02849359e-04,
-1.53741584e-04,
8.77358523e-04,
9.20497986e-04,
-2.87044253e-04,
1.43233905e-04,
9.66242085e-04,
-3.05010638e-04,
1.53008516e-04,
1.24609572e-04,
-6.86296226e-05,
3.88196176e-05,
1.32479361e-32,
-1.62689776e-32,
1.96135012e-32,
-3.07400589e-32,
3.58573321e-32,
-3.59494519e-32,
1.57302323e-32,
-4.06602829e-33,
2.18972735e-32,
-2.70813507e-32,
4.12688749e-32,
-5.02829057e-32,
4.80711798e-32,
-2.07551916e-32,
5.24652013e-33,
3.41986192e-32,
-5.12882259e-32,
6.40056084e-32,
-6.00451159e-32,
2.60636626e-32,
-6.48209639e-33,
7.80318893e-32,
-9.54690399e-32,
9.11287461e-32,
-3.94922219e-32,
9.94795750e-33,
1.21381038e-31,
-1.13913929e-31,
4.96048429e-32,
-1.23569390e-32,
1.10731814e-31,
-4.84752337e-32,
1.23693311e-32,
2.13995771e-32,
-5.45922915e-33,
1.41861861e-33,
7.75742332e-06,
-9.96016066e-06,
8.04032950e-06,
1.03021694e-05,
1.60835612e-06,
-6.39733065e-05,
-2.78809365e-05,
3.84081571e-05,
1.28020507e-05,
-1.03071763e-05,
-1.37103230e-05,
-1.06703553e-06,
8.36523748e-05,
3.23723893e-05,
-5.00190581e-05,
8.35278934e-06,
1.01053437e-05,
2.85045274e-06,
-6.45114107e-05,
-3.29595444e-05,
3.89730964e-05,
3.07222543e-05,
-3.30871272e-05,
-1.38382532e-04,
8.38659230e-05,
7.58828647e-05,
7.31401058e-05,
9.71633201e-05,
-2.55668048e-04,
-4.34543826e-05,
6.95056365e-04,
-1.49081943e-04,
-3.94727234e-04,
9.57872250e-04,
3.84331489e-05,
2.26476694e-04,
1.25057425e-02,
1.54898562e-03,
2.77964685e-03,
-3.68559755e-03,
3.71889214e-03,
2.52055253e-03,
-6.23563770e-03,
1.04200514e-02,
1.11853273e-03,
1.38543642e-04,
2.48615864e-04,
-3.29645480e-04,
3.32623399e-04,
2.25442072e-04,
-5.57724969e-04,
9.31985332e-04,
3.34772837e-03,
4.14656156e-04,
7.44098372e-04,
-9.86617109e-04,
9.95529913e-04,
6.74740043e-04,
-1.66925085e-03,
2.78939870e-03,
-1.44195869e-03,
-1.78603811e-04,
-3.20503635e-04,
4.24963125e-04,
-4.28802115e-04,
-2.90629095e-04,
7.18992252e-04,
-1.20147074e-03,
-3.83393854e-04,
-4.74879095e-05,
-8.52168131e-05,
1.12990928e-04,
-1.14011654e-04,
-7.72736486e-05,
1.91168591e-04,
-3.19451939e-04,
2.63818920e-03,
3.26771252e-04,
5.86389357e-04,
-7.77507106e-04,
7.84530873e-04,
5.31731280e-04,
-1.31545904e-03,
2.19819552e-03,
-5.27440981e-03,
-6.53298672e-04,
-1.17234116e-03,
1.55443405e-03,
-1.56847634e-03,
-1.06306579e-03,
2.62993651e-03,
-4.39475079e-03,
1.02328171e-02,
1.26745665e-03,
2.27444454e-03,
-3.01573822e-03,
3.04298150e-03,
2.06244076e-03,
-5.10230720e-03,
8.52620150e-03,
-4.01714222e-36,
2.61482828e-35,
-9.20621978e-35,
4.42720280e-34,
-4.91644204e-34,
5.43749421e-34,
-3.62573641e-34,
1.20701285e-34,
1.02917788e-35,
-6.49073857e-35,
2.27617204e-34,
-1.08369636e-33,
1.20911208e-33,
-1.33610843e-33,
8.88765345e-34,
-2.94645815e-34,
-3.07359850e-35,
2.03567655e-34,
-7.00263053e-34,
3.43079227e-33,
-3.68587225e-33,
4.13717916e-33,
-2.86344772e-33,
9.94272116e-34,
6.05814335e-35,
-3.70422816e-34,
1.25575740e-33,
-6.00884068e-33,
6.49564767e-33,
-7.29510102e-33,
5.05349711e-33,
-1.75063834e-33,
-8.28300468e-35,
5.02574908e-34,
-1.70404085e-33,
8.12705738e-33,
-8.80921011e-33,
9.88439559e-33,
-6.83152688e-33,
2.35997930e-33,
7.31304743e-35,
-4.99330611e-34,
1.76255670e-33,
-8.61199159e-33,
9.46454969e-33,
-1.04988227e-32,
7.05638887e-33,
-2.37497769e-33,
-3.01437761e-35,
2.31320119e-34,
-9.11597869e-34,
4.36261936e-33,
-5.27110085e-33,
5.58148167e-33,
-3.29734923e-33,
9.40841736e-34,
4.09213714e-36,
-4.09835906e-35,
1.99905327e-34,
-9.12896909e-34,
1.28933639e-33,
-1.27256739e-33,
5.85652717e-34,
-9.63984909e-35,
-2.72603473e-08,
2.28498601e-08,
5.74320734e-08,
1.45228860e-07,
-4.45011877e-07,
-1.50767181e-06,
2.32531230e-06,
1.77945474e-07,
5.01949664e-08,
-4.20738572e-08,
-1.05750707e-07,
-2.67412505e-07,
8.19408350e-07,
2.77610314e-06,
-4.28163924e-06,
-3.27654193e-07,
-2.20982579e-07,
1.85229519e-07,
4.65565884e-07,
1.17727950e-06,
-3.60743285e-06,
-1.22217520e-05,
1.88498519e-05,
1.44249262e-06,
5.60134642e-07,
-4.69509726e-07,
-1.18009112e-06,
-2.98410416e-06,
9.14392489e-06,
3.09790333e-05,
-4.77795810e-05,
-3.65635197e-06,
-8.49351477e-07,
7.11933792e-07,
1.78941287e-06,
4.52490005e-06,
-1.38652487e-05,
-4.69745767e-05,
7.24498266e-05,
5.54425260e-06,
1.07351503e-06,
-8.99829626e-07,
-2.26168042e-06,
-5.71912609e-06,
1.75246093e-05,
5.93722567e-05,
-9.15710156e-05,
-7.00750941e-06,
-6.71616928e-07,
5.62955147e-07,
1.41496189e-06,
3.57802341e-06,
-1.09638188e-05,
-3.71447178e-05,
5.72890391e-05,
4.38406714e-06,
-6.39590223e-08,
5.36110085e-08,
1.34748806e-07,
3.40740190e-07,
-1.04409985e-06,
-3.53734359e-06,
5.45571556e-06,
4.17500864e-07,
-5.08977780e-34,
1.95759841e-33,
-5.02796716e-34,
3.53424284e-33,
-1.86699465e-33,
1.66812523e-33,
-8.95880635e-35,
-7.03380012e-35,
8.67271482e-34,
-3.08182717e-33,
9.34245901e-34,
-5.78232273e-33,
3.32217141e-33,
-3.06087577e-33,
3.91543964e-34,
4.21690831e-35,
-8.40067590e-34,
3.93755448e-33,
-1.10564731e-33,
6.54833392e-33,
-3.99981930e-33,
2.90471714e-33,
-3.46005677e-34,
-8.66389793e-35,
8.82296537e-34,
-4.92693379e-33,
7.12983052e-34,
-7.59641146e-33,
3.18455247e-33,
-2.01293412e-33,
-7.71629255e-34,
4.44468147e-34,
-4.90901232e-34,
4.35105822e-33,
1.70618383e-34,
5.69716314e-33,
-5.91264551e-34,
-4.93983249e-34,
2.15028780e-33,
-7.99551056e-34,
1.76671321e-33,
-7.64361829e-33,
2.68045452e-33,
-1.30273441e-32,
8.90883265e-33,
-6.54444973e-33,
1.38465017e-33,
-2.57213939e-35,
-1.28259758e-33,
4.75177638e-33,
-2.18776813e-33,
8.64516126e-33,
-6.80122972e-33,
5.30738529e-33,
-1.67780966e-33,
2.39470606e-34,
4.65773268e-34,
-1.51417741e-33,
8.30572702e-34,
-2.93132190e-33,
2.49983990e-33,
-2.05158700e-33,
7.50302166e-34,
-1.35108042e-34,
-2.79598120e-06,
5.53793433e-06,
-4.68134784e-06,
-2.06433558e-05,
2.63719781e-05,
2.44492342e-05,
-5.06609245e-07,
3.16079084e-05,
1.29268239e-06,
-1.86992086e-06,
4.09622785e-06,
-4.32142564e-06,
-4.70409874e-06,
4.46533570e-06,
1.82023702e-05,
-2.21265207e-05,
-7.97706345e-06,
1.65709456e-05,
-1.11989994e-05,
-7.43785696e-05,
8.36021507e-05,
8.73623422e-05,
1.86176005e-05,
8.17898545e-05,
4.31489581e-05,
-9.63790564e-05,
4.17056488e-05,
5.37766880e-04,
-5.25365990e-04,
-6.26591829e-04,
-2.76223792e-04,
-3.69021830e-04,
-4.19616234e-05,
9.46269577e-05,
-3.80399634e-05,
-5.41041954e-04,
5.20670339e-04,
6.29903785e-04,
2.92043182e-04,
3.49074659e-04,
-4.42411775e-05,
9.94817329e-05,
-4.09061489e-05,
-5.64694428e-04,
5.45855801e-04,
6.57595724e-04,
3.00470662e-04,
3.71147980e-04,
1.54521670e-05,
-3.25685269e-05,
2.03800350e-05,
1.53502487e-04,
-1.67034265e-04,
-1.79947001e-04,
-4.82781955e-05,
-1.53325774e-04,
-8.07272519e-06,
1.65897470e-05,
-1.18366999e-05,
-7.16574759e-05,
8.26533584e-05,
8.43009424e-05,
1.41587849e-05,
8.47284124e-05,
-6.37226342e-33,
6.43269443e-33,
-8.13571957e-33,
1.38642041e-32,
-1.21218413e-32,
9.47972030e-33,
-1.98631510e-33,
9.67045254e-34,
9.34637317e-33,
-8.50677575e-33,
1.14048641e-32,
-1.95160075e-32,
1.53237062e-32,
-1.37025914e-32,
3.61060296e-33,
-1.40885135e-33,
-1.09711361e-32,
9.42358181e-33,
-1.28764343e-32,
2.26526759e-32,
-1.70493245e-32,
1.60960945e-32,
-4.57627146e-33,
1.71560157e-33,
1.71139664e-32,
-1.54682875e-32,
2.07169720e-32,
-3.57237440e-32,
2.79772383e-32,
-2.50799357e-32,
6.62578181e-33,
-2.60881571e-33,
-2.04405298e-32,
1.69202159e-32,
-2.32791971e-32,
4.20835420e-32,
-3.11031888e-32,
2.99961055e-32,
-8.73978207e-33,
3.25915986e-33,
1.97059735e-32,
-1.72802692e-32,
2.29052740e-32,
-4.16242282e-32,
3.30177686e-32,
-2.92137410e-32,
7.60640604e-33,
-3.14274820e-33,
-8.09430587e-33,
7.04997533e-33,
-9.25168417e-33,
1.72355365e-32,
-1.38716210e-32,
1.21024224e-32,
-3.11634788e-33,
1.33151344e-33,
2.14189771e-33,
-1.93881074e-33,
2.49747236e-33,
-4.62387924e-33,
3.86164101e-33,
-3.21581268e-33,
7.68388263e-34,
-3.47013071e-34,
1.44043313e-06,
-2.05296631e-06,
-2.69430911e-06,
2.55628155e-05,
-1.68192852e-05,
-4.64402931e-05,
-3.06077562e-05,
1.87753288e-05,
-1.83809439e-06,
2.62326220e-06,
3.47325911e-06,
-3.29388468e-05,
2.15276319e-05,
6.09293350e-05,
3.78057987e-05,
-2.34305154e-05,
1.50642658e-06,
-2.14283449e-06,
-2.77609149e-06,
2.63558394e-05,
-1.75127413e-05,
-4.65897661e-05,
-3.34944969e-05,
2.02617735e-05,
1.51227491e-06,
-2.28002103e-06,
-4.06854898e-06,
3.80923449e-05,
-1.99534779e-05,
-1.07633895e-04,
1.20476602e-05,
1.07241389e-06,
1.12685428e-06,
-1.34836064e-06,
4.55046246e-07,
-3.26553293e-06,
-8.41328848e-06,
8.53700735e-05,
-1.15269329e-04,
5.32157684e-05,
-1.06226949e-05,
1.55307417e-05,
2.37566509e-05,
-2.23800843e-04,
1.31232422e-04,
5.27066014e-04,
8.72078121e-05,
-8.00258425e-05,
-8.01963671e-06,
1.05455155e-05,
6.20451684e-06,
-6.24766761e-05,
7.73576338e-05,
-1.59143886e-04,
4.83854849e-04,
-2.36767295e-04,
6.54690109e-06,
-9.51289966e-06,
-1.40557939e-05,
1.32614442e-04,
-7.97958496e-05,
-2.97022821e-04,
-7.46195544e-05,
5.81264600e-05,
6.19651495e-03,
7.67512411e-04,
1.37729714e-03,
-1.82618988e-03,
1.84268713e-03,
1.24891757e-03,
-3.08971837e-03,
5.16306847e-03,
9.50655821e-05,
1.70594706e-04,
-2.26195435e-04,
2.28238818e-04,
1.54693362e-04,
-3.82698535e-04,
6.39507716e-04,
3.06131336e-04,
-4.05906564e-04,
4.09573402e-04,
2.77596455e-04,
-6.86750584e-04,
1.14759336e-03,
5.38200828e-04,
-5.43062774e-04,
-3.68071511e-04,
9.10578360e-04,
-1.52162037e-03,
5.47968641e-04,
3.71396559e-04,
-9.18804253e-04,
1.53536624e-03,
2.51721347e-04,
-6.22737713e-04,
1.04062476e-03,
1.54060140e-03,
-2.57441927e-03,
4.30197880e-03,
7.96451866e-37,
-2.19980820e-36,
4.70042916e-36,
-1.13600968e-35,
9.60072395e-36,
-1.46785387e-35,
1.65202548e-35,
-7.52767122e-36,
9.66739606e-36,
-2.85573052e-35,
1.21641328e-34,
-1.25707620e-34,
1.47757573e-34,
-1.13148741e-34,
4.24195315e-35,
1.05246500e-34,
-4.81478083e-34,
5.75838444e-34,
-6.17477146e-34,
3.78175375e-34,
-1.12579078e-34,
2.48371307e-33,
-2.83534280e-33,
3.05608271e-33,
-1.90461587e-33,
5.91778324e-34,
3.65062552e-33,
-3.72682732e-33,
1.95558965e-33,
-4.59015966e-34,
3.90701573e-33,
-2.23992072e-33,
6.14837755e-34,
1.62677364e-33,
-5.93932746e-34,
2.67346254e-34,
1.29839388e-08,
-1.08832504e-08,
-2.73545498e-08,
-6.91716292e-08,
2.11956470e-07,
7.18094978e-07,
-1.10753221e-06,
-8.47543550e-08,
9.12243508e-09,
2.29288214e-08,
5.79802609e-08,
-1.77663756e-07,
-6.01913453e-07,
9.28343124e-07,
7.10418371e-08,
5.76305389e-08,
1.45730721e-07,
-4.46549686e-07,
-1.51288180e-06,
2.33334778e-06,
1.78560393e-07,
3.68510231e-07,
-1.12919312e-06,
-3.82563414e-06,
5.90035183e-06,
4.51526836e-07,
3.46008601e-06,
1.17225503e-05,
-1.80799232e-05,
-1.38357352e-06,
3.97152516e-05,
-6.12536247e-05,
-4.68745867e-06,
9.44726871e-05,
7.22956100e-06,
5.53245112e-07,
1.37310653e-33,
-4.12855810e-33,
3.27665793e-33,
-8.44834418e-33,
9.58348122e-33,
-7.57678446e-33,
3.87626604e-33,
-8.86151432e-34,
1.35052378e-32,
-1.00903451e-32,
2.66903387e-32,
-3.01188291e-32,
2.32003910e-32,
-1.19169626e-32,
2.70067998e-33,
8.08741985e-33,
-2.04307252e-32,
2.36725381e-32,
-1.84356968e-32,
9.64538612e-33,
-2.21827578e-33,
5.35740654e-32,
-6.06415254e-32,
4.73142735e-32,
-2.42837613e-32,
5.53184409e-33,
7.00682988e-32,
-5.45157339e-32,
2.85128723e-32,
-6.55582617e-33,
4.29557741e-32,
-2.23485180e-32,
5.15125907e-33,
1.18365354e-32,
-2.74854256e-33,
6.41419336e-34,
2.10368811e-06,
-4.61611892e-06,
2.26486114e-06,
2.45565050e-05,
-2.47162091e-05,
-2.86589442e-05,
-1.13135074e-05,
-1.88917827e-05,
1.02903238e-05,
-4.51880272e-06,
-5.71210873e-05,
5.59828793e-05,
6.65674175e-05,
2.90197064e-05,
3.97003512e-05,
3.70018738e-06,
1.73815637e-05,
-2.17186333e-05,
-2.05550332e-05,
-4.44370350e-07,
-2.52463254e-05,
3.51649650e-04,
-3.23619591e-04,
-4.08461060e-04,
-2.16295336e-04,
-1.85305119e-04,
3.09350457e-04,
3.76638475e-04,
1.78414779e-04,
2.02937497e-04,
4.74497728e-04,
2.49921585e-04,
2.17311218e-04,
1.69997881e-04,
5.59579025e-05,
1.88738098e-04,
6.20449794e-33,
-5.90949928e-33,
8.09227073e-33,
-1.27459290e-32,
1.00461064e-32,
-9.05669120e-33,
2.56140773e-33,
-8.31947015e-34,
6.20787482e-33,
-8.21361545e-33,
1.24113773e-32,
-1.05983395e-32,
8.66143166e-33,
-2.14562316e-33,
7.52143470e-34,
1.10831842e-32,
-1.67388618e-32,
1.37388425e-32,
-1.17981899e-32,
3.15576377e-33,
-1.03079058e-33,
2.65159867e-32,
-2.15720519e-32,
1.87334601e-32,
-5.04827767e-33,
1.72214468e-33,
1.91649393e-32,
-1.50351416e-32,
3.54083929e-33,
-1.35623217e-33,
1.33647712e-32,
-3.78347927e-33,
1.24057204e-33,
1.36896528e-33,
-3.66432889e-34,
1.22685505e-34,
2.76887304e-07,
-3.91700292e-07,
-4.88760558e-07,
4.64918180e-06,
-3.17911710e-06,
-7.54259106e-06,
-6.92243663e-06,
4.04735861e-06,
5.55002136e-07,
7.00189831e-07,
-6.65652614e-06,
4.51357763e-06,
1.10862426e-05,
9.48064446e-06,
-5.59390104e-06,
9.49900567e-07,
-8.99773426e-06,
5.77309130e-06,
1.74522789e-05,
9.11418815e-06,
-5.83434211e-06,
8.52441236e-05,
-5.48445648e-05,
-1.64209697e-04,
-8.80463255e-05,
5.60670817e-05,
3.68000971e-05,
9.42620466e-05,
7.37321143e-05,
-4.40450134e-05,
4.01972484e-04,
4.11112565e-05,
-4.80426090e-05,
2.83722561e-04,
-1.47870539e-04,
7.88560554e-05,
]
]
)
# File: server/src/users/passwords.py (repo: sz-piotr/fioletowe-pomarancze, license: MIT)
from passlib.hash import pbkdf2_sha256
def hash(password):
    return pbkdf2_sha256.hash(password)


def check(password, hash):
    return pbkdf2_sha256.verify(password, hash)
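# For illustration only: the passlib helpers above wrap PBKDF2-HMAC-SHA256. The
# same round-trip can be sketched with the standard library's hashlib, under the
# assumption of a 29000-iteration count (passlib also encodes the salt and
# iteration count into a single hash string, which this sketch does not).

```python
import hashlib
import hmac
import os


def hash_password(password, iterations=29000):
    """Return (salt, iterations, digest) for PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest


def check_password(password, salt, iterations, digest):
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)
```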
# File: pyccl/tests/test_parameters.py (repo: Jappenn/CCL, license: BSD-3-Clause)
import tempfile
import numpy as np
from numpy.testing import (
assert_raises, assert_no_warnings, assert_almost_equal)
import pytest
import pyccl as ccl
def test_parameters_lcdm_defaults():
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96)
assert np.allclose(cosmo['Omega_c'], 0.25)
assert np.allclose(cosmo['Omega_b'], 0.05)
assert np.allclose(cosmo['Omega_m'], 0.30)
assert np.allclose(cosmo['Omega_k'], 0)
assert np.allclose(cosmo['sqrtk'], 0)
assert np.allclose(cosmo['k_sign'], 0)
assert np.allclose(cosmo['w0'], -1)
assert np.allclose(cosmo['wa'], 0)
assert np.allclose(cosmo['H0'], 70)
assert np.allclose(cosmo['h'], 0.7)
assert np.allclose(cosmo['A_s'], 2.1e-9)
assert np.allclose(cosmo['n_s'], 0.96)
assert np.isnan(cosmo['sigma8'])
assert np.isnan(cosmo['z_star'])
assert np.allclose(cosmo['Neff'], 3.046)
assert cosmo['N_nu_mass'] == 0
assert np.allclose(cosmo['N_nu_rel'], 3.046)
assert np.allclose(cosmo['sum_nu_masses'], 0)
assert np.allclose(cosmo['m_nu'], 0)
assert np.allclose(cosmo['Omega_nu_mass'], 0)
assert np.allclose(cosmo['T_CMB'], ccl.physical_constants.T_CMB)
assert np.allclose(cosmo['bcm_ks'], 55.0)
assert np.allclose(cosmo['bcm_log10Mc'], np.log10(1.2e14))
assert np.allclose(cosmo['bcm_etab'], 0.5)
assert not cosmo['has_mgrowth']
assert cosmo['nz_mgrowth'] == 0
assert cosmo['z_mgrowth'] is None
assert cosmo['df_mgrowth'] is None
# these are defined in the code via some constants so
# going to test the total
# Omega_nu_rel
# Omega_g
# Omega_l
assert np.allclose(
cosmo['Omega_l'] + cosmo['Omega_m'] + cosmo['Omega_g'] +
cosmo['Omega_nu_rel'] + cosmo['Omega_nu_mass'] + cosmo['Omega_k'],
1)
@pytest.mark.parametrize('m_nu_type', ['normal', 'inverted', 'single'])
def test_parameters_nu(m_nu_type):
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
wa=0.01,
w0=-1,
Neff=3.046,
Omega_k=0.0,
m_nu=0.15,
m_nu_type=m_nu_type
)
if m_nu_type == 'inverted':
assert np.allclose(cosmo['m_nu'][1]**2 - cosmo['m_nu'][0]**2,
ccl.physical_constants.DELTAM12_sq,
atol=1e-4, rtol=0)
assert np.allclose(
cosmo['m_nu'][2]**2 - cosmo['m_nu'][0]**2,
ccl.physical_constants.DELTAM13_sq_neg, atol=1e-4, rtol=0)
elif m_nu_type == 'normal':
assert np.allclose(cosmo['m_nu'][1]**2 - cosmo['m_nu'][0]**2,
ccl.physical_constants.DELTAM12_sq,
atol=1e-4, rtol=0)
assert np.allclose(
cosmo['m_nu'][2]**2 - cosmo['m_nu'][0]**2,
ccl.physical_constants.DELTAM13_sq_pos, atol=1e-4, rtol=0)
elif m_nu_type == 'single':
assert np.allclose(cosmo['m_nu'][0], 0.15, atol=1e-4, rtol=0)
assert np.allclose(cosmo['m_nu'][1], 0., atol=1e-4, rtol=0)
assert np.allclose(cosmo['m_nu'][2], 0., atol=1e-4, rtol=0)
def test_parameters_nu_Nnurel_neg():
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.27, Omega_b=0.049,
h=0.67, sigma8=0.8, n_s=0.96, m_nu=[0.03, 0.02, 0.04],
Neff=3., m_nu_type='list')
def test_parameters_nu_list():
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
m_nu=[0.1, 0.01, 0.003],
m_nu_type='list')
assert np.allclose(cosmo['Omega_c'], 0.25)
assert np.allclose(cosmo['Omega_b'], 0.05)
assert np.allclose(cosmo['Omega_m'] - cosmo['Omega_nu_mass'], 0.30)
assert np.allclose(cosmo['Omega_k'], 0)
assert np.allclose(cosmo['sqrtk'], 0)
assert np.allclose(cosmo['k_sign'], 0)
assert np.allclose(cosmo['w0'], -1)
assert np.allclose(cosmo['wa'], 0)
assert np.allclose(cosmo['H0'], 70)
assert np.allclose(cosmo['h'], 0.7)
assert np.allclose(cosmo['A_s'], 2.1e-9)
assert np.allclose(cosmo['n_s'], 0.96)
assert np.isnan(cosmo['sigma8'])
assert np.isnan(cosmo['z_star'])
assert np.allclose(cosmo['T_CMB'], ccl.physical_constants.T_CMB)
assert np.allclose(cosmo['bcm_ks'], 55.0)
assert np.allclose(cosmo['bcm_log10Mc'], np.log10(1.2e14))
assert np.allclose(cosmo['bcm_etab'], 0.5)
assert not cosmo['has_mgrowth']
assert cosmo['nz_mgrowth'] == 0
assert cosmo['z_mgrowth'] is None
assert cosmo['df_mgrowth'] is None
# these are defined in the code via some constants so
# going to test the total
# Omega_nu_rel
# Omega_g
# Omega_l
assert np.allclose(
cosmo['Omega_l'] + cosmo['Omega_m'] + cosmo['Omega_g'] +
cosmo['Omega_nu_rel'] + cosmo['Omega_k'],
1)
assert np.allclose(cosmo['Neff'], 3.046)
assert cosmo['N_nu_mass'] == 3
assert np.allclose(cosmo['sum_nu_masses'], 0.1 + 0.01 + 0.003)
assert np.allclose(cosmo['m_nu'], [0.1, 0.01, 0.003])
def test_parameters_nu_normal():
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
m_nu=0.3,
m_nu_type='normal')
assert np.allclose(cosmo['Omega_c'], 0.25)
assert np.allclose(cosmo['Omega_b'], 0.05)
assert np.allclose(cosmo['Omega_m'] - cosmo['Omega_nu_mass'], 0.30)
assert np.allclose(cosmo['Omega_k'], 0)
assert np.allclose(cosmo['sqrtk'], 0)
assert np.allclose(cosmo['k_sign'], 0)
assert np.allclose(cosmo['w0'], -1)
assert np.allclose(cosmo['wa'], 0)
assert np.allclose(cosmo['H0'], 70)
assert np.allclose(cosmo['h'], 0.7)
assert np.allclose(cosmo['A_s'], 2.1e-9)
assert np.allclose(cosmo['n_s'], 0.96)
assert np.isnan(cosmo['sigma8'])
assert np.isnan(cosmo['z_star'])
assert np.allclose(cosmo['T_CMB'], ccl.physical_constants.T_CMB)
assert np.allclose(cosmo['bcm_ks'], 55.0)
assert np.allclose(cosmo['bcm_log10Mc'], np.log10(1.2e14))
assert np.allclose(cosmo['bcm_etab'], 0.5)
assert not cosmo['has_mgrowth']
assert cosmo['nz_mgrowth'] == 0
assert cosmo['z_mgrowth'] is None
assert cosmo['df_mgrowth'] is None
    # Omega_nu_rel, Omega_g and Omega_l are derived in the code from
    # physical constants, so test their total instead
assert np.allclose(
cosmo['Omega_l'] + cosmo['Omega_m'] + cosmo['Omega_g'] +
cosmo['Omega_nu_rel'] + cosmo['Omega_k'],
1)
assert np.allclose(cosmo['Neff'], 3.046)
assert cosmo['N_nu_mass'] == 3
assert np.allclose(cosmo['sum_nu_masses'], 0.3)
def test_parameters_nu_inverted():
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
m_nu=0.3,
m_nu_type='inverted')
assert np.allclose(cosmo['Omega_c'], 0.25)
assert np.allclose(cosmo['Omega_b'], 0.05)
assert np.allclose(cosmo['Omega_m'] - cosmo['Omega_nu_mass'], 0.30)
assert np.allclose(cosmo['Omega_k'], 0)
assert np.allclose(cosmo['sqrtk'], 0)
assert np.allclose(cosmo['k_sign'], 0)
assert np.allclose(cosmo['w0'], -1)
assert np.allclose(cosmo['wa'], 0)
assert np.allclose(cosmo['H0'], 70)
assert np.allclose(cosmo['h'], 0.7)
assert np.allclose(cosmo['A_s'], 2.1e-9)
assert np.allclose(cosmo['n_s'], 0.96)
assert np.isnan(cosmo['sigma8'])
assert np.isnan(cosmo['z_star'])
assert np.allclose(cosmo['T_CMB'], ccl.physical_constants.T_CMB)
assert np.allclose(cosmo['bcm_ks'], 55.0)
assert np.allclose(cosmo['bcm_log10Mc'], np.log10(1.2e14))
assert np.allclose(cosmo['bcm_etab'], 0.5)
assert not cosmo['has_mgrowth']
assert cosmo['nz_mgrowth'] == 0
assert cosmo['z_mgrowth'] is None
assert cosmo['df_mgrowth'] is None
    # Omega_nu_rel, Omega_g and Omega_l are derived in the code from
    # physical constants, so test their total instead
assert np.allclose(
cosmo['Omega_l'] + cosmo['Omega_m'] + cosmo['Omega_g'] +
cosmo['Omega_nu_rel'] + cosmo['Omega_k'],
1)
assert np.allclose(cosmo['Neff'], 3.046)
assert cosmo['N_nu_mass'] == 3
assert np.allclose(cosmo['sum_nu_masses'], 0.3)
def test_parameters_nu_equal():
cosmo = ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
m_nu=0.3,
m_nu_type='equal')
assert np.allclose(cosmo['Omega_c'], 0.25)
assert np.allclose(cosmo['Omega_b'], 0.05)
assert np.allclose(cosmo['Omega_m'] - cosmo['Omega_nu_mass'], 0.30)
assert np.allclose(cosmo['Omega_k'], 0)
assert np.allclose(cosmo['sqrtk'], 0)
assert np.allclose(cosmo['k_sign'], 0)
assert np.allclose(cosmo['w0'], -1)
assert np.allclose(cosmo['wa'], 0)
assert np.allclose(cosmo['H0'], 70)
assert np.allclose(cosmo['h'], 0.7)
assert np.allclose(cosmo['A_s'], 2.1e-9)
assert np.allclose(cosmo['n_s'], 0.96)
assert np.isnan(cosmo['sigma8'])
assert np.isnan(cosmo['z_star'])
assert np.allclose(cosmo['T_CMB'], ccl.physical_constants.T_CMB)
assert np.allclose(cosmo['bcm_ks'], 55.0)
assert np.allclose(cosmo['bcm_log10Mc'], np.log10(1.2e14))
assert np.allclose(cosmo['bcm_etab'], 0.5)
assert not cosmo['has_mgrowth']
assert cosmo['nz_mgrowth'] == 0
assert cosmo['z_mgrowth'] is None
assert cosmo['df_mgrowth'] is None
    # Omega_nu_rel, Omega_g and Omega_l are derived in the code from
    # physical constants, so test their total instead
assert np.allclose(
cosmo['Omega_l'] + cosmo['Omega_m'] + cosmo['Omega_g'] +
cosmo['Omega_nu_rel'] + cosmo['Omega_k'],
1)
assert np.allclose(cosmo['Neff'], 3.046)
assert cosmo['N_nu_mass'] == 3
assert np.allclose(cosmo['sum_nu_masses'], 0.3)
assert np.allclose(cosmo['m_nu'], 0.1)
@pytest.mark.parametrize('m_nu,kind', [(0.05, 'normal'), (0.09, 'inverted')])
def test_parameters_nu_unphysical_raises(m_nu, kind):
with pytest.raises(ValueError):
ccl.Cosmology(
Omega_c=0.25,
Omega_b=0.05,
h=0.7,
A_s=2.1e-9,
n_s=0.96,
m_nu=m_nu,
m_nu_type=kind)
def test_parameters_valid_input():
"""
Check that valid parameter arguments are accepted.
"""
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96)
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, Omega_k=0.05)
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, Neff=2.046)
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, Neff=3.046, m_nu=0.06)
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, w0=-0.9)
assert_no_warnings(ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, w0=-0.9, wa=0.1)
# Check that kwarg order doesn't matter
assert_no_warnings(ccl.Cosmology, h=0.7, Omega_c=0.25, Omega_b=0.05,
A_s=2.1e-9, n_s=0.96)
# Try a set of parameters with non-zero mu0 / Sig0
assert_no_warnings(ccl.Cosmology, h=0.7, Omega_c=0.25, Omega_b=0.05,
A_s=2.1e-9, n_s=0.96, mu_0=0.1, sigma_0=0.1)
def test_parameters_missing():
"""
Check that errors are raised when compulsory parameters are missing, but
not when non-compulsory ones are.
"""
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25)
# Check that a single missing compulsory parameter is noticed
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05,
h=0.7, A_s=2.1e-9)
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05,
h=0.7, n_s=0.96)
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05,
A_s=2.1e-9, n_s=0.96)
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25,
h=0.7, A_s=2.1e-9, n_s=0.96)
assert_raises(ValueError, ccl.Cosmology, Omega_b=0.05,
h=0.7, A_s=2.1e-9, n_s=0.96)
# Make sure that compulsory parameters are compulsory
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
Omega_k=None)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
w0=None)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
wa=None)
# Check that sigma8 vs A_s is handled ok.
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, n_s=0.8,
A_s=2.1e-9, sigma8=0.7)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, n_s=0.8)
# Make sure that optional parameters are optional
assert_no_warnings(
ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=None, df_mg=None)
assert_no_warnings(
ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=None)
assert_no_warnings(
ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
df_mg=None)
def test_parameters_set():
"""
Check that a Cosmology object doesn't let parameters be set.
"""
params = ccl.Cosmology(
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9,
n_s=0.96)
# Check that values of sigma8 and A_s won't be misinterpreted by the C code
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05,
h=0.7, A_s=2e-5, n_s=0.96)
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05,
h=0.7, sigma8=9e-6, n_s=0.96)
# Check that error is raised when unrecognized parameter requested
assert_raises(KeyError, lambda: params['wibble'])
def test_parameters_mgrowth():
"""
Check that valid modified growth inputs are allowed, and invalid ones are
rejected.
"""
zarr = np.linspace(0., 1., 15)
dfarr = 0.1 * np.ones(15)
def f_func(z):
return 0.1 * z
# Valid constructions
for omega_g in [None, 0.0, 0.1]:
assert_no_warnings(
ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr, df_mg=dfarr, Omega_g=omega_g)
assert_no_warnings(
ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=[0., 0.1, 0.2],
df_mg=[0.1, 0.1, 0.1], Omega_g=omega_g)
# Invalid constructions
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr, Omega_g=omega_g)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
df_mg=dfarr, Omega_g=omega_g)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=None,
df_mg=dfarr, Omega_g=omega_g)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr,
df_mg=0.1, Omega_g=omega_g)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr,
df_mg=f_func, Omega_g=omega_g)
# Mis-matched array sizes and dimensionality
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr,
df_mg=dfarr[1:], Omega_g=omega_g)
assert_raises(
ValueError, ccl.Cosmology,
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
z_mg=zarr,
df_mg=np.column_stack((dfarr, dfarr)), Omega_g=omega_g)
def test_parameters_read_write():
"""Check that Cosmology objects can be read and written"""
params = ccl.Cosmology(
Omega_c=0.25, Omega_b=0.05, h=0.7, A_s=2.1e-9, n_s=0.96,
m_nu=[0.02, 0.1, 0.05], m_nu_type='list',
z_mg=[0.0, 1.0], df_mg=[0.01, 0.0])
# Make a temporary file name
with tempfile.NamedTemporaryFile(delete=False) as tmpfile:
temp_file_name = tmpfile.name
# Write out and then read in the parameters from that file
assert_raises(IOError, params.write_yaml, "/bogus/file/name")
params.write_yaml(temp_file_name)
params2 = ccl.Cosmology.read_yaml(temp_file_name)
# Check the read-in params are equal to the written out ones
assert_almost_equal(params['Omega_c'], params2['Omega_c'])
assert_almost_equal(params['Neff'], params2['Neff'])
assert_almost_equal(params['sum_nu_masses'], params2['sum_nu_masses'])
# check overriding parameters with kwargs
params3 = ccl.Cosmology.read_yaml(temp_file_name,
matter_power_spectrum='emu',
n_s=1.1)
# check unmodified parameters are the same
assert_almost_equal(params['Omega_c'], params3['Omega_c'])
assert_almost_equal(params['Neff'], params3['Neff'])
# check new parameters and config correctly updated
assert_almost_equal(1.1, params3['n_s'])
assert_almost_equal(params['sum_nu_masses'], params3['sum_nu_masses'])
assert params3._config_init_kwargs['matter_power_spectrum'] == 'emu'
# Now make a file that will be deleted so it does not exist
    # and check the right error is raised
with tempfile.NamedTemporaryFile(delete=True) as tmpfile:
temp_file_name = tmpfile.name
assert_raises(IOError, ccl.Cosmology.read_yaml, filename=temp_file_name)
assert_raises(
IOError,
params.read_yaml,
filename=temp_file_name+"/nonexistent_directory/params.yml")
def test_omega_k():
""" Check that the value of Omega_k is within reasonable bounds. """
assert_raises(ValueError, ccl.Cosmology, Omega_c=0.25, Omega_b=0.05, h=0.7,
A_s=2.1e-9, n_s=0.96, Omega_k=-2)
# === collect_csv_data.py (repo: brickleq/CoW, license: MIT) ===
#%%
import io
import os
import pandas as pd
import requests
import sys
if sys.version_info >= (3, 6):
    import zipfile
else:
    import zipfile36 as zipfile

# Bind ZipFile from whichever module was imported above; a plain
# `from zipfile import ZipFile` would bypass the zipfile36 fallback.
ZipFile = zipfile.ZipFile
# %%
csv_urls = [['states2016.csv','http://www.correlatesofwar.org/data-sets/state-system-membership/states2016'],
['majors2016.csv','http://www.correlatesofwar.org/data-sets/state-system-membership/majors2016'],
['system2016.csv','http://www.correlatesofwar.org/data-sets/state-system-membership/system2016'],
['Non-StateWarData_v4.0.csv','http://www.correlatesofwar.org/data-sets/COW-war/non-state-war-data-1'],
['Intra-StateWarData_v4.1.csv','http://www.correlatesofwar.org/data-sets/COW-war/intra-state-war-data-v4-1'],
['Inter-StateWarData_v4.0.csv','http://www.correlatesofwar.org/data-sets/COW-war/inter-state-war-data'],
['Extra-StateWarData_v4.0.csv','http://www.correlatesofwar.org/data-sets/COW-war/extra-state-war-data'],
['MIDLOCA_2_0.csv','http://www.correlatesofwar.org/data-sets/MIDLOC/midloca-2.0/at_download/file'],
['MIDLOCI_2_0.csv','http://www.correlatesofwar.org/data-sets/MIDLOC/midloci-2.0/at_download/file'],
['MIDLOC_1.1.csv','http://www.correlatesofwar.org/data-sets/militarized-interstate-dispute-locations-v1-1/MIDLOC_1.1.csv/at_download/file'],
['WRP_national.csv','http://www.correlatesofwar.org/data-sets/world-religion-data/wrp-national-data-1/at_download/file'],
['WRP_regional.csv','http://www.correlatesofwar.org/data-sets/world-religion-data/wrp-regional-data/at_download/file'],
['WRP_global.csv','http://www.correlatesofwar.org/data-sets/world-religion-data/wrp-global-data/at_download/file'],
['Diplomatic_Exchange_2006v1.csv','http://www.correlatesofwar.org/data-sets/diplomatic-exchange/diplomatic-exchange-v2006-1-data/at_download/file','Data of diplomatic exchange']]
# for url in csv_urls:
# print(url)
#%%
zip_urls = [['Dyadid_Interstate_War_Dataset.zip','http://www.correlatesofwar.org/data-sets/COW-war/dyadic-inter-state-war-dataset/at_download/file'],
['MID_4_3.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-level-v4-3-data-files/at_download/file'],
['Incident-level.zip','http://www.correlatesofwar.org/data-sets/MIDs/incident-level/at_download/file'],
['MIDS_210.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1/at_download/file'],
['MID_2.1EE.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1ee/at_download/file'],
['NMC_5_0.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-v5-1/at_download/file'],
['NMC_5_0-wsupplementary.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-v5-supplemental/at_download/file'],
['NMC_Summplement_v4_0.csv.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-supplement-v4-csv/at_download/file'],
['version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['Territorial Change Data.zip','http://www.correlatesofwar.org/data-sets/territorial-change/territorial-change-data-1816-2018/at_download/file','Territorial Change Data 1816-2018'],
['DirectContiguity320.zip','http://www.correlatesofwar.org/data-sets/direct-contiguity/direct-contiguity-v3-2/at_download/file','Direct Contiguity (v3.2)'],
['ColonialContiguity310.zip','http://www.correlatesofwar.org/data-sets/colonial-dependency-contiguity/colonial-dependency-contiguity-v3-1/at_download/file','Colonial/Dependency Contiguity (v3.1)'],
['IGO_igounit_v3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_igounit_v3.zip/at_download/file','Intergovernmental Organizations - IGO-year level (v3)'],
['IGO_stateunit_v3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_stateunit_v3.zip/at_download/file','Intergovernmental Organizations - country-year level (v3)'],
['dyadic_formatv3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_dyadunit_v3.zip/at_download/file','Intergovernmental Oganizations - dyad-year level (v3)'],
['IGO_igounity_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/IGO_igounit_v2.3.zip/at_download/file','Data on IGOs from 1815-2005, at the IGO-year level. Contains one record per IGO-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['IGO_stateunit_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/IGO_stateunit_v2.3.zip/at_download/file','Data on IGOs from 1815-2005, at the country-year level. Contains one record per country-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['IGO_dyadunit_csv_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_dyadunit_csv_v2-3.zip/at_download/file','Data on IGOs from 1815-2005, at the dyad-year level. Contains one record per dyad-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['COW_Trade_4.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_4.0/at_download/file','International Trade, 1870-2014 (v4.0)'],
['COW_Trade_3.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_3.0/at_download/file','International Trade (v3.0)'],
['COW_Trade_2.01.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_2.01/at_download/file','International Trade (v2.01)'],
['kinne_dca.zip','http://www.correlatesofwar.org/data-sets/defense-cooperation-agreement-dataset/dcad-1980-2011-1/at_download/file','Zip file containing the DCAD dyadic and main data in .csv format and the codebook.']]
# for zip in zip_urls:
# url = zip[1]
# response = requests.get(url, stream=True)
# archive = ZipFile(io.BytesIO(response.content))
# print(zip[0])
# print(archive.namelist())
# print()
#%%
zipped_csvs = [['directed_dyadic_war.csv','Dyadid_Interstate_War_Dataset.zip','http://www.correlatesofwar.org/data-sets/COW-war/dyadic-inter-state-war-dataset/at_download/file'],
['MID 4.3/MIDA 4.3.csv','MID_4_3.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-level-v4-3-data-files/at_download/file'],
['MID 4.3/MIDB 4.3.csv','MID_4_3.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-level-v4-3-data-files/at_download/file'],
['MID 4.3/MIDI 4.3.csv','MID_4_3.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-level-v4-3-data-files/at_download/file'],
['MID 4.3/MIDP 4.3.csv','MID_4_3.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-level-v4-3-data-files/at_download/file'],
['MIDI_4.01.csv','Incident-level.zip','http://www.correlatesofwar.org/data-sets/MIDs/incident-level/at_download/file'],
['MIDIP_4.01.csv','Incident-level.zip','http://www.correlatesofwar.org/data-sets/MIDs/incident-level/at_download/file'],
['MIDA_210.TXT','MIDS_210.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1/at_download/file'],
['MIDB_210.TXT','MIDS_210.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1/at_download/file'],
['MIDC_210.TXT','MIDS_210.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1/at_download/file'],
['MID 2.1EE.csv','MID_2.1EE.zip','http://www.correlatesofwar.org/data-sets/MIDs/mid-v-2.1ee/at_download/file'],
['NMC_5_0.csv','NMC_5_0.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-v5-1/at_download/file'],
['NMC_5_0-wsupplementary.csv','NMC_5_0-wsupplementary.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-v5-supplemental/at_download/file'],
['NMC_Supplement_v4_0.csv','NMC_Summplement_v4_0.csv.zip','http://www.correlatesofwar.org/data-sets/national-material-capabilities/nmc-supplement-v4-csv/at_download/file'],
['version4.1_csv/alliance_v4.1_by_directed.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['version4.1_csv/alliance_v4.1_by_directed_yearly.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['version4.1_csv/alliance_v4.1_by_dyad.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['version4.1_csv/alliance_v4.1_by_dyad_yearly.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['version4.1_csv/alliance_v4.1_by_member.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['version4.1_csv/alliance_v4.1_by_member_yearly.csv','version4.1_csv.zip','http://www.correlatesofwar.org/data-sets/formal-alliances/alliances-data-csv-zip/at_download/file','Alliances data.csv ZIP'],
['Territorial Change Data/Territorial Change Data, 1816-2018.xls','Territorial Change Data.zip','http://www.correlatesofwar.org/data-sets/territorial-change/territorial-change-data-1816-2018/at_download/file','Territorial Change Data 1816-2018'],
['DirectContiguity320/contdir.csv','DirectContiguity320.zip','http://www.correlatesofwar.org/data-sets/direct-contiguity/direct-contiguity-v3-2/at_download/file','Direct Contiguity (v3.2)'],
['DirectContiguity320/contdird.csv','DirectContiguity320.zip','http://www.correlatesofwar.org/data-sets/direct-contiguity/direct-contiguity-v3-2/at_download/file','Direct Contiguity (v3.2)'],
['DirectContiguity320/contdirs.csv','DirectContiguity320.zip','http://www.correlatesofwar.org/data-sets/direct-contiguity/direct-contiguity-v3-2/at_download/file','Direct Contiguity (v3.2)'],
['ColonialContiguity310/contcol.csv','ColonialContiguity310.zip','http://www.correlatesofwar.org/data-sets/colonial-dependency-contiguity/colonial-dependency-contiguity-v3-1/at_download/file','Colonial/Dependency Contiguity (v3.1)'],
['ColonialContiguity310/contcold.csv','ColonialContiguity310.zip','http://www.correlatesofwar.org/data-sets/colonial-dependency-contiguity/colonial-dependency-contiguity-v3-1/at_download/file','Colonial/Dependency Contiguity (v3.1)'],
['ColonialContiguity310/contcols.csv','ColonialContiguity310.zip','http://www.correlatesofwar.org/data-sets/colonial-dependency-contiguity/colonial-dependency-contiguity-v3-1/at_download/file','Colonial/Dependency Contiguity (v3.1)'],
['igo_year_formatv3.csv','IGO_igounit_v3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_igounit_v3.zip/at_download/file','Intergovernmental Organizations - IGO-year level (v3)'],
['state_year_formatv3.csv','IGO_stateunit_v3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_stateunit_v3.zip/at_download/file','Intergovernmental Organizations - country-year level (v3)'],
['dyadic_formatv3.csv','dyadic_formatv3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_dyadunit_v3.zip/at_download/file','Intergovernmental Oganizations - dyad-year level (v3)'],
['igounit_v2.3.csv','IGO_igounity_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/IGO_igounit_v2.3.zip/at_download/file','Data on IGOs from 1815-2005, at the IGO-year level. Contains one record per IGO-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['IGO_stateunit_v2.3.csv','IGO_stateunit_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/IGO_stateunit_v2.3.zip/at_download/file','Data on IGOs from 1815-2005, at the country-year level. Contains one record per country-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['IGO_dyadunit_v2.3.csv','IGO_dyadunit_csv_v2.3.zip','http://www.correlatesofwar.org/data-sets/IGOs/igo_dyadunit_csv_v2-3.zip/at_download/file','Data on IGOs from 1815-2005, at the dyad-year level. Contains one record per dyad-year (with years listed at 5 year intervals through 1965, and annually thereafter).'],
['COW_Trade_4.0/Dyadic_COW_4.0.csv','COW_Trade_4.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_4.0/at_download/file','International Trade, 1870-2014 (v4.0)'],
['COW_Trade_4.0/National_COW_4.0.csv','COW_Trade_4.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_4.0/at_download/file','International Trade, 1870-2014 (v4.0)'],
['COW_Trade_3.0/dyadic_trade_3.0.csv','COW_Trade_3.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_3.0/at_download/file','International Trade (v3.0)'],
['COW_Trade_3.0/national_trade_3.0.csv','COW_Trade_3.0.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_3.0/at_download/file','International Trade (v3.0)'],
['COW_Trade_2.01/dyadic_trade_2.01.csv','COW_Trade_2.01.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_2.01/at_download/file','International Trade (v2.01)'],
['COW_Trade_2.01/national_trade_2.0.csv','COW_Trade_2.01.zip','http://www.correlatesofwar.org/data-sets/bilateral-trade/cow_trade_2.01/at_download/file','International Trade (v2.01)'],
['kinne_dca/DCAD-v1.0-dyadic.csv','kinne_dca.zip','http://www.correlatesofwar.org/data-sets/defense-cooperation-agreement-dataset/dcad-1980-2011-1/at_download/file','Zip file containing the DCAD dyadic and main data in .csv format and the codebook.'],
['kinne_dca/DCAD-v1.0-main.csv','kinne_dca.zip','http://www.correlatesofwar.org/data-sets/defense-cooperation-agreement-dataset/dcad-1980-2011-1/at_download/file','Zip file containing the DCAD dyadic and main data in .csv format and the codebook.']]
#%%
os.makedirs('Resources', exist_ok=True)
# for url in zip_urls:
# print(url)
#%%
print('Retrieving CSV datasets from http://www.correlatesofwar.org....\n')
no_luck = 0
for csv in csv_urls:
try:
filename = str(csv[0])
path = 'Resources/' + filename
data = pd.read_csv(csv[1])
data.to_csv(path)
print('Success saving ' + filename + '.\n')
    except Exception:
print('ERROR saving ' + filename + '... trying again....')
try:
filename = str(csv[0])
path = 'Resources/' + filename
data = pd.read_csv(csv[1], encoding='ISO-8859-1')
data.to_csv(path)
print('Success saving ' + filename + '\n')
        except Exception:
print('Sorry, NO LUCK saving ' + filename + '.')
no_luck += 1
print('\nSuccessfully saved ' + str(len(csv_urls)-no_luck) + ' CSV files.')
if no_luck > 0:
print('Failed to save ' + str(no_luck) + ' files in CSV format. Please download them manually to the /Resources/ folder.\n')
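The nested `try`/`except` above retries `pd.read_csv` with `encoding='ISO-8859-1'` when the default UTF-8 decode fails. The same fallback idea as a minimal, self-contained sketch (helper name hypothetical; shown at the bytes level so it needs no network access or pandas):

```python
def decode_with_fallback(raw, encodings=("utf-8", "iso-8859-1")):
    """Try each encoding in turn; re-raise the last failure if none works."""
    last_err = None
    for enc in encodings:
        try:
            return raw.decode(enc)
        except UnicodeDecodeError as err:
            last_err = err
    raise last_err


# 0xE9 is invalid UTF-8 here but decodes as 'é' in ISO-8859-1.
assert decode_with_fallback(b"caf\xe9") == "caf\u00e9"
```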
#%%
print('Extracting CSV datasets from ZIP archives at http://www.correlatesofwar.org....\n')
no_luck = 0
for csv in zipped_csvs:
try:
filename = str(csv[0])
zipname = str(csv[1])
url = csv[2]
response = requests.get(url, stream=True)
archive = ZipFile(io.BytesIO(response.content))
archive.extract(filename, path='Resources/')
print('Success extracting ' + filename + ' from ' + zipname + '.\n')
    except Exception:
print('ERROR extracting ' + filename + ' from ' + zipname + '.\n')
no_luck += 1
print('\nSuccessfully extracted and saved ' + str(len(zipped_csvs)-no_luck) + ' CSV files.')
if no_luck > 0:
print('\nFailed to extract ' + str(no_luck) + ' CSV files from ZIP archives. Please download them manually to the /Resources/ folder.\n')
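`ZipFile(io.BytesIO(response.content))` loads the whole downloaded archive into memory and then extracts a single named member to disk. The same pattern with a synthetic in-memory archive standing in for the HTTP response (self-contained, no `requests`):

```python
import io
import tempfile
import zipfile

# Build an in-memory ZIP standing in for response.content.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("data/example.csv", "a,b\n1,2\n")

archive = zipfile.ZipFile(io.BytesIO(buf.getvalue()))
with tempfile.TemporaryDirectory() as tmp:
    # extract() recreates the member's internal path under `path`
    # and returns the path of the file it wrote.
    out_path = archive.extract("data/example.csv", path=tmp)
    with open(out_path) as fh:
        assert fh.read() == "a,b\n1,2\n"
```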
# === tests/algorithms/shoebox/test_find_overlapping.py (repo: dials-src/dials, license: BSD-3-Clause) ===
from __future__ import annotations
import random
from dials.algorithms.shoebox import find_overlapping
def test_single_panel():
from dials.array_family import flex
nrefl = 1000
# Generate bboxes
bbox = flex.int6(nrefl)
for i in range(nrefl):
x0 = random.randint(0, 500)
y0 = random.randint(0, 500)
z0 = random.randint(0, 10)
x1 = x0 + random.randint(2, 10)
y1 = y0 + random.randint(2, 10)
z1 = z0 + random.randint(2, 10)
bbox[i] = (x0, x1, y0, y1, z0, z1)
# Find the overlaps
overlaps = find_overlapping(bbox)
assert overlaps.num_vertices() == nrefl
overlaps2 = brute_force(bbox)
assert overlaps.num_edges() == len(overlaps2)
edges = {}
for edge in overlaps2:
edge = (min(edge), max(edge))
edges[edge] = None
for edge in overlaps.edges():
edge = overlaps.source(edge), overlaps.target(edge)
edge = (min(edge), max(edge))
assert edge in edges
def test_multiple_panels():
from dials.array_family import flex
nrefl = 1000
# Generate bboxes
bbox = flex.int6(nrefl)
panel = flex.size_t(nrefl)
for i in range(nrefl):
x0 = random.randint(0, 500)
y0 = random.randint(0, 500)
z0 = random.randint(0, 10)
x1 = x0 + random.randint(2, 10)
y1 = y0 + random.randint(2, 10)
z1 = z0 + random.randint(2, 10)
bbox[i] = (x0, x1, y0, y1, z0, z1)
panel[i] = random.randint(0, 2)
# Find the overlaps
overlaps = find_overlapping(bbox, panel)
assert overlaps.num_vertices() == nrefl
overlaps2 = brute_force(bbox, panel)
assert overlaps.num_edges() == len(overlaps2)
edges = {}
for edge in overlaps2:
edge = (min(edge), max(edge))
edges[edge] = None
for edge in overlaps.edges():
edge = (overlaps.source(edge), overlaps.target(edge))
edge = (min(edge), max(edge))
assert edge in edges
def brute_force(bbox, panel=None):
    overlaps = []
    for j in range(len(bbox) - 1):
        jx0, jx1, jy0, jy1, jz0, jz1 = bbox[j]
        for i in range(j + 1, len(bbox)):
            # Boxes on different panels can never overlap.
            if panel is not None and panel[j] != panel[i]:
                continue
            ix0, ix1, iy0, iy1, iz0, iz1 = bbox[i]
            if not (
                ix0 >= jx1
                or jx0 >= ix1
                or iy0 >= jy1
                or jy0 >= iy1
                or iz0 >= jz1
                or jz0 >= iz1
            ):
                overlaps.append((i, j))
    return overlaps
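The `if not (... >= ... or ...)` condition in `brute_force` is the standard separating-axis test: by De Morgan's law it is equivalent to requiring overlap on each of the three axes. As a self-contained sketch (function name hypothetical):

```python
def boxes_overlap(a, b):
    """True if two half-open (x0, x1, y0, y1, z0, z1) boxes intersect."""
    ax0, ax1, ay0, ay1, az0, az1 = a
    bx0, bx1, by0, by1, bz0, bz1 = b
    # Overlap on every axis: each box starts before the other one ends.
    return (ax0 < bx1 and bx0 < ax1
            and ay0 < by1 and by0 < ay1
            and az0 < bz1 and bz0 < az1)


assert boxes_overlap((0, 5, 0, 5, 0, 5), (4, 8, 4, 8, 4, 8))
# Boxes that merely touch at a face do not count as overlapping.
assert not boxes_overlap((0, 5, 0, 5, 0, 5), (5, 8, 0, 5, 0, 5))
```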
# === tests/helpers.py (repo: kartikeyas00/sqlalchemy-datatables, license: MIT) ===
from random import shuffle
def create_dt_params(columns, search="", start=0, length=10, order=None):
    """Create DataTables input parameters for when the data source for the
    row data object/array is not set.

    Read more about setting the column data source here:
    https://datatables.net/reference/option/columns.data
    """
    params = {
        "draw": "1",
        "start": str(start),
        "length": str(length),
        "search[value]": str(search),
        "search[regex]": "false",
    }
    for i, item in enumerate(columns):
        cols = "columns[%s]" % i
        params["%s%s" % (cols, "[data]")] = i
        params["%s%s" % (cols, "[name]")] = ""
        params["%s%s" % (cols, "[searchable]")] = "true"
        params["%s%s" % (cols, "[orderable]")] = "true"
        params["%s%s" % (cols, "[search][value]")] = ""
        params["%s%s" % (cols, "[search][regex]")] = "false"
    for i, item in enumerate(order or [{"column": 0, "dir": "asc"}]):
        for key, value in item.items():
            params["order[%s][%s]" % (i, key)] = str(value)
    return params
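For illustration, a small self-contained sketch of the flattened `columns[i][...]` / `order[i][...]` key scheme that DataTables sends and that these helpers reproduce. The `dt_params` function below is a simplified stand-in written for this example, not the test helper itself.

```python
# Simplified stand-in for create_dt_params showing the flattened
# request-parameter naming scheme used by DataTables server-side processing.
def dt_params(n_columns, search="", start=0, length=10):
    params = {
        "draw": "1",
        "start": str(start),
        "length": str(length),
        "search[value]": str(search),
        "search[regex]": "false",
    }
    for i in range(n_columns):
        prefix = "columns[%s]" % i
        params[prefix + "[data]"] = i
        params[prefix + "[name]"] = ""
        params[prefix + "[searchable]"] = "true"
        params[prefix + "[orderable]"] = "true"
        params[prefix + "[search][value]"] = ""
        params[prefix + "[search][regex]"] = "false"
    # Default ordering: first column, ascending
    params["order[0][column]"] = "0"
    params["order[0][dir]"] = "asc"
    return params

p = dt_params(2, search="john")
print(p["columns[1][data]"])  # -> 1
print(p["search[value]"])     # -> john
```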


# These methods are only used when the mData param is defined in the backend
def create_dt_params_with_mData(columns, search="", start=0, length=10, order=None):
    """Create DataTables input parameters for when the data source for the
    row data object/array is set.

    Read more about setting the column data source here:
    https://datatables.net/reference/option/columns.data
    """
    params = {
        "draw": "1",
        "start": str(start),
        "length": str(length),
        "search[value]": str(search),
        "search[regex]": "false",
    }
    for i, item in enumerate(columns):
        cols = "columns[%s]" % i
        params["%s%s" % (cols, "[data]")] = item.mData
        params["%s%s" % (cols, "[name]")] = ""
        params["%s%s" % (cols, "[searchable]")] = "true"
        params["%s%s" % (cols, "[orderable]")] = "true"
        params["%s%s" % (cols, "[search][value]")] = ""
        params["%s%s" % (cols, "[search][regex]")] = "false"
    for i, item in enumerate(order or [{"column": 0, "dir": "asc"}]):
        for key, value in item.items():
            params["order[%s][%s]" % (i, key)] = str(value)
    return params


def create_dt_params_with_mData_shuffled(
    columns, search="", start=0, length=10, order=None
):
    """Create DataTables input parameters for when the data source for the
    row data object/array is set and the column order in the frontend is not
    the same as in the backend.

    Read more about setting the column data source here:
    https://datatables.net/reference/option/columns.data
    """
    params = {
        "draw": "1",
        "start": str(start),
        "length": str(length),
        "search[value]": str(search),
        "search[regex]": "false",
    }
    # Shuffle the columns in place
    shuffle(columns)
    for i, item in enumerate(columns):
        cols = "columns[%s]" % i
        params["%s%s" % (cols, "[data]")] = item.mData
        params["%s%s" % (cols, "[name]")] = ""
        params["%s%s" % (cols, "[searchable]")] = "true"
        params["%s%s" % (cols, "[orderable]")] = "true"
        params["%s%s" % (cols, "[search][value]")] = ""
        params["%s%s" % (cols, "[search][regex]")] = "false"
    for i, item in enumerate(order or [{"column": 0, "dir": "asc"}]):
        for key, value in item.items():
            params["order[%s][%s]" % (i, key)] = str(value)
    return params


def create_dt_params_with_mData_with_extra_data(
    columns, search="", start=0, length=10, order=None
):
    """Create DataTables input parameters for when the data source for the
    row data object/array is set and an extra, frontend-only data source is
    defined that the backend does not know about.

    An example of this is here:
    https://editor.datatables.net/examples/bubble-editing/simple.html
    Read more about setting the column data source here:
    https://datatables.net/reference/option/columns.data
    """
    params = {
        "draw": "1",
        "start": str(start),
        "length": str(length),
        "search[value]": str(search),
        "search[regex]": "false",
    }
    # Add the extra params for the extra data source added in the frontend
    # but not in the backend.
    params["columns[0][name]"] = ""
    params["columns[0][searchable]"] = "true"
    params["columns[0][orderable]"] = "false"
    params["columns[0][search][value]"] = ""
    params["columns[0][search][regex]"] = "false"
    for i, item in enumerate(columns, 1):
        cols = "columns[%s]" % i
        params["%s%s" % (cols, "[data]")] = item.mData
        params["%s%s" % (cols, "[name]")] = ""
        params["%s%s" % (cols, "[searchable]")] = "true"
        params["%s%s" % (cols, "[orderable]")] = "true"
        params["%s%s" % (cols, "[search][value]")] = ""
        params["%s%s" % (cols, "[search][regex]")] = "false"
    for i, item in enumerate(order or [{"column": 1, "dir": "asc"}]):
        for key, value in item.items():
            params["order[%s][%s]" % (i, key)] = str(value)
    return params
| 36.992754 | 107 | 0.567875 | 665 | 5,105 | 4.332331 | 0.144361 | 0.019438 | 0.066644 | 0.099965 | 0.825755 | 0.825408 | 0.816383 | 0.816383 | 0.794863 | 0.780285 | 0 | 0.006734 | 0.243683 | 5,105 | 137 | 108 | 37.262774 | 0.739446 | 0.251322 | 0 | 0.802198 | 0 | 0 | 0.229888 | 0.024774 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043956 | false | 0 | 0.010989 | 0 | 0.098901 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f7ef0467c973aeb78d8d43ae91a2668f2b170669 | 23,389 | py | Python | tests/unit/test_eventfilters.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 2 | 2018-02-27T18:21:21.000Z | 2018-03-16T12:12:12.000Z | tests/unit/test_eventfilters.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 1 | 2018-06-25T07:14:41.000Z | 2018-06-25T07:14:41.000Z | tests/unit/test_eventfilters.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 1 | 2020-06-25T10:37:21.000Z | 2020-06-25T10:37:21.000Z |
import logging
import os
import time
import threading
import unittest
from unittest.mock import Mock, call

from performance.driver.core.eventfilters import EventFilter
from performance.driver.core.events import Event


class FooEvent(Event):
    def __init__(self, a=None, b=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.a = a
        self.b = b


class BarEvent(Event):
    def __init__(self, a=None, b=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.a = a
        self.b = b


class BazEvent(Event):
    def __init__(self, a=None, b=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.a = a
        self.b = b


class TestEventBus(unittest.TestCase):
    def test_any(self):
        """
        Test if the any ("*") selector is working
        """
        eventFilter = EventFilter("*")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should also be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # The BarEvent should also be handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
            call(barEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
            call(barEvent1),
        ])

    def test_or_operator(self):
        """
        Test if more than one event is properly selected
        """
        eventFilter = EventFilter("FooEvent BarEvent")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should also be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # The BarEvent should also be handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
            call(barEvent1),
        ])
        # The BazEvent should not be handled
        bazEvent1 = BazEvent(traceid=traceids)
        session.handle(bazEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
            call(barEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
            call(barEvent1),
        ])

    def test_simple(self):
        """
        Test if a simple selector is working
        """
        eventFilter = EventFilter("FooEvent")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should also be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])

    def test_flag_first(self):
        """
        Test if the ":first" flag is working
        """
        eventFilter = EventFilter("FooEvent:first")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should not be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])

    def test_flag_after(self):
        """
        Test if the ":after(1s)" flag is working
        """
        eventFilter = EventFilter("FooEvent:after(1s)")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled immediately
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent replaces the first, yet it is not handled
        # immediately either
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # Wait for a bit more than 1 second
        time.sleep(1.01)
        # The last foo event should be there now
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # Start a new session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled immediately
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # But it should appear at finalization, even though its time has not
        # come yet.
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
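The `:after(t)` behaviour exercised above can be pictured as a debounce: each matching event restarts a timer, the most recent event is delivered when the timer expires, and `finalize()` flushes any pending event immediately. The sketch below is an assumption about those semantics written for illustration, not the library's implementation.

```python
import threading

class AfterSession:
    """Debounce sketch: keep only the latest event, deliver it after a delay
    or immediately at finalize()."""

    def __init__(self, delay, callback):
        self.delay = delay
        self.callback = callback
        self.pending = None
        self.timer = None

    def handle(self, event):
        # Each new event replaces the pending one and restarts the timer
        self.pending = event
        if self.timer:
            self.timer.cancel()
        self.timer = threading.Timer(self.delay, self._flush)
        self.timer.start()

    def _flush(self):
        if self.pending is not None:
            self.callback(self.pending)
            self.pending = None

    def finalize(self):
        # Deliver whatever is pending, even if its delay has not elapsed
        if self.timer:
            self.timer.cancel()
        self._flush()

got = []
s = AfterSession(0.05, got.append)
s.handle("foo1")
s.handle("foo2")  # replaces foo1 before the timer fires
s.finalize()      # flushes immediately
print(got)        # -> ['foo2']
```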

    def test_flag_single(self):
        """
        Test if the ":single" flag is working
        """
        eventFilter = EventFilter("FooEvent:single")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should not be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # Create a new event filter once again
        eventFilter = EventFilter("FooEvent:single")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # Before any event is even fired, the callback should be fired
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # No more FooEvents should be handled
        fooEvent3 = FooEvent(traceid=traceids)
        session.handle(fooEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should not be handled either
        fooEvent4 = FooEvent(traceid=traceids)
        session.handle(fooEvent4)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])

    def test_flag_last(self):
        """
        Test if the ":last" flag is working
        """
        eventFilter = EventFilter("FooEvent:last")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled, but not visible yet
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should replace the first fooEvent, but not visible yet
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # When the session is finalized, the last FooEvent should be visible
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])

    def test_flag_nth(self):
        """
        Test if the ":nth(x)" flag is working
        """
        eventFilter = EventFilter("FooEvent:nth(2)")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The third FooEvent should not be handled
        fooEvent3 = FooEvent(traceid=traceids)
        session.handle(fooEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # When the session is finalized, nothing should be changed
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])

    def test_flag_nth_multi(self):
        """
        Test if the ":nth(x)" flag is correctly applied to the event being tested
        """
        eventFilter = EventFilter("FooEvent:nth(2) BarEvent:nth(3)")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The third FooEvent should not be handled
        fooEvent3 = FooEvent(traceid=traceids)
        session.handle(fooEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The first BarEvent should not be handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The second BarEvent should not be handled
        barEvent2 = BarEvent(traceid=traceids)
        session.handle(barEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The third BarEvent should be handled
        barEvent3 = BarEvent(traceid=traceids)
        session.handle(barEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
            call(barEvent3),
        ])
        # When the session is finalized, nothing should be changed
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
            call(barEvent3),
        ])

    def test_flag_nth_group(self):
        """
        Test if the ":nth(x,grp)" flag is correctly applied to the event being tested
        """
        eventFilter = EventFilter("FooEvent:nth(2,all) BarEvent:nth(2,all)")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The third FooEvent should not be handled
        fooEvent3 = FooEvent(traceid=traceids)
        session.handle(fooEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The first BarEvent should not be handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The second BarEvent should not be handled
        barEvent2 = BarEvent(traceid=traceids)
        session.handle(barEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The third BarEvent should not be handled
        barEvent3 = BarEvent(traceid=traceids)
        session.handle(barEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # When the session is finalized, nothing should be changed
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # Start a new session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The first BarEvent brings the shared counter to 2, so it should be
        # handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(barEvent1),
        ])
        # The second FooEvent should not be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(barEvent1),
        ])
        # The second BarEvent should not be handled
        barEvent2 = BarEvent(traceid=traceids)
        session.handle(barEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(barEvent1),
        ])
        # The third BarEvent should not be handled either
        barEvent3 = BarEvent(traceid=traceids)
        session.handle(barEvent3)
        self.assertEqual(eventCallback.mock_calls, [
            call(barEvent1),
        ])
        # When the session is finalized, nothing should be changed
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(barEvent1),
        ])
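The `:nth(n,grp)` assertions above imply that filters sharing a group name share one occurrence counter, so whichever matching event arrives n-th overall fires, and only once. A minimal sketch of that assumed behaviour (not the library's implementation):

```python
from collections import defaultdict

class NthGroupSession:
    """Sketch of a shared-group nth counter: filters in the same group
    share one count, and the callback fires exactly when it reaches n."""

    def __init__(self, n, callback):
        self.n = n
        self.counters = defaultdict(int)  # one counter per group name
        self.callback = callback

    def handle(self, group, event):
        self.counters[group] += 1
        if self.counters[group] == self.n:
            self.callback(event)

fired = []
s = NthGroupSession(2, fired.append)
s.handle("all", "FooEvent#1")  # counter for "all" -> 1, nothing fires
s.handle("all", "BarEvent#1")  # counter -> 2, BarEvent#1 fires
s.handle("all", "FooEvent#2")  # counter -> 3, nothing more fires
print(fired)                   # -> ['BarEvent#1']
```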

    def test_attrib_loose_regex(self):
        """
        Test if the regex "~=" attribute matcher is working
        """
        eventFilter = EventFilter("FooEvent[a~=u?lo+]")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(a="Helllll", traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(a="Heloooo", traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])

    def test_attrib_exact_regex(self):
        """
        Test if the regex "~==" attribute matcher is working
        """
        eventFilter = EventFilter("FooEvent[a~==^H.*?lo+]")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(a="Helllll", traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(a="Heloooo", traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
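The difference between the loose `~=` and exact `~==` matchers exercised in these two tests presumably maps onto a substring search versus a full-string match. A sketch of that assumed semantics using Python's `re` module:

```python
import re

# Assumed semantics (a sketch, not the library's matcher code):
# "~="  -> loose: the pattern may occur anywhere in the value
# "~==" -> exact: the pattern must cover the whole value
def loose(pattern, value):
    return bool(re.search(pattern, value))

def exact(pattern, value):
    return bool(re.fullmatch(pattern, value))

print(loose(r"u?lo+", "Heloooo"))     # -> True  ("looo" occurs in the value)
print(loose(r"u?lo+", "Helllll"))     # -> False (no "o" after any "l")
print(exact(r"^H.*?lo+", "Heloooo"))  # -> True  (the whole value matches)
print(exact(r"^H.*?lo+", "Helllll"))  # -> False (the value must end in "o"s)
```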

    def test_multi_attrib_and(self):
        """
        Test if multiple attributes work, when used to express an AND condition
        """
        eventFilter = EventFilter("FooEvent[a=He,b=Lo]")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should not be handled
        fooEvent1 = FooEvent(a="He", b="Zo", traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(a="He", b="Lo", traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent2),
        ])

    def test_multi_attrib_or(self):
        """
        Test if multiple attributes work, when used to express an OR condition
        """
        eventFilter = EventFilter("FooEvent[a=He] FooEvent[b=Lo]")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent matches the first selector, so it should be handled
        fooEvent1 = FooEvent(a="He", b="Zo", traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should be handled
        fooEvent2 = FooEvent(a="He", b="Lo", traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(fooEvent2),
        ])

    def test_or_first(self):
        """
        Test if the ":first" flag is working when two events are selected
        """
        eventFilter = EventFilter("FooEvent:first BarEvent")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should not be handled
        fooEvent2 = FooEvent(traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The BarEvent should be handled
        barEvent1 = BarEvent(traceid=traceids)
        session.handle(barEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(barEvent1),
        ])
        # The second BarEvent should also be handled
        barEvent2 = BarEvent(traceid=traceids)
        session.handle(barEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(barEvent1),
            call(barEvent2),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
            call(barEvent1),
            call(barEvent2),
        ])

    def test_attrib_expr(self):
        """
        Test if attribute expressions are accepted
        """
        eventFilter = EventFilter("FooEvent[a.'some'.'dict'=1]")
        # Start a session
        traceids = ['foobar']
        eventCallback = Mock()
        session = eventFilter.start(traceids, eventCallback)
        # The first FooEvent should be handled
        fooEvent1 = FooEvent(a={'some': {'dict': 1}}, b="Zo", traceid=traceids)
        session.handle(fooEvent1)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The second FooEvent should not be handled
        fooEvent2 = FooEvent(a={'some': {'other': 1}}, b="Zo", traceid=traceids)
        session.handle(fooEvent2)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # The BarEvent should not be handled
        barEvent = BarEvent(traceid=traceids)
        session.handle(barEvent)
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
        # No more events should be added when the session is finalized
        session.finalize()
        self.assertEqual(eventCallback.mock_calls, [
            call(fooEvent1),
        ])
| 27.712085 | 81 | 0.668306 | 2,510 | 23,389 | 6.168924 | 0.064542 | 0.116378 | 0.157324 | 0.179799 | 0.922049 | 0.909842 | 0.891113 | 0.889628 | 0.883105 | 0.875032 | 0 | 0.012319 | 0.232973 | 23,389 | 843 | 82 | 27.744958 | 0.85078 | 0.22083 | 0 | 0.910781 | 0 | 0 | 0.02814 | 0.002741 | 0 | 0 | 0 | 0 | 0.16171 | 1 | 0.035316 | false | 0 | 0.01487 | 0 | 0.057621 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |