hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
9924b487200dcd8dbae05e92be07fe130ec8fbcd | 5,163 | py | Python | libs/datagen/.ipynb_checkpoints/datagen_bb-checkpoint.py | vankhoa21991/selfsup | 34b21264db719265bb073441867d1a4f14836ce9 | [
"MIT"
] | 1 | 2020-09-10T13:29:41.000Z | 2020-09-10T13:29:41.000Z | libs/datagen/.ipynb_checkpoints/datagen_bb-checkpoint.py | vankhoa21991/selfsup | 34b21264db719265bb073441867d1a4f14836ce9 | [
"MIT"
] | null | null | null | libs/datagen/.ipynb_checkpoints/datagen_bb-checkpoint.py | vankhoa21991/selfsup | 34b21264db719265bb073441867d1a4f14836ce9 | [
"MIT"
] | null | null | null | import PIL
import glob
import random
import numpy as np
import torch
import torchvision
import torch.utils.data as data
import matplotlib.pyplot as plt
tensorToImage = torchvision.transforms.ToPILImage()
imageToTensor = torchvision.transforms.ToTensor()
class tiles_dataset_AE(data.Dataset):
"""Autoencoder dataset built from image file paths collected with glob.
Each item is the 512x512 image returned twice: (input, reconstruction target).
"""
def __init__(self, path, transform=None):
self.tile_paths = glob.glob(path + '/**/*.png')
self.transform = transform
# __getitem__ is the method the DataLoader calls to fetch one sample
def __getitem__(self, index):
img = PIL.Image.open(self.tile_paths[index])
img = img.resize((512, 512), PIL.Image.LANCZOS)  # the ANTIALIAS alias was removed in Pillow 10
if self.transform is not None:
img = self.transform(img)
return img, img
def __len__(self):
return len(self.tile_paths)
class tiles_dataset_Unet(data.Dataset):
"""Autoencoder dataset for the U-Net model, built from image file paths
collected with glob. Items are resized to 224x224 and returned as
(input, reconstruction target).
"""
def __init__(self, path, transform=None):
self.tile_paths = glob.glob(path + '/**/*.png')
self.transform = transform
# __getitem__ is the method the DataLoader calls to fetch one sample
def __getitem__(self, index):
img = PIL.Image.open(self.tile_paths[index])
img = img.resize((224, 224), PIL.Image.LANCZOS)  # the ANTIALIAS alias was removed in Pillow 10
if self.transform is not None:
img = self.transform(img)
return img, img
def __len__(self):
return len(self.tile_paths)
class tiles_dataset_rot(data.Dataset):
"""Rotation self-supervision dataset: each image is rotated by a random
multiple of 45 degrees and the index of that angle is the label.
"""
def __init__(self, path, transform=None):
self.tile_paths = glob.glob(path + '/**/*.png')
self.transform = transform
# __getitem__ is the method the DataLoader calls to fetch one sample
def __getitem__(self, index):
img = PIL.Image.open(self.tile_paths[index])
img = img.resize((512, 512), PIL.Image.LANCZOS)  # the ANTIALIAS alias was removed in Pillow 10
# 8 rotation classes (multiples of 45 degrees)
degrees = [0, 45, 90, 135, 180, 225, 270, 315]
rand_choice = random.randint(0, len(degrees) - 1)
npimg = np.asarray(img)
img = tensorToImage(npimg)
img = img.rotate(degrees[rand_choice])
if self.transform is not None:
img = self.transform(img)
return img, torch.tensor(rand_choice).long()
def show_batch(self, n=3):
fig, axs = plt.subplots(n, n)
fig.tight_layout()
for i in range(n):
for j in range(n):
rand_idx = random.randint(0, len(self) - 1)
img, label = self.__getitem__(rand_idx)
axs[i, j].imshow(tensorToImage(img), cmap='gray')
# this dataset only produces rotation labels (it has no classification attribute)
axs[i, j].set_title('Label: {0} ({1} Degrees)'.format(label.item(), label.item() * 45))
axs[i, j].axis('off')
def __len__(self):
return len(self.tile_paths)
class tiles_dataset_pt(data.Dataset):
"""Pretext rotation dataset (same behaviour as tiles_dataset_rot):
each image is rotated by a random multiple of 45 degrees.
"""
def __init__(self, path, transform=None):
self.tile_paths = glob.glob(path + '/**/*.png')
self.transform = transform
# __getitem__ is the method the DataLoader calls to fetch one sample
def __getitem__(self, index):
img = PIL.Image.open(self.tile_paths[index])
img = img.resize((512, 512), PIL.Image.LANCZOS)  # the ANTIALIAS alias was removed in Pillow 10
# 8 rotation classes (multiples of 45 degrees)
degrees = [0, 45, 90, 135, 180, 225, 270, 315]
rand_choice = random.randint(0, len(degrees) - 1)
npimg = np.asarray(img)
img = tensorToImage(npimg)
img = img.rotate(degrees[rand_choice])
if self.transform is not None:
img = self.transform(img)
return img, torch.tensor(rand_choice).long()
def show_batch(self, n=3):
fig, axs = plt.subplots(n, n)
fig.tight_layout()
for i in range(n):
for j in range(n):
rand_idx = random.randint(0, len(self) - 1)
img, label = self.__getitem__(rand_idx)
axs[i, j].imshow(tensorToImage(img), cmap='gray')
# this dataset only produces rotation labels (it has no classification attribute)
axs[i, j].set_title('Label: {0} ({1} Degrees)'.format(label.item(), label.item() * 45))
axs[i, j].axis('off')
def __len__(self):
return len(self.tile_paths) | 36.617021 | 107 | 0.620957 | 661 | 5,163 | 4.683812 | 0.186082 | 0.031008 | 0.050388 | 0.031008 | 0.92022 | 0.92022 | 0.92022 | 0.92022 | 0.92022 | 0.92022 | 0 | 0.023043 | 0.260314 | 5,163 | 141 | 108 | 36.617021 | 0.787641 | 0.197947 | 0 | 0.840426 | 0 | 0 | 0.035433 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148936 | false | 0 | 0.085106 | 0.042553 | 0.361702 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
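`show_batch` in the rotation datasets above prints the angle as `label.item() * 45`, which only works because `degrees` is a uniform 45-degree grid; a quick stdlib sketch of that convention:

```python
# Label/angle convention used by tiles_dataset_rot: the target is an
# index into an eight-entry 45-degree grid, so the angle can be
# reconstructed either by indexing or by multiplying the label by 45.
degrees = [0, 45, 90, 135, 180, 225, 270, 315]

def angle_from_label(label: int) -> int:
    """Recover the rotation angle from the class index."""
    return degrees[label]

# Because the grid is uniform, degrees[label] == label * 45 for every class.
angles = [angle_from_label(i) for i in range(len(degrees))]
```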
994c97ecc545b24312a75d59ae34fb6a0cf3d74f | 340 | py | Python | server/ebuttons.py | blackmambastudio/MiMo0.3-OS | 80d672f9315117c245878ecd578ffde8b623512b | [
"MIT"
] | null | null | null | server/ebuttons.py | blackmambastudio/MiMo0.3-OS | 80d672f9315117c245878ecd578ffde8b623512b | [
"MIT"
] | 4 | 2021-06-08T20:37:53.000Z | 2022-03-11T23:38:49.000Z | server/ebuttons.py | blackmambastudio/MiMo0.3-OS | 80d672f9315117c245878ecd578ffde8b623512b | [
"MIT"
] | null | null | null | import RPi.GPIO as GPIO
GPIO.setup(7, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(11, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(13, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(15, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(19, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(21, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
| 26.153846 | 51 | 0.776471 | 71 | 340 | 3.464789 | 0.225352 | 0.357724 | 0.243902 | 0.292683 | 0.841463 | 0.841463 | 0.841463 | 0.841463 | 0.731707 | 0.731707 | 0 | 0.035144 | 0.079412 | 340 | 12 | 52 | 28.333333 | 0.750799 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
995543d36399ee4a1f0e337bd62e4b2546df452f | 15,013 | py | Python | tests/plugins/customer/test_sync.py | maxipavlovic/connect-cli | 73989c076c6fb5b4562c61a351448b1c77556676 | [
"Apache-2.0"
] | 12 | 2020-10-10T10:53:16.000Z | 2022-02-16T10:15:56.000Z | tests/plugins/customer/test_sync.py | maxipavlovic/connect-cli | 73989c076c6fb5b4562c61a351448b1c77556676 | [
"Apache-2.0"
] | 37 | 2020-09-28T12:00:52.000Z | 2021-12-20T12:38:25.000Z | tests/plugins/customer/test_sync.py | maxipavlovic/connect-cli | 73989c076c6fb5b4562c61a351448b1c77556676 | [
"Apache-2.0"
] | 11 | 2020-11-04T18:17:01.000Z | 2022-02-23T08:18:07.000Z | from connect.cli.plugins.customer.sync import CustomerSynchronizer
from connect.client import ConnectClient
def get_client():
return ConnectClient(
api_key='ApiKey',
endpoint='https://localhost/public/v1',
use_specs=False,
)
def test_sync_all_skip(fs, customers_workbook):
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert skipped == 2
assert created == 0
assert updated == 0
assert len(errors) == 0
def test_bad_action(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'wrong'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Action wrong is not supported']}
def test_bad_account(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook['Customers']['H2'] = 'robot'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Customer type must be customer or reseller, not robot']}
def test_empty_address(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook['Customers']['K2'] = ''
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Address line 1, city, state and zip are mandatory']}
def test_create_existing(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Create action must not have account id, is set to TA-7374-0753-1907']}
def test_create_customer_no_parent(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook['Customers']['H2'] = 'customer'
customers_workbook['Customers']['A2'] = None
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Customers requires a parent account']}
def test_update_customer_no_id(fs, customers_workbook):
customers_workbook['Customers']['D2'] = 'update'
customers_workbook['Customers']['A2'] = 'KA-'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Update operation requires account ID to be set']}
def test_update_customer_no_account_connect(fs, customers_workbook, mocked_responses):
customers_workbook['Customers']['D2'] = 'update'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/tier/accounts/TA-7374-0753-1907',
status=404,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {2: ['Account with id TA-7374-0753-1907 does not exist']}
def test_create_account_connect(fs, customers_workbook, mocked_responses, mocked_reseller):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook['Customers']['A2'] = None
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/tier/accounts',
json=mocked_reseller,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert created == 1
def test_create_account_connect_uuid(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D2'] = 'create'
customers_workbook['Customers']['A2'] = None
customers_workbook['Customers']['C2'] = None
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/tier/accounts',
json=mocked_reseller,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert created == 1
def test_create_account_connect_parent_id(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/tier/accounts',
json=mocked_reseller,
)
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts/{mocked_reseller["id"]}',
json=mocked_reseller,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert created == 1
def test_create_account_connect_parent_id_not_found(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts/{mocked_reseller["id"]}',
status=404,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {3: [f'Parent with id {mocked_reseller["id"]} does not exist']}
def test_create_account_connect_parent_external_id(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_id'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_id,{mocked_reseller["id"]})&limit=0&offset=0',
json=[mocked_reseller],
headers={
'Content-Range': 'items 0-1/1',
},
)
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_id,{mocked_reseller["id"]})&limit=1&offset=0',
json=[mocked_reseller],
headers={
'Content-Range': 'items 0-1/1',
},
)
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/tier/accounts',
json=mocked_reseller,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert created == 1
def test_create_account_connect_parent_external_id_not_found(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_id'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_id,{mocked_reseller["id"]})&limit=0&offset=0',
json=[],
headers={
'Content-Range': 'items 0-0/0',
},
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {3: ['Parent with external_id TA-7374-0753-1907 not found']}
def test_create_account_connect_parent_external_id_more_than_one(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_id'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_id,{mocked_reseller["id"]})&limit=0&offset=0',
json=[],
headers={
'Content-Range': 'items 0-2/2',
},
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {3: ['More than one Parent with external_id TA-7374-0753-1907']}
def test_create_account_connect_parent_external_uid(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_uid'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_uid,{mocked_reseller["id"]})&limit=0&offset=0',
json=[mocked_reseller],
headers={
'Content-Range': 'items 0-1/1',
},
)
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_uid,{mocked_reseller["id"]})&limit=1&offset=0',
json=[mocked_reseller],
headers={
'Content-Range': 'items 0-1/1',
},
)
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/tier/accounts',
json=mocked_reseller,
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert created == 1
def test_create_account_connect_parent_external_uid_not_found(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_uid'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_uid,{mocked_reseller["id"]})&limit=0&offset=0',
json=[],
headers={
'Content-Range': 'items 0-0/0',
},
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {3: ['Parent with external_uid TA-7374-0753-1907 not found']}
def test_create_account_connect_parent_external_uid_more_than_one(
fs,
customers_workbook,
mocked_responses,
mocked_reseller,
):
customers_workbook['Customers']['D3'] = 'create'
customers_workbook['Customers']['A3'] = None
customers_workbook['Customers']['C3'] = None
customers_workbook['Customers']['F3'] = 'external_uid'
customers_workbook.save(f'{fs.root_path}/test.xlsx')
client = get_client()
mocked_responses.add(
method='GET',
url=f'https://localhost/public/v1/tier/accounts?eq(external_uid,{mocked_reseller["id"]})&limit=0&offset=0',
json=[],
headers={
'Content-Range': 'items 0-2/2',
},
)
synchronizer = CustomerSynchronizer(
account_id='VA-123',
client=client,
silent=True,
)
synchronizer.open(f'{fs.root_path}/test.xlsx', 'Customers')
skipped, created, updated, errors = synchronizer.sync()
assert errors == {3: ['More than one Parent with external_uid TA-7374-0753-1907']}
| 32.495671 | 115 | 0.648838 | 1,718 | 15,013 | 5.481956 | 0.074505 | 0.149819 | 0.149076 | 0.042047 | 0.929815 | 0.921215 | 0.921215 | 0.893608 | 0.874283 | 0.874283 | 0 | 0.022723 | 0.208553 | 15,013 | 461 | 116 | 32.566161 | 0.769904 | 0 | 0 | 0.77037 | 0 | 0.019753 | 0.259175 | 0.059082 | 0 | 0 | 0 | 0 | 0.051852 | 1 | 0.046914 | false | 0 | 0.004938 | 0.002469 | 0.054321 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
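The mocked endpoints in these tests follow Connect's RQL filter syntax, an `eq(field,value)` expression plus `limit`/`offset` paging. A small sketch of how the mocked addresses are shaped; `build_filter_url` is a hypothetical helper, not part of connect-cli:

```python
# Assemble a tier-accounts filter URL in the same shape as the mocks above.
def build_filter_url(base: str, field: str, value: str,
                     limit: int = 0, offset: int = 0) -> str:
    """Hypothetical helper mirroring the RQL filter URLs used in the mocks."""
    return f'{base}/tier/accounts?eq({field},{value})&limit={limit}&offset={offset}'

url = build_filter_url(
    'https://localhost/public/v1', 'external_id', 'TA-7374-0753-1907',
)
```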
41efc306fc874e08ad1cd0e808e78f2d21c47508 | 11,557 | py | Python | simpleredial/model/CompareInteractionModels/dual_bert_comp.py | gmftbyGMFTBY/SimpleReDial-v1 | f45b8eb23d1499ec617b4cc4f417d83d8f2b6bde | [
"MIT"
] | 36 | 2021-10-13T10:32:08.000Z | 2022-03-20T07:50:05.000Z | simpleredial/model/CompareInteractionModels/dual_bert_comp.py | gmftbyGMFTBY/SimpleReDial-v1 | f45b8eb23d1499ec617b4cc4f417d83d8f2b6bde | [
"MIT"
] | 3 | 2021-11-24T10:57:59.000Z | 2022-03-27T15:37:40.000Z | simpleredial/model/CompareInteractionModels/dual_bert_comp.py | gmftbyGMFTBY/SimpleReDial-v1 | f45b8eb23d1499ec617b4cc4f417d83d8f2b6bde | [
"MIT"
] | 1 | 2022-03-15T07:13:22.000Z | 2022-03-15T07:13:22.000Z | from model.utils import *
class BERTDualCompareEncoder(nn.Module):
def __init__(self, **args):
super(BERTDualCompareEncoder, self).__init__()
model = args['pretrained_model']
self.temp = args['temp']
self.comp_train_size = args['comp_train_size']
self.args = args
self.alpha = args['alpha']
p = args['dropout']
self.ctx_encoder = BertEmbedding(model=model, add_tokens=1)
self.can_encoder = BertEmbedding(model=model, add_tokens=1)
self.ctx_head = nn.Sequential(
nn.Linear(768, 768),
nn.Tanh(),
nn.Dropout(p=p),
nn.Linear(768, 768),
)
self.res_head = nn.Sequential(
nn.Linear(768, 768),
nn.Tanh(),
nn.Dropout(p=p),
nn.Linear(768, 768),
)
self.fusion_head = nn.Sequential(
nn.Linear(768*3, 768),
nn.Tanh(),
nn.Dropout(p=p),
nn.Linear(768, 1),
)
self.criterion = nn.BCEWithLogitsLoss()
def projection(self, cid_rep, rid_rep):
cid_rep += self.ctx_head(cid_rep)
rid_rep += self.res_head(rid_rep)
return cid_rep, rid_rep
def _encode(self, cid, rid, cid_mask, rid_mask):
cid_rep = self.ctx_encoder(cid, cid_mask)
rid_rep = self.can_encoder(rid, rid_mask)
return cid_rep, rid_rep
@torch.no_grad()
def get_cand(self, ids, attn_mask):
rid_rep = self.can_encoder(ids, attn_mask)
return rid_rep
@torch.no_grad()
def get_ctx(self, ids, attn_mask):
cid_rep = self.ctx_encoder(ids, attn_mask)
return cid_rep
@torch.no_grad()
def predict(self, batch):
cid = batch['ids']
cid_mask = torch.ones_like(cid)
rid = batch['rids']
rid_mask = batch['rids_mask']
bsz_c, bsz_r = len(cid), len(rid)
batch_size = rid.shape[0]
cid_rep, rid_rep = self._encode(cid, rid, cid_mask, rid_mask)
dot_product = torch.matmul(cid_rep, rid_rep.t()).squeeze(0)
cid_rep, rid_rep = self.projection(cid_rep, rid_rep)
# compare
tickets = []
reps = []
for i_r1 in range(bsz_r):
for i_r2 in range(bsz_r):
if i_r1 < i_r2:
tickets.append((i_r1, i_r2))
reps.append(
torch.cat([cid_rep[0], rid_rep[i_r1], rid_rep[i_r2]]) # [768*3]
)
reps = self.fusion_head(torch.stack(reps)).squeeze(1) # [K]
reps = torch.sigmoid(reps) # [K]
chain = {i: [] for i in range(bsz_r)}
for (i, j), l in zip(tickets, reps):
if l.item() >= 0.5 + self.args['threshold']:
chain[i].append(j)
elif l.item() < 0.5 - self.args['threshold']:
chain[j].append(i)
else:
# cannot decide, fall back to the dot-product scores
if dot_product[i] >= dot_product[j]:
chain[i].append(j)
else:
chain[j].append(i)
scores = self.generate_scores(chain)
return torch.tensor(scores)
def generate_one_batch_compare(self, cid_rep, rid_rep):
rep, label = [], []
bsz = len(cid_rep)
random_idx = torch.randperm(bsz)
rid_rep_ = rid_rep[random_idx]
for i in range(bsz):
if random.random() > 0.5:
# positive response on the left, shuffled negative on the right: label 1
rep.append(torch.cat([cid_rep[i], rid_rep[i], rid_rep_[i]]))
label.append(1)
else:
# positive on the right, negative on the left: label 0
rep.append(torch.cat([cid_rep[i], rid_rep_[i], rid_rep[i]]))
label.append(0)
return rep, label
def comp_cls(self, cid_rep, rid_rep):
# projection
cid_rep, rid_rep = self.projection(cid_rep, rid_rep)
bsz, label = len(cid_rep), []
# good vs. bad
rep, label = [], []
for i in range(self.comp_train_size):
rep_, label_ = self.generate_one_batch_compare(cid_rep, rid_rep)
rep.extend(rep_)
label.extend(label_)
rep = self.fusion_head(torch.stack(rep)).squeeze(1) # [B*K]
# random shuffle
random_idx = torch.randperm(len(rep))
rep = torch.stack([rep[i] for i in random_idx])
label = [label[i] for i in random_idx]
label = torch.tensor(label).cuda()
loss = self.criterion(rep, label.float())
# acc
acc = ((rep > 0).to(torch.float) == label).sum().item()  # rep holds logits, so 0 is the decision boundary
acc /= len(label)
return loss*self.alpha, acc
def forward(self, batch):
cid = batch['ids']
rid = batch['rids'] # [B, S] or [B*K, S]
cid_mask = batch['ids_mask']
rid_mask = batch['rids_mask']
cid_rep, rid_rep = self._encode(cid, rid, cid_mask, rid_mask)
dot_product = torch.matmul(cid_rep, rid_rep.t())
dot_product /= self.temp
batch_size = len(cid_rep)
# constrastive loss
mask = torch.zeros_like(dot_product)
mask[range(batch_size), range(batch_size)] = 1.
loss_ = F.log_softmax(dot_product, dim=-1) * mask
loss = (-loss_.sum(dim=1)).mean()
# compare loss
loss2, acc = self.comp_cls(cid_rep, rid_rep)
loss += loss2
return loss, acc
def generate_scores(self, edges):
# len(edges) equals the number of vertices
num = len(edges)
g = Graph(num)
for i, item_list in edges.items():
for j in item_list:
g.addEdge(i, j)
rest = g.topologicalSort()
scores = list(reversed(range(num)))
scores = [(i, j) for i, j in zip(rest, scores)]
scores = sorted(scores, key=lambda x:x[0])
scores = [j for i, j in scores]
return scores
class BERTDualCompareHNEncoder(nn.Module):
def __init__(self, **args):
super(BERTDualCompareHNEncoder, self).__init__()
model = args['pretrained_model']
self.comp_train_size = args['comp_train_size']
self.hard_comp_train_size = args['hard_comp_train_size']
self.args = args
p = args['dropout']
self.topk = args['gray_cand_num'] + 1
self.ctx_encoder = BertEmbedding(model=model, add_tokens=1)
self.can_encoder = BertEmbedding(model=model, add_tokens=1)
self.fusion_head = nn.Sequential(
nn.Linear(768*3, 768*2),
nn.Tanh(),
nn.Dropout(p=p),
nn.Linear(768*2, 768),
nn.Tanh(),
nn.Dropout(p=p),
nn.Linear(768, 1),
)
self.criterion = nn.BCEWithLogitsLoss()
def _encode(self, cid, rid, cid_mask, rid_mask):
cid_rep = self.ctx_encoder(cid, cid_mask)
rid_rep = self.can_encoder(rid, rid_mask)
return cid_rep, rid_rep
@torch.no_grad()
def predict(self, batch):
cid = batch['ids']
cid_mask = torch.ones_like(cid)
rid = batch['rids']
rid_mask = batch['rids_mask']
bsz_c, bsz_r = len(cid), len(rid)
batch_size = rid.shape[0]
cid_rep, rid_rep = self._encode(cid, rid, cid_mask, rid_mask)
# compare
tickets = []
reps = []
for i_r1 in range(bsz_r):
for i_r2 in range(bsz_r):
if i_r1 < i_r2:
tickets.append((i_r1, i_r2))
reps.append(
torch.cat([cid_rep[0], rid_rep[i_r1], rid_rep[i_r2]]) # [768*3]
)
reps = self.fusion_head(torch.stack(reps)).squeeze(1) # [K]
reps = torch.sigmoid(reps) # [K]
chain = {i: [] for i in range(bsz_r)}
for (i, j), l in zip(tickets, reps):
if l.item() >= 0.5 + self.args['threshold']:
chain[i].append(j)
elif l.item() < 0.5 - self.args['threshold']:
chain[j].append(i)
else:
# cannot decide
pass
scores = self.generate_scores(chain)
return torch.tensor(scores)
def generate_one_batch_compare_hard(self, cid_rep, rid_rep):
rep, label = [], []
bsz = len(cid_rep)
for i in range(bsz):
pos_i = self.topk * i
neg_i = random.choice([ii for ii in range(self.topk*i+1, self.topk*i+self.topk)])
            if random.random() > 0.5:
                # good candidate on the left, bad on the right -> label 1
                rep.append(torch.cat([cid_rep[i], rid_rep[pos_i], rid_rep[neg_i]]))
                label.append(1)
            else:
                # good candidate on the right, bad on the left -> label 0
                rep.append(torch.cat([cid_rep[i], rid_rep[neg_i], rid_rep[pos_i]]))
                label.append(0)
return rep, label
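The indexing in `generate_one_batch_compare_hard` assumes the candidate tensor is laid out in contiguous blocks of `topk` rows per context, with the positive in the first slot and hard negatives in the rest. A minimal sketch of that convention (the helper name `pick_hard_pair` is ours, not part of the model):

```python
import random

def pick_hard_pair(i, topk):
    """Return (positive index, hard-negative index) for context i,
    assuming candidates are stored as [pos, neg_1, ..., neg_{topk-1}]
    blocks of size topk per context."""
    pos_i = topk * i
    neg_i = random.choice(range(topk * i + 1, topk * i + topk))
    return pos_i, neg_i
```

For context `i=2` with `topk=4`, the positive sits at row 8 and the sampled hard negative at one of rows 9-11.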
def generate_one_batch_compare(self, cid_rep, rid_rep):
rep, label = [], []
bsz, bsz_r = len(cid_rep), len(rid_rep)
for i in range(bsz):
pos_i = self.topk * i
while True:
neg_i = random.choice(range(bsz_r))
if neg_i != pos_i:
break
            if random.random() > 0.5:
                # good candidate on the left, bad on the right -> label 1
                rep.append(torch.cat([cid_rep[i], rid_rep[pos_i], rid_rep[neg_i]]))
                label.append(1)
            else:
                # good candidate on the right, bad on the left -> label 0
                rep.append(torch.cat([cid_rep[i], rid_rep[neg_i], rid_rep[pos_i]]))
                label.append(0)
return rep, label
def comp_cls(self, cid_rep, rid_rep):
bsz, label = len(cid_rep), []
# good vs. bad
rep, label = [], []
for i in range(self.comp_train_size):
rep_, label_ = self.generate_one_batch_compare(cid_rep, rid_rep)
rep.extend(rep_)
label.extend(label_)
# good vs. hard bad
for i in range(self.hard_comp_train_size):
rep_, label_ = self.generate_one_batch_compare_hard(cid_rep, rid_rep)
rep.extend(rep_)
label.extend(label_)
rep = self.fusion_head(torch.stack(rep)).squeeze(1) # [B*K]
# random shuffle
random_idx = torch.randperm(len(rep))
rep = torch.stack([rep[i] for i in random_idx])
label = [label[i] for i in random_idx]
label = torch.tensor(label).cuda()
loss = self.criterion(rep, label.float())
# acc
        # fusion_head outputs raw logits (BCEWithLogitsLoss applies the
        # sigmoid internally), so threshold at 0, not 0.5
        acc = ((rep > 0).to(torch.float) == label).sum().item()
acc /= len(label)
return loss, acc
def forward(self, batch):
cid = batch['ids'] # [B, S]
rid = batch['rids'] # [B*K, S]
cid_mask = batch['ids_mask']
rid_mask = batch['rids_mask']
cid_rep, rid_rep = self._encode(cid, rid, cid_mask, rid_mask)
# compare loss
loss, acc = self.comp_cls(cid_rep, rid_rep)
return loss, acc
def generate_scores(self, edges):
        # len(edges) == the number of vertices
num = len(edges)
g = Graph(num)
for i, item_list in edges.items():
for j in item_list:
g.addEdge(i, j)
rest = g.topologicalSort()
scores = list(reversed(range(num)))
scores = [(i, j) for i, j in zip(rest, scores)]
scores = sorted(scores, key=lambda x:x[0])
scores = [j for i, j in scores]
return scores
print(123456)
import pandocfilters
# TODO
from .losses import MultiBoxLoss
from .losses import KeypointNetLoss
"""
wrapper classes for input/output of the environment
"""
from abc import ABC, abstractmethod
import tensorflow as tf
def load_env_wrapper(envid, dtype=tf.float32):
if envid == '3link' or envid == '3link_residual':
return ThreeLinkWrapper(dtype=dtype)
elif envid == '3link_rmp':
return ThreeLinkRMPWrapper(dtype=dtype)
elif envid == 'franka' or envid == 'franka_residual':
return FrankaWrapper(dtype=dtype)
elif envid == 'franka_rmp':
return FrankaRMPWrapper(dtype=dtype)
else:
raise ValueError
class EnvWrapper(ABC):
"""
wrapper class for input/output of the environment
"""
@abstractmethod
def obs_to_policy_input(self, obs):
"""
params obs: environment observation
return policy_input: input to the policy
"""
@abstractmethod
def obs_to_value_input(self, obs):
"""
params obs: environment observation
return value_input: input to the value function
"""
@abstractmethod
def policy_output_to_action(self, actions):
"""
params actions: output of the policy
return env_actions: action to the environment
"""
class ThreeLinkWrapper(EnvWrapper):
def __init__(self, dtype=tf.float32):
self.dtype = dtype
def obs_to_policy_input(self, obs):
return tf.cast(obs, self.dtype)
def obs_to_value_input(self, obs):
return tf.cast(obs, self.dtype)
@staticmethod
def policy_output_to_action(actions):
return actions
class ThreeLinkRMPWrapper(EnvWrapper):
def __init__(self, dtype='float32'):
self.dtype = dtype
def obs_to_policy_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
num_obstacles = int((obs_dim - 17) / 5)
x = tf.gather(obs, range(obs_dim - 4, obs_dim - 2), axis=1)
xd = tf.gather(obs, range(obs_dim - 2, obs_dim), axis=1)
goal = tf.gather(obs, range(obs_dim - 6, obs_dim - 4), axis=1)
obstacles = tf.gather(obs, range(obs_dim - 6 - 3 * num_obstacles, obs_dim - 6), axis=1)
return {'x': x, 'xd': xd, 'goal': goal, 'obstacles': obstacles}
def obs_to_value_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
obs_to_value = tf.gather(obs, range(0, obs_dim - 6), axis=1)
return obs_to_value
@staticmethod
def policy_output_to_action(actions):
return actions
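The slicing in `ThreeLinkRMPWrapper.obs_to_policy_input` implies a fixed observation layout: obstacles, then goal, then end-effector position `x` and velocity `xd` at the tail. A pure-Python analogue for a single unbatched observation, illustrating the same index arithmetic (the function name `split_three_link_obs` is ours, not part of the module):

```python
def split_three_link_obs(obs, num_obstacles):
    """Mirror ThreeLinkRMPWrapper.obs_to_policy_input for one
    flat observation vector, using the same index arithmetic."""
    obs_dim = len(obs)
    # the wrapper infers the obstacle count as (obs_dim - 17) / 5
    assert num_obstacles == (obs_dim - 17) // 5
    return {
        "x": obs[obs_dim - 4:obs_dim - 2],          # end-effector position
        "xd": obs[obs_dim - 2:obs_dim],             # end-effector velocity
        "goal": obs[obs_dim - 6:obs_dim - 4],       # goal position
        "obstacles": obs[obs_dim - 6 - 3 * num_obstacles:obs_dim - 6],
    }
```

With `obs_dim = 27` (two obstacles), `x` is `obs[23:25]`, `xd` is `obs[25:27]`, `goal` is `obs[21:23]`, and the obstacle block is `obs[15:21]`.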
class ThreeLinkFullRMPWrapper(EnvWrapper):
def __init__(self, dtype='float32'):
self.dtype = dtype
def obs_to_policy_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
num_obstacles = int((obs_dim - 11) / 5)
sin_q = tf.gather(obs, range(0, 3), axis=1)
cos_q = tf.gather(obs, range(3, 6), axis=1)
q = tf.atan2(sin_q, cos_q)
qd = tf.gather(obs, range(6, 9), axis=1)
obstacles = tf.gather(obs, range(obs_dim - 3 * num_obstacles, obs_dim), axis=1)
return {'q': q, 'qd': qd, 'goal': None, 'obstacles': obstacles}
def obs_to_value_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
obs_to_value = tf.gather(obs, range(0, obs_dim - 6), axis=1)
return obs_to_value
@staticmethod
def policy_output_to_action(actions):
return actions
class FrankaWrapper(EnvWrapper):
def __init__(self, dtype=tf.float32):
self.dtype = dtype
def obs_to_policy_input(self, obs):
return tf.cast(obs, self.dtype)
def obs_to_value_input(self, obs):
return tf.cast(obs, self.dtype)
@staticmethod
def policy_output_to_action(actions):
return actions
class FrankaRMPWrapper(EnvWrapper):
def __init__(self, dtype='float32'):
self.dtype = dtype
def obs_to_policy_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
num_obstacles = int((obs_dim - 33) / 7)
x = tf.gather(obs, range(obs_dim - 6, obs_dim - 3), axis=1)
xd = tf.gather(obs, range(obs_dim - 3, obs_dim), axis=1)
goal = tf.gather(obs, range(obs_dim - 9, obs_dim - 6), axis=1)
obstacles = tf.gather(obs, range(obs_dim - 9 - 4 * num_obstacles, obs_dim - 9), axis=1)
return {'x': x, 'xd': xd, 'goal': goal, 'obstacles': obstacles}
def obs_to_value_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
obs_to_value = tf.gather(obs, range(0, obs_dim - 9), axis=1)
return obs_to_value
@staticmethod
def policy_output_to_action(actions):
return actions
class FrankaFullRMPWrapper(EnvWrapper):
def __init__(self, dtype='float32'):
self.dtype = dtype
def obs_to_policy_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
num_obstacles = int((obs_dim - 24) / 7)
sin_q = tf.gather(obs, range(0, 7), axis=1)
cos_q = tf.gather(obs, range(7, 14), axis=1)
q = tf.atan2(sin_q, cos_q)
qd = tf.gather(obs, range(14, 21), axis=1)
obstacles = tf.gather(obs, range(obs_dim - 4 * num_obstacles, obs_dim), axis=1)
return {'q': q, 'qd': qd, 'goal': None, 'obstacles': obstacles}
def obs_to_value_input(self, obs):
obs = tf.cast(obs, self.dtype)
batch_size, obs_dim = obs.shape
batch_size, obs_dim = int(batch_size), int(obs_dim)
obs_to_value = tf.gather(obs, range(0, obs_dim - 9), axis=1)
return obs_to_value
@staticmethod
def policy_output_to_action(actions):
        return actions
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['SessionArgs', 'Session']
@pulumi.input_type
class SessionArgs:
def __init__(__self__, *,
bastion_id: pulumi.Input[str],
key_details: pulumi.Input['SessionKeyDetailsArgs'],
target_resource_details: pulumi.Input['SessionTargetResourceDetailsArgs'],
display_name: Optional[pulumi.Input[str]] = None,
key_type: Optional[pulumi.Input[str]] = None,
session_ttl_in_seconds: Optional[pulumi.Input[int]] = None):
"""
The set of arguments for constructing a Session resource.
:param pulumi.Input[str] bastion_id: The unique identifier (OCID) of the bastion on which to create this session.
:param pulumi.Input['SessionKeyDetailsArgs'] key_details: Public key details for a bastion session.
:param pulumi.Input['SessionTargetResourceDetailsArgs'] target_resource_details: Details about a bastion session's target resource.
:param pulumi.Input[str] display_name: (Updatable) The name of the session.
:param pulumi.Input[str] key_type: The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
:param pulumi.Input[int] session_ttl_in_seconds: The amount of time the session can remain active.
"""
pulumi.set(__self__, "bastion_id", bastion_id)
pulumi.set(__self__, "key_details", key_details)
pulumi.set(__self__, "target_resource_details", target_resource_details)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if key_type is not None:
pulumi.set(__self__, "key_type", key_type)
if session_ttl_in_seconds is not None:
pulumi.set(__self__, "session_ttl_in_seconds", session_ttl_in_seconds)
@property
@pulumi.getter(name="bastionId")
def bastion_id(self) -> pulumi.Input[str]:
"""
The unique identifier (OCID) of the bastion on which to create this session.
"""
return pulumi.get(self, "bastion_id")
@bastion_id.setter
def bastion_id(self, value: pulumi.Input[str]):
pulumi.set(self, "bastion_id", value)
@property
@pulumi.getter(name="keyDetails")
def key_details(self) -> pulumi.Input['SessionKeyDetailsArgs']:
"""
Public key details for a bastion session.
"""
return pulumi.get(self, "key_details")
@key_details.setter
def key_details(self, value: pulumi.Input['SessionKeyDetailsArgs']):
pulumi.set(self, "key_details", value)
@property
@pulumi.getter(name="targetResourceDetails")
def target_resource_details(self) -> pulumi.Input['SessionTargetResourceDetailsArgs']:
"""
Details about a bastion session's target resource.
"""
return pulumi.get(self, "target_resource_details")
@target_resource_details.setter
def target_resource_details(self, value: pulumi.Input['SessionTargetResourceDetailsArgs']):
pulumi.set(self, "target_resource_details", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The name of the session.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter(name="keyType")
def key_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
"""
return pulumi.get(self, "key_type")
@key_type.setter
def key_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_type", value)
@property
@pulumi.getter(name="sessionTtlInSeconds")
def session_ttl_in_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The amount of time the session can remain active.
"""
return pulumi.get(self, "session_ttl_in_seconds")
@session_ttl_in_seconds.setter
def session_ttl_in_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "session_ttl_in_seconds", value)
@pulumi.input_type
class _SessionState:
def __init__(__self__, *,
bastion_id: Optional[pulumi.Input[str]] = None,
bastion_name: Optional[pulumi.Input[str]] = None,
bastion_public_host_key_info: Optional[pulumi.Input[str]] = None,
bastion_user_name: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
key_details: Optional[pulumi.Input['SessionKeyDetailsArgs']] = None,
key_type: Optional[pulumi.Input[str]] = None,
lifecycle_details: Optional[pulumi.Input[str]] = None,
session_ttl_in_seconds: Optional[pulumi.Input[int]] = None,
ssh_metadata: Optional[pulumi.Input[Mapping[str, Any]]] = None,
state: Optional[pulumi.Input[str]] = None,
target_resource_details: Optional[pulumi.Input['SessionTargetResourceDetailsArgs']] = None,
time_created: Optional[pulumi.Input[str]] = None,
time_updated: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Session resources.
:param pulumi.Input[str] bastion_id: The unique identifier (OCID) of the bastion on which to create this session.
:param pulumi.Input[str] bastion_name: The name of the bastion that is hosting this session.
:param pulumi.Input[str] bastion_public_host_key_info: The public key of the bastion host. You can use this to verify that you're connecting to the correct bastion.
:param pulumi.Input[str] bastion_user_name: The username that the session uses to connect to the target resource.
:param pulumi.Input[str] display_name: (Updatable) The name of the session.
:param pulumi.Input['SessionKeyDetailsArgs'] key_details: Public key details for a bastion session.
:param pulumi.Input[str] key_type: The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
:param pulumi.Input[str] lifecycle_details: A message describing the current session state in more detail.
:param pulumi.Input[int] session_ttl_in_seconds: The amount of time the session can remain active.
:param pulumi.Input[Mapping[str, Any]] ssh_metadata: The connection message for the session.
:param pulumi.Input[str] state: The current state of the session.
:param pulumi.Input['SessionTargetResourceDetailsArgs'] target_resource_details: Details about a bastion session's target resource.
:param pulumi.Input[str] time_created: The time the session was created. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
:param pulumi.Input[str] time_updated: The time the session was updated. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
if bastion_id is not None:
pulumi.set(__self__, "bastion_id", bastion_id)
if bastion_name is not None:
pulumi.set(__self__, "bastion_name", bastion_name)
if bastion_public_host_key_info is not None:
pulumi.set(__self__, "bastion_public_host_key_info", bastion_public_host_key_info)
if bastion_user_name is not None:
pulumi.set(__self__, "bastion_user_name", bastion_user_name)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if key_details is not None:
pulumi.set(__self__, "key_details", key_details)
if key_type is not None:
pulumi.set(__self__, "key_type", key_type)
if lifecycle_details is not None:
pulumi.set(__self__, "lifecycle_details", lifecycle_details)
if session_ttl_in_seconds is not None:
pulumi.set(__self__, "session_ttl_in_seconds", session_ttl_in_seconds)
if ssh_metadata is not None:
pulumi.set(__self__, "ssh_metadata", ssh_metadata)
if state is not None:
pulumi.set(__self__, "state", state)
if target_resource_details is not None:
pulumi.set(__self__, "target_resource_details", target_resource_details)
if time_created is not None:
pulumi.set(__self__, "time_created", time_created)
if time_updated is not None:
pulumi.set(__self__, "time_updated", time_updated)
@property
@pulumi.getter(name="bastionId")
def bastion_id(self) -> Optional[pulumi.Input[str]]:
"""
The unique identifier (OCID) of the bastion on which to create this session.
"""
return pulumi.get(self, "bastion_id")
@bastion_id.setter
def bastion_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "bastion_id", value)
@property
@pulumi.getter(name="bastionName")
def bastion_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the bastion that is hosting this session.
"""
return pulumi.get(self, "bastion_name")
@bastion_name.setter
def bastion_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "bastion_name", value)
@property
@pulumi.getter(name="bastionPublicHostKeyInfo")
def bastion_public_host_key_info(self) -> Optional[pulumi.Input[str]]:
"""
The public key of the bastion host. You can use this to verify that you're connecting to the correct bastion.
"""
return pulumi.get(self, "bastion_public_host_key_info")
@bastion_public_host_key_info.setter
def bastion_public_host_key_info(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "bastion_public_host_key_info", value)
@property
@pulumi.getter(name="bastionUserName")
def bastion_user_name(self) -> Optional[pulumi.Input[str]]:
"""
The username that the session uses to connect to the target resource.
"""
return pulumi.get(self, "bastion_user_name")
@bastion_user_name.setter
def bastion_user_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "bastion_user_name", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The name of the session.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter(name="keyDetails")
def key_details(self) -> Optional[pulumi.Input['SessionKeyDetailsArgs']]:
"""
Public key details for a bastion session.
"""
return pulumi.get(self, "key_details")
@key_details.setter
def key_details(self, value: Optional[pulumi.Input['SessionKeyDetailsArgs']]):
pulumi.set(self, "key_details", value)
@property
@pulumi.getter(name="keyType")
def key_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
"""
return pulumi.get(self, "key_type")
@key_type.setter
def key_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_type", value)
@property
@pulumi.getter(name="lifecycleDetails")
def lifecycle_details(self) -> Optional[pulumi.Input[str]]:
"""
A message describing the current session state in more detail.
"""
return pulumi.get(self, "lifecycle_details")
@lifecycle_details.setter
def lifecycle_details(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "lifecycle_details", value)
@property
@pulumi.getter(name="sessionTtlInSeconds")
def session_ttl_in_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The amount of time the session can remain active.
"""
return pulumi.get(self, "session_ttl_in_seconds")
@session_ttl_in_seconds.setter
def session_ttl_in_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "session_ttl_in_seconds", value)
@property
@pulumi.getter(name="sshMetadata")
def ssh_metadata(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The connection message for the session.
"""
return pulumi.get(self, "ssh_metadata")
@ssh_metadata.setter
def ssh_metadata(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "ssh_metadata", value)
@property
@pulumi.getter
def state(self) -> Optional[pulumi.Input[str]]:
"""
The current state of the session.
"""
return pulumi.get(self, "state")
@state.setter
def state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "state", value)
@property
@pulumi.getter(name="targetResourceDetails")
def target_resource_details(self) -> Optional[pulumi.Input['SessionTargetResourceDetailsArgs']]:
"""
Details about a bastion session's target resource.
"""
return pulumi.get(self, "target_resource_details")
@target_resource_details.setter
def target_resource_details(self, value: Optional[pulumi.Input['SessionTargetResourceDetailsArgs']]):
pulumi.set(self, "target_resource_details", value)
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> Optional[pulumi.Input[str]]:
"""
The time the session was created. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_created")
@time_created.setter
def time_created(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_created", value)
@property
@pulumi.getter(name="timeUpdated")
def time_updated(self) -> Optional[pulumi.Input[str]]:
"""
The time the session was updated. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_updated")
@time_updated.setter
def time_updated(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_updated", value)
class Session(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
bastion_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
key_details: Optional[pulumi.Input[pulumi.InputType['SessionKeyDetailsArgs']]] = None,
key_type: Optional[pulumi.Input[str]] = None,
session_ttl_in_seconds: Optional[pulumi.Input[int]] = None,
target_resource_details: Optional[pulumi.Input[pulumi.InputType['SessionTargetResourceDetailsArgs']]] = None,
__props__=None):
"""
This resource provides the Session resource in Oracle Cloud Infrastructure Bastion service.
Creates a new session in a bastion. A bastion session lets authorized users connect to a target resource for a predetermined amount of time. The Bastion service recognizes two types of sessions, managed SSH sessions and SSH port forwarding sessions. Managed SSH sessions require that the target resource has an OpenSSH server and the Oracle Cloud Agent both running.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_session = oci.bastion.Session("testSession",
bastion_id=oci_bastion_bastion["test_bastion"]["id"],
key_details=oci.bastion.SessionKeyDetailsArgs(
public_key_content=var["session_key_details_public_key_content"],
),
target_resource_details=oci.bastion.SessionTargetResourceDetailsArgs(
session_type=var["session_target_resource_details_session_type"],
target_resource_id=oci_bastion_target_resource["test_target_resource"]["id"],
target_resource_operating_system_user_name=oci_identity_user["test_user"]["name"],
target_resource_port=var["session_target_resource_details_target_resource_port"],
target_resource_private_ip_address=var["session_target_resource_details_target_resource_private_ip_address"],
),
display_name=var["session_display_name"],
key_type=var["session_key_type"],
session_ttl_in_seconds=var["session_session_ttl_in_seconds"])
```
## Import
Sessions can be imported using the `id`, e.g.
```sh
$ pulumi import oci:bastion/session:Session test_session "id"
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] bastion_id: The unique identifier (OCID) of the bastion on which to create this session.
:param pulumi.Input[str] display_name: (Updatable) The name of the session.
:param pulumi.Input[pulumi.InputType['SessionKeyDetailsArgs']] key_details: Public key details for a bastion session.
:param pulumi.Input[str] key_type: The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
:param pulumi.Input[int] session_ttl_in_seconds: The amount of time the session can remain active.
:param pulumi.Input[pulumi.InputType['SessionTargetResourceDetailsArgs']] target_resource_details: Details about a bastion session's target resource.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: SessionArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This resource provides the Session resource in Oracle Cloud Infrastructure Bastion service.
Creates a new session in a bastion. A bastion session lets authorized users connect to a target resource for a predetermined amount of time. The Bastion service recognizes two types of sessions, managed SSH sessions and SSH port forwarding sessions. Managed SSH sessions require that the target resource has an OpenSSH server and the Oracle Cloud Agent both running.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_session = oci.bastion.Session("testSession",
bastion_id=oci_bastion_bastion["test_bastion"]["id"],
key_details=oci.bastion.SessionKeyDetailsArgs(
public_key_content=var["session_key_details_public_key_content"],
),
target_resource_details=oci.bastion.SessionTargetResourceDetailsArgs(
session_type=var["session_target_resource_details_session_type"],
target_resource_id=oci_bastion_target_resource["test_target_resource"]["id"],
target_resource_operating_system_user_name=oci_identity_user["test_user"]["name"],
target_resource_port=var["session_target_resource_details_target_resource_port"],
target_resource_private_ip_address=var["session_target_resource_details_target_resource_private_ip_address"],
),
display_name=var["session_display_name"],
key_type=var["session_key_type"],
session_ttl_in_seconds=var["session_session_ttl_in_seconds"])
```
## Import
Sessions can be imported using the `id`, e.g.
```sh
$ pulumi import oci:bastion/session:Session test_session "id"
```
:param str resource_name: The name of the resource.
:param SessionArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(SessionArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
bastion_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
key_details: Optional[pulumi.Input[pulumi.InputType['SessionKeyDetailsArgs']]] = None,
key_type: Optional[pulumi.Input[str]] = None,
session_ttl_in_seconds: Optional[pulumi.Input[int]] = None,
target_resource_details: Optional[pulumi.Input[pulumi.InputType['SessionTargetResourceDetailsArgs']]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = SessionArgs.__new__(SessionArgs)
if bastion_id is None and not opts.urn:
raise TypeError("Missing required property 'bastion_id'")
__props__.__dict__["bastion_id"] = bastion_id
__props__.__dict__["display_name"] = display_name
if key_details is None and not opts.urn:
raise TypeError("Missing required property 'key_details'")
__props__.__dict__["key_details"] = key_details
__props__.__dict__["key_type"] = key_type
__props__.__dict__["session_ttl_in_seconds"] = session_ttl_in_seconds
if target_resource_details is None and not opts.urn:
raise TypeError("Missing required property 'target_resource_details'")
__props__.__dict__["target_resource_details"] = target_resource_details
__props__.__dict__["bastion_name"] = None
__props__.__dict__["bastion_public_host_key_info"] = None
__props__.__dict__["bastion_user_name"] = None
__props__.__dict__["lifecycle_details"] = None
__props__.__dict__["ssh_metadata"] = None
__props__.__dict__["state"] = None
__props__.__dict__["time_created"] = None
__props__.__dict__["time_updated"] = None
super(Session, __self__).__init__(
'oci:bastion/session:Session',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
bastion_id: Optional[pulumi.Input[str]] = None,
bastion_name: Optional[pulumi.Input[str]] = None,
bastion_public_host_key_info: Optional[pulumi.Input[str]] = None,
bastion_user_name: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
key_details: Optional[pulumi.Input[pulumi.InputType['SessionKeyDetailsArgs']]] = None,
key_type: Optional[pulumi.Input[str]] = None,
lifecycle_details: Optional[pulumi.Input[str]] = None,
session_ttl_in_seconds: Optional[pulumi.Input[int]] = None,
ssh_metadata: Optional[pulumi.Input[Mapping[str, Any]]] = None,
state: Optional[pulumi.Input[str]] = None,
target_resource_details: Optional[pulumi.Input[pulumi.InputType['SessionTargetResourceDetailsArgs']]] = None,
time_created: Optional[pulumi.Input[str]] = None,
time_updated: Optional[pulumi.Input[str]] = None) -> 'Session':
"""
Get an existing Session resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] bastion_id: The unique identifier (OCID) of the bastion on which to create this session.
:param pulumi.Input[str] bastion_name: The name of the bastion that is hosting this session.
:param pulumi.Input[str] bastion_public_host_key_info: The public key of the bastion host. You can use this to verify that you're connecting to the correct bastion.
:param pulumi.Input[str] bastion_user_name: The username that the session uses to connect to the target resource.
:param pulumi.Input[str] display_name: (Updatable) The name of the session.
:param pulumi.Input[pulumi.InputType['SessionKeyDetailsArgs']] key_details: Public key details for a bastion session.
:param pulumi.Input[str] key_type: The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
:param pulumi.Input[str] lifecycle_details: A message describing the current session state in more detail.
:param pulumi.Input[int] session_ttl_in_seconds: The amount of time the session can remain active.
:param pulumi.Input[Mapping[str, Any]] ssh_metadata: The connection message for the session.
:param pulumi.Input[str] state: The current state of the session.
:param pulumi.Input[pulumi.InputType['SessionTargetResourceDetailsArgs']] target_resource_details: Details about a bastion session's target resource.
:param pulumi.Input[str] time_created: The time the session was created. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
:param pulumi.Input[str] time_updated: The time the session was updated. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _SessionState.__new__(_SessionState)
__props__.__dict__["bastion_id"] = bastion_id
__props__.__dict__["bastion_name"] = bastion_name
__props__.__dict__["bastion_public_host_key_info"] = bastion_public_host_key_info
__props__.__dict__["bastion_user_name"] = bastion_user_name
__props__.__dict__["display_name"] = display_name
__props__.__dict__["key_details"] = key_details
__props__.__dict__["key_type"] = key_type
__props__.__dict__["lifecycle_details"] = lifecycle_details
__props__.__dict__["session_ttl_in_seconds"] = session_ttl_in_seconds
__props__.__dict__["ssh_metadata"] = ssh_metadata
__props__.__dict__["state"] = state
__props__.__dict__["target_resource_details"] = target_resource_details
__props__.__dict__["time_created"] = time_created
__props__.__dict__["time_updated"] = time_updated
return Session(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="bastionId")
def bastion_id(self) -> pulumi.Output[str]:
"""
The unique identifier (OCID) of the bastion on which to create this session.
"""
return pulumi.get(self, "bastion_id")
@property
@pulumi.getter(name="bastionName")
def bastion_name(self) -> pulumi.Output[str]:
"""
The name of the bastion that is hosting this session.
"""
return pulumi.get(self, "bastion_name")
@property
@pulumi.getter(name="bastionPublicHostKeyInfo")
def bastion_public_host_key_info(self) -> pulumi.Output[str]:
"""
The public key of the bastion host. You can use this to verify that you're connecting to the correct bastion.
"""
return pulumi.get(self, "bastion_public_host_key_info")
@property
@pulumi.getter(name="bastionUserName")
def bastion_user_name(self) -> pulumi.Output[str]:
"""
The username that the session uses to connect to the target resource.
"""
return pulumi.get(self, "bastion_user_name")
@property
@pulumi.getter(name="displayName")
def display_name(self) -> pulumi.Output[str]:
"""
(Updatable) The name of the session.
"""
return pulumi.get(self, "display_name")
@property
@pulumi.getter(name="keyDetails")
def key_details(self) -> pulumi.Output['outputs.SessionKeyDetails']:
"""
Public key details for a bastion session.
"""
return pulumi.get(self, "key_details")
@property
@pulumi.getter(name="keyType")
def key_type(self) -> pulumi.Output[str]:
"""
The type of the key used to connect to the session. PUB is a standard public key in OpenSSH format.
"""
return pulumi.get(self, "key_type")
@property
@pulumi.getter(name="lifecycleDetails")
def lifecycle_details(self) -> pulumi.Output[str]:
"""
A message describing the current session state in more detail.
"""
return pulumi.get(self, "lifecycle_details")
@property
@pulumi.getter(name="sessionTtlInSeconds")
def session_ttl_in_seconds(self) -> pulumi.Output[int]:
"""
The amount of time the session can remain active.
"""
return pulumi.get(self, "session_ttl_in_seconds")
@property
@pulumi.getter(name="sshMetadata")
def ssh_metadata(self) -> pulumi.Output[Mapping[str, Any]]:
"""
The connection message for the session.
"""
return pulumi.get(self, "ssh_metadata")
@property
@pulumi.getter
def state(self) -> pulumi.Output[str]:
"""
The current state of the session.
"""
return pulumi.get(self, "state")
@property
@pulumi.getter(name="targetResourceDetails")
def target_resource_details(self) -> pulumi.Output['outputs.SessionTargetResourceDetails']:
"""
Details about a bastion session's target resource.
"""
return pulumi.get(self, "target_resource_details")
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> pulumi.Output[str]:
"""
The time the session was created. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_created")
@property
@pulumi.getter(name="timeUpdated")
def time_updated(self) -> pulumi.Output[str]:
"""
The time the session was updated. Format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339). Example: `2020-01-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_updated")
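The `time_created` and `time_updated` docstrings above state that timestamps follow RFC3339 (e.g. `2020-01-25T21:10:29.600Z`). As a reference point, such strings can be parsed with the standard library alone; this helper is illustrative and not part of the generated SDK:

```python
from datetime import datetime

def parse_rfc3339(ts: str) -> datetime:
    """Parse an RFC3339 timestamp such as '2020-01-25T21:10:29.600Z'.

    datetime.fromisoformat() on Python < 3.11 rejects the trailing 'Z',
    so rewrite it as an explicit '+00:00' UTC offset first.
    """
    if ts.endswith("Z"):
        ts = ts[:-1] + "+00:00"
    return datetime.fromisoformat(ts)

print(parse_rfc3339("2020-01-25T21:10:29.600Z").isoformat())
# 2020-01-25T21:10:29.600000+00:00
```

On Python 3.11+ `fromisoformat` accepts the trailing `Z` directly, so the rewrite step is only needed on older interpreters.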
# File: dashboard/models.py (repo: NASA-IMPACT/CDI-Dashboard, license: MIT)
from django.db import models
class Masterlist(models.Model):
cdi_id = models.AutoField(primary_key=True)
name = models.CharField(max_length=500)
title = models.CharField(max_length=500)
organization = models.CharField(max_length=50)
catalog_url = models.URLField(max_length=500)
api_url = models.URLField(max_length=500)
cdi_themes = models.CharField(max_length=500, null=True)
metadata_type = models.CharField(max_length=50, null=True, blank=True)
geoplatform_id = models.CharField(max_length=50, null=True)
status = models.CharField(max_length=20)
datagov_ID = models.CharField(max_length=50, unique=True)
class CAPInstance(models.Model):
cap_id = models.AutoField(primary_key=True)
date = models.DateTimeField(auto_now=True)
masterlist_count = models.IntegerField()
climate_collection_count = models.IntegerField()
broken_urls = models.IntegerField()
lost_climate_tag = models.IntegerField()
not_in_masterlist = models.IntegerField()
total_warnings = models.IntegerField()
class BrokenAPI(models.Model):
broken_id = models.AutoField(primary_key=True)
cap_id = models.ForeignKey('CAPInstance', db_column='cap_id', on_delete=models.CASCADE)
datagov_ID = models.ForeignKey('Masterlist', to_field='datagov_ID', db_column='datagov_ID', on_delete=models.CASCADE)
class Retag(models.Model):
retag_id = models.AutoField(primary_key=True)
cap_id = models.ForeignKey('CAPInstance', db_column='cap_id', on_delete=models.CASCADE)
datagov_ID = models.ForeignKey('Masterlist', to_field='datagov_ID', db_column='datagov_ID', on_delete=models.CASCADE)
class QAUpdates(models.Model):
qa_id = models.AutoField(primary_key=True)
cap_id = models.ForeignKey('CAPInstance', db_column='cap_id', on_delete=models.CASCADE)
datagov_ID = models.ForeignKey('Masterlist', to_field='datagov_ID', db_column='datagov_ID', on_delete=models.CASCADE)
name = models.CharField(max_length=1000, null=True, blank=True)
    title = models.CharField(max_length=1000, null=True, blank=True)
organization = models.CharField(max_length=100, null=True, blank=True)
catalog_url = models.CharField(max_length=1000, null=True, blank=True)
metadata_type = models.CharField(max_length=100, null=True, blank=True)
class NotInMasterlist(models.Model):
nml_id = models.AutoField(primary_key=True)
cap_id = models.ForeignKey('CAPInstance', db_column='cap_id', on_delete=models.CASCADE)
title = models.CharField(max_length=500)
name = models.CharField(max_length=500)
api_url = models.URLField(max_length=500)
catalog_url = models.URLField(max_length=500)
'''
# Create your models here.
class CDI_dataset(models.Model):
date_id = models.CharField(max_length=20, null=True)
cdi_id = models.IntegerField()
name = models.CharField(max_length=500)
title = models.CharField(max_length=500)
organization = models.CharField(max_length=50, null=True)
catalog_url = models.URLField(null=True)
api_url = models.URLField(null=True)
cdi_themes = models.CharField(max_length=200, null=True)
metadata_type = models.CharField(max_length=50, null=True, blank=True)
geoplatform_id = models.CharField(max_length=50, null=True)
is_active = models.BooleanField(default=True)
datagov_ID = models.UUIDField(null=True)
class Meta:
abstract = True
class retag(CDI_dataset):
pass
class broken_api(CDI_dataset):
pass
class updates_masterlist(CDI_dataset):
pass
class original_masterlist(CDI_dataset):
pass
class qa_updates(models.Model):
date_id = models.CharField(max_length=20, null=True)
cdi_id = models.IntegerField()
name = models.CharField(max_length=500)
    title = models.CharField(max_length=500)
organization = models.CharField(max_length=50, null=True)
catalog_url = models.URLField(null=True)
metadata_type = models.CharField(max_length=50, null=True, blank=True)
datagov_id = models.UUIDField(null=True)
class not_in_masterlist(models.Model):
date_id = models.CharField(max_length=20, null=True)
title = models.CharField(max_length=500)
name = models.CharField(max_length=500)
api_url = models.URLField(null=True)
catalog_url = models.URLField(null=True)
class cdi_metrics(models.Model):
date_id = models.CharField(max_length=20, null=True)
masterlist_count = models.IntegerField()
climate_collection_count = models.IntegerField()
class warnings_summary(models.Model):
date_id = models.CharField(max_length=20, null=True)
total_warnings = models.IntegerField()
broken_urls = models.IntegerField()
lost_climate_tag = models.IntegerField()
not_in_masterlist = models.IntegerField()
'''
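`CAPInstance` stores three per-category warning counts alongside `total_warnings`, but the models themselves do not enforce any relationship between those columns. The sketch below assumes (and this is only an assumption about the intended invariant) that the total is the sum of the categories:

```python
def summarize_warnings(broken_urls: int, lost_climate_tag: int, not_in_masterlist: int) -> int:
    # Assumed invariant: CAPInstance.total_warnings equals the sum of
    # the three per-category warning counts it tracks.
    return broken_urls + lost_climate_tag + not_in_masterlist

print(summarize_warnings(3, 2, 5))  # 10
```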
# File: tools/__init__.py (repo: sun1638650145/TextExtraction, license: Apache-2.0)
from .preprocessing import create_tf_dataset
from .preprocessing import preprocessing_dataframe
from .utils import decode_prediction
from .utils import select_device
# File: Notlar/musa/variables2.py (repo: ibrahimediz/ornekproje, license: Apache-2.0)
var1 = "_________________Yemek_Sepeti______________"
print(var1.strip("_"))  # Yemek_Sepeti
var1 = "___ ______sssss__aaaa__________Yemek_Sepeti______ssss______ _____aaaa_______"
print(var1.strip('_sa '))  # Yemek_Sepeti ('a' is in the strip set, so the leading 'aaaa' run is removed too)
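The behavior above comes down to how `str.strip` treats its argument as a set of characters, removing them only from the ends of the string. A self-contained check (the string mirrors the one in this file):

```python
s = "___ ______sssss__aaaa__________Yemek_Sepeti______ssss______ _____aaaa_______"
# strip("_sa ") removes '_', 's', 'a', and ' ' from BOTH ends, for as long
# as each end character is in the set; the interior '_' in 'Yemek_Sepeti'
# is never touched, while the leading 'aaaa' run is consumed.
print(s.strip("_sa "))  # Yemek_Sepeti
```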
# File: src/simpleelasticlogging/__init__.py (repo: JustinGuese/pip-simple-elasticsearch-logging, license: MIT)
from .simpleelasticlogging import ElasticLogger
# File: projects/alarm/config.py (repo: kingking888/crawler-pyspider, license: Apache-2.0)
# DingTalk alert robot for the "Hupun (Wanliniu) sync operation exception" notification group
HUPUN_SYNC_ABNORMAL_ROBOT = '1e3d755b23afad5874b4c263b523dac9edb63d0e6de206fa04eb00c1d5240a0c'
# Test robot
SPIDER_TEST_ROBOT = '87053027d1d9c2ca35b1eda6d248b3d087b764c135f1cb9b1b3387c47322d411'
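The values above are access tokens for DingTalk custom-robot webhooks. For context, here is a sketch of how such a token is typically combined with DingTalk's public `robot/send` endpoint and a `text`-type payload; the helper name is illustrative, and actually sending the request is left to the caller:

```python
import json

DINGTALK_SEND_URL = "https://oapi.dingtalk.com/robot/send"

def build_text_alert(access_token: str, content: str):
    """Return (url, json_body) for a DingTalk 'text' message.

    The caller POSTs json_body to url with Content-Type: application/json.
    """
    url = "{}?access_token={}".format(DINGTALK_SEND_URL, access_token)
    body = json.dumps({"msgtype": "text", "text": {"content": content}})
    return url, body
```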
# File: tests/spoke-tests.py (repo: Threadless/python-spoke, license: MIT)
# vim: fileencoding=utf8
import spoke
import unittest
from datetime import datetime
import os
import random
CUSTOMER_NAME = 'abc123'
CUSTOMER_KEY = 'abc123'
FAUX_ADDRESS = '123 Fake St'
FAUX_CITY = 'Funkytown'
FAUX_FIRST_NAME = 'Xavier'
FAUX_LAST_NAME = 'Ample'
FAUXN_NUMBER = '555 555 5555'
FAUX_ZIP = '12345'
FAUX_STATE = 'IL'
UNICODE_FAUX_ADDRESS = u'123 поддельная улица'
UNICODE_FAUX_CITY = u'Санкт-Петербург'
UNICODE_FAUX_FIRST_NAME = u'Björn'
UNICODE_FAUX_LAST_NAME = u'Björnsson'
class FauxTransport(object):
def send(self, request):
return '''<?xml version="1.0" encoding="utf-8" ?>
<ResponseSuccess>
<result>Success</result>
<time>11/10/2011 03:50:28 -05:00</time>
<immc_id>12345</immc_id>
</ResponseSuccess>'''
class SpokeTests(unittest.TestCase):
def test_constructor_required_fields(self):
params = dict(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
)
for k in params.keys():
copy = params.copy()
del copy[k]
self.assertRaises(spoke.ValidationError, spoke.Spoke, **copy)
def test_constructor_optional_fields(self):
spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
Logo = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/test.jpg',
),
)
def test_constructor_extra_fields(self):
self.assertRaises(spoke.ValidationError, spoke.Spoke,
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
Extra = 17,
)
def test_new_required_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
params = dict(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = 2,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
)
for k in params.keys():
copy = params.copy()
del copy[k]
self.assertRaises(spoke.ValidationError, sp.new, **copy)
def test_conditionally_required_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
params = dict(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = 2,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingAccount = '5110896',
ShippingMethodId = 66,
)
for k in params.keys():
copy = params.copy()
del copy[k]
self.assertRaises(spoke.ValidationError, sp.new, **copy)
del params['ShippingAccount']
del params['ShippingMethodId']
self.assertRaises(spoke.ValidationError, sp.new, **params)
def test_new_optional_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
sp.new(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = 2,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
def test_new_extra_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
self.assertRaises(spoke.ValidationError, sp.new,
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = 2,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)],
Extra = 'foo',
)
def test_update_required_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
params = dict(
OrderId = 1,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
)
for k in params.keys():
copy = params.copy()
del copy[k]
self.assertRaises(spoke.ValidationError, sp.update, **copy)
def test_update_extra_fields(self):
sp = spoke.Spoke(
Customer = CUSTOMER_NAME,
Key = CUSTOMER_KEY,
production = False,
transport = FauxTransport(),
)
self.assertRaises(spoke.ValidationError, sp.update,
OrderId = 1,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
Extra = 'foo',
)
@unittest.skipUnless('SPOKE_CUSTOMER' in os.environ and 'SPOKE_KEY' in os.environ, 'Please set SPOKE_CUSTOMER and SPOKE_KEY for live testing')
def test_roundtrip(self):
customer = os.getenv('SPOKE_CUSTOMER')
key = os.getenv('SPOKE_KEY')
sp = spoke.Spoke(
Customer = customer,
Key = key,
production = False,
)
order_id = random.randint(1000000, 2000000)
result = sp.new(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
self.assertTrue('immc_id' in result)
result = sp.update(
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
)
self.assertTrue('immc_id' in result)
result = sp.cancel(order_id)
self.assertTrue('immc_id' in result)
@unittest.skipUnless('SPOKE_CUSTOMER' in os.environ and 'SPOKE_KEY' in os.environ, 'Please set SPOKE_CUSTOMER and SPOKE_KEY for live testing')
def test_exceptions(self):
customer = os.getenv('SPOKE_CUSTOMER')
key = os.getenv('SPOKE_KEY')
sp = spoke.Spoke(
Customer = customer,
Key = key,
production = False,
)
order_id = random.randint(1000000, 2000000)
self.assertRaises(spoke.SpokeError, sp.update,
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
)
result = sp.new(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
self.assertRaises(spoke.SpokeError, sp.new,
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
sp.cancel(order_id)
@unittest.skipUnless('SPOKE_CUSTOMER' in os.environ and 'SPOKE_KEY' in os.environ, 'Please set SPOKE_CUSTOMER and SPOKE_KEY for live testing')
def test_unicode_roundtrip(self):
customer = os.getenv('SPOKE_CUSTOMER')
key = os.getenv('SPOKE_KEY')
sp = spoke.Spoke(
Customer = customer,
Key = key,
production = False,
)
order_id = random.randint(1000000, 2000000)
result = sp.new(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = order_id,
OrderInfo = dict(
Address1 = UNICODE_FAUX_ADDRESS,
City = UNICODE_FAUX_CITY,
CountryCode = 'RU',
FirstName = UNICODE_FAUX_FIRST_NAME,
LastName = UNICODE_FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = '-',
),
ShippingMethod = 'FirstClass',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
self.assertTrue('immc_id' in result)
result = sp.update(
OrderId = order_id,
OrderInfo = dict(
Address1 = UNICODE_FAUX_ADDRESS,
City = UNICODE_FAUX_CITY,
CountryCode = 'RU',
FirstName = UNICODE_FAUX_FIRST_NAME,
LastName = UNICODE_FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = '-',
),
)
self.assertTrue('immc_id' in result)
result = sp.cancel(order_id)
self.assertTrue('immc_id' in result)
@unittest.skipUnless('AS_SPOKE_CUSTOMER' in os.environ and 'AS_SPOKE_KEY' in os.environ, 'Please set AS_SPOKE_CUSTOMER and AS_SPOKE_KEY for live testing')
def test_roundtrip_artist_shop(self):
customer = os.getenv('AS_SPOKE_CUSTOMER')
key = os.getenv('AS_SPOKE_KEY')
sp = spoke.Spoke(
Customer = customer,
Key = key,
production = False,
)
order_id = random.randint(1000000, 2000000)
result = sp.new(
Cases = [dict(
CaseId = 1234,
CaseType = 'iph4tough',
PrintImage = dict(
ImageType = 'jpg',
Url = 'http://threadless.com/nothing.jpg',
),
Quantity = 1,
)],
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
ShippingMethodId = 66,
ShippingAccount = '5110896',
PackSlip = spoke.Image(
ImageType = 'jpg',
Url = 'file:///tmp/nothing.jpg',
),
Comments = [dict(
Type = 'Printer',
CommentText = 'testing',
)]
)
self.assertTrue('immc_id' in result)
result = sp.update(
OrderId = order_id,
OrderInfo = dict(
Address1 = FAUX_ADDRESS,
City = FAUX_CITY,
CountryCode = 'US',
FirstName = FAUX_FIRST_NAME,
LastName = FAUX_LAST_NAME,
OrderDate = datetime.now(),
PhoneNumber = FAUXN_NUMBER,
PostalCode = FAUX_ZIP,
State = FAUX_STATE,
),
)
self.assertTrue('immc_id' in result)
result = sp.cancel(order_id)
        self.assertTrue('immc_id' in result)
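For reference, the canned `ResponseSuccess` document that `FauxTransport` returns near the top of this file can be decoded with the stdlib XML parser alone. The tests only assert that `immc_id` is present in the result; this sketch extracts it explicitly and is not how the `spoke` library itself parses responses:

```python
import xml.etree.ElementTree as ET

RESPONSE = '''<?xml version="1.0" encoding="utf-8" ?>
<ResponseSuccess>
    <result>Success</result>
    <time>11/10/2011 03:50:28 -05:00</time>
    <immc_id>12345</immc_id>
</ResponseSuccess>'''

def parse_spoke_response(xml_text):
    # Flatten the one-level response document into a tag -> text mapping.
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

print(parse_spoke_response(RESPONSE)["immc_id"])  # 12345
```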
# File: tests/charts-out/test_graphics_charts_spider_sample1.py (repo: debragail/reportlab-mirror, license: BSD-3-Clause)
#Autogenerated by ReportLab guiedit do not edit
from reportlab.graphics.shapes import _DrawingEditorMixin, Drawing, Group, Polygon, PolyLine, Line, String
from reportlab.lib.colors import Color, CMYKColor, PCMYKColor
class ExplodedDrawing_Drawing(_DrawingEditorMixin,Drawing):
def __init__(self,width=400,height=400,*args,**kw):
Drawing.__init__(self,width,height,*args,**kw)
self.transform = (1,0,0,1,0,0)
self.add(Polygon(points=[108.3032,252.9412,200,288.2353,291.6968,252.9412,306.9796,138.2353,200,58.82353,93.02039,138.2353,108.3032,252.9412],fillColor=Color(1,.972549,.862745,1),fillOpacity=None,strokeColor=None,strokeWidth=0,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(Polygon(points=[85.37899,266.1765,200,252.9412,261.1312,235.2941,276.414,155.8824,200,94.11765,131.2274,160.2941,85.37899,266.1765],fillColor=Color(0,1,1,1),fillOpacity=None,strokeColor=None,strokeWidth=0,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(Polygon(points=[138.8688,235.2941,200,261.7647,261.1312,235.2941,329.9038,125,200,164.7059,108.3032,147.0588,138.8688,235.2941],fillColor=Color(.596078,.984314,.596078,1),fillOpacity=None,strokeColor=None,strokeWidth=0,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(PolyLine(points=[108.3032,252.9412,200,288.2353,291.6968,252.9412,306.9796,138.2353,200,58.82353,93.02039,138.2353,108.3032,252.9412],strokeColor=Color(1,.972549,.862745,1),strokeWidth=1,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(PolyLine(points=[85.37899,266.1765,200,252.9412,261.1312,235.2941,276.414,155.8824,200,94.11765,131.2274,160.2941,85.37899,266.1765],strokeColor=Color(0,1,1,1),strokeWidth=1,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(PolyLine(points=[138.8688,235.2941,200,261.7647,261.1312,235.2941,329.9038,125,200,164.7059,108.3032,147.0588,138.8688,235.2941],strokeColor=Color(.596078,.984314,.596078,1),strokeWidth=1,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=None,strokeOpacity=None))
self.add(Line(200,200,200,350,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
self.add(Line(200,200,329.9038,275,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
self.add(Line(200,200,329.9038,125,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
self.add(Line(200,200,200,50,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
self.add(Line(200,200,70.09619,125,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
self.add(Line(200,200,70.09619,275,strokeColor=Color(0,0,0,1),strokeWidth=.5,strokeLineCap=0,strokeLineJoin=0,strokeMiterLimit=0,strokeDashArray=(2,2),strokeOpacity=None))
v0=self._nn(Group())
v0.transform = (1,0,0,1,200,357.5)
v0.add(String(-2.22,-4,'a',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
v0=self._nn(Group())
v0.transform = (1,0,0,1,336.399,278.75)
v0.add(String(-2.5,-4,'b',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
v0=self._nn(Group())
v0.transform = (1,0,0,1,336.399,121.25)
v0.add(String(-2.22,-4,'c',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
v0=self._nn(Group())
v0.transform = (1,0,0,1,200,42.5)
v0.add(String(-2.5,-4,'d',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
v0=self._nn(Group())
v0.transform = (1,0,0,1,63.601,121.25)
v0.add(String(-2.22,-4,'e',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
v0=self._nn(Group())
v0.transform = (1,0,0,1,63.601,278.75)
v0.add(String(-1.665,-4,'f',textAnchor='start',fontName='Times-Roman',fontSize=10,fillColor=Color(0,0,0,1)))
if __name__=="__main__": #NORUNTESTS
ExplodedDrawing_Drawing().save(formats=['pdf'],outDir='.',fnRoot=None)
# File: src/compas_singular/algorithms/coloring/two_coloring.py
# Repo: christiandimitri/compas_singular (MIT)

from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
import time
import itertools
from compas_singular.datastructures.mesh_quad.mesh_quad import QuadMesh
from compas_singular.datastructures.mesh_quad.grammar_pattern import delete_strips
from compas_singular.datastructures.mesh_quad.grammar_pattern import collateral_strip_deletions
from compas_singular.datastructures.mesh_quad.grammar_pattern import total_boundary_deletions
from compas.topology import adjacency_from_edges
from compas_singular.topology.coloring import is_adjacency_two_colorable
from compas_singular.utilities.lists import are_items_in_list
__all__ = [
'TwoColourableProjection',
]
class TwoColourableProjection:
def __init__(self, quad_mesh):
self.quad_mesh = quad_mesh
self.results = None
self.times = None
def projection_4(self, kmax = 1):
"""Projection of a coarse quad mesh to the closest two-colourable sub-spaces.
Parameters
----------
kmax : int, optional
The maximum number of strips deleted per combination. Defaults to 1.
Returns
-------
results : dict
A mapping from each tested combination to its result. If the combination is valid, the result is a tuple of the two-colourable mesh, the two-colourable network, and the network vertex colors.
References
----------
.. [1] Oval et al., *Topology Finding of Two-Colourable Quad-Mesh Patterns in Structural Design*. Submitted.
"""
mesh = self.quad_mesh
vertices, edges = mesh.strip_graph()
if is_adjacency_two_colorable(adjacency_from_edges(edges)) is not None:
self.results = True
return True
results = {}
t0 = time.time()
k = 0
current_pool = [[skey] for skey in mesh.strips()]
while k < kmax:
k += 1
next_pool = []
#print(current_pool)
for combination in current_pool:
if len(combination) > 1:
combination = list(set([i for item in combination for i in item]))
else:
combination = list(set(combination))
if len(collateral_strip_deletions(mesh, combination)) > 0:
continue
if len(total_boundary_deletions(mesh, combination)) > 0:
continue
# delete strips in mesh and check validity
copy_mesh = mesh.copy()
delete_strips(copy_mesh, combination, preserve_boundaries=True)
topological_validity = copy_mesh.is_manifold() and copy_mesh.euler() == mesh.euler()
if not topological_validity:
pass
# delete strip vertices in network and check colourability
else:
new_vertices = {vkey: xyz for vkey, xyz in vertices.items() if vkey not in combination}
new_edges = [(u, v) for u, v in edges if u not in combination and v not in combination]
two_colourability = is_adjacency_two_colorable(adjacency_from_edges(new_edges))
if not two_colourability:
next_pool.append(combination)
else:
results[tuple(combination)] = (copy_mesh, (new_vertices, new_edges), two_colourability)
current_pool = itertools.combinations(next_pool, 2)
t1 = time.time()
print(t1 - t0)
self.results = results
def projection_1(self, kmax = 1):
"""Projection of a coarse quad mesh to the closest two-colourable sub-spaces.
Parameters
----------
kmax : int, optional
The maximum number of strips deleted per combination. Defaults to 1.
Returns
-------
results : dict
A mapping from each tested combination to its result. If the combination is valid, the result is a tuple of the two-colourable mesh, the two-colourable network, and the network vertex colors.
References
----------
.. [1] Oval et al., *Topology Finding of Two-Colourable Quad-Mesh Patterns in Structural Design*. Submitted.
"""
mesh = self.quad_mesh
n = mesh.number_of_strips()
vertices, edges = mesh.strip_graph()
if is_adjacency_two_colorable(adjacency_from_edges(edges)) is not None:
self.results = True
return True
relation = {}
for k in range(n):
for combination in itertools.combinations(mesh.strips(), k):
other_strips = list(mesh.strips())
for skey in combination:
other_strips.remove(skey)
downstream_combinations = itertools.combinations(mesh.strips(), k)
relation[combination] = []
results = {}
t0 = time.time()
k = 0
current_pool = [[skey] for skey in mesh.strips()]
while k < kmax:
k += 1
next_pool = []
#print(current_pool)
for combination in current_pool:
if len(combination) > 1:
combination = list(set([i for item in combination for i in item]))
else:
combination = list(set(combination))
if len(collateral_strip_deletions(mesh, combination)) > 0:
continue
if len(total_boundary_deletions(mesh, combination)) > 0:
continue
# delete strips in mesh and check validity
copy_mesh = mesh.copy()
delete_strips(copy_mesh, combination, preserve_boundaries=True)
topological_validity = copy_mesh.is_manifold() and copy_mesh.euler() == mesh.euler()
if not topological_validity:
pass
# delete strip vertices in network and check colourability
else:
new_vertices = {vkey: xyz for vkey, xyz in vertices.items() if vkey not in combination}
new_edges = [(u, v) for u, v in edges if u not in combination and v not in combination]
two_colourability = is_adjacency_two_colorable(adjacency_from_edges(new_edges))
if not two_colourability:
next_pool.append(combination)
else:
results[tuple(combination)] = (copy_mesh, (new_vertices, new_edges), two_colourability)
current_pool = itertools.combinations(next_pool, 2)
t1 = time.time()
print(t1 - t0)
self.results = results
def projection_2(self, kmax = 1):
"""Projection of a coarse quad mesh to the closest two-colourable sub-spaces.
Parameters
----------
kmax : int, optional
The maximum number of strips deleted per combination. Defaults to 1.
Returns
-------
results : dict
A mapping from each tested combination to its result. If the combination is valid, the result is a tuple of the two-colourable mesh, the two-colourable network, and the network vertex colors.
References
----------
.. [1] Oval et al., *Topology Finding of Two-Colourable Quad-Mesh Patterns in Structural Design*. Submitted.
"""
mesh = self.quad_mesh
# result for input mesh
vertices, edges = mesh.strip_graph()
if is_adjacency_two_colorable(adjacency_from_edges(edges)) is not None:
self.results = True
return True
results = {}
# guarantee valid kmax
n = mesh.number_of_strips()
if kmax < 1 or kmax > n:
kmax = n
t0 = time.time()
# start iteration
k = 0
discarding_combination = []
while k < kmax:
k += 1
to_continue = False
at_least_one_valid_k = False
# test all combinations of (n k) strips
for combination in itertools.combinations(mesh.strips(), k):
set_combi = set(combination)
# check results from potential previous sub-combinations
for disc_comb in discarding_combination:
if disc_comb.issubset(set_combi):
break
if len(collateral_strip_deletions(mesh, combination)) > 0:
to_continue = True
continue
if len(total_boundary_deletions(mesh, combination)) > 0:
discarding_combination.append(set(combination))
continue
# delete strips in mesh and check validity
copy_mesh = mesh.copy()
delete_strips(copy_mesh, combination, preserve_boundaries=True)
topological_validity = copy_mesh.is_manifold() and copy_mesh.euler() == mesh.euler()
if not topological_validity:
discarding_combination.append(set(combination))
# delete strip vertices in network and check colourability
else:
new_vertices = {vkey: xyz for vkey, xyz in vertices.items() if vkey not in combination}
new_edges = [(u, v) for u, v in edges if u not in combination and v not in combination]
two_colourability = is_adjacency_two_colorable(adjacency_from_edges(new_edges))
if not two_colourability:
to_continue = True
else:
results[combination] = (copy_mesh, (new_vertices, new_edges), two_colourability)
discarding_combination.append(set(combination))
if not to_continue:
break
t1 = time.time()
print(t1 - t0)
#print(results)
self.results = results
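projection_2 above (and projection_0 below) prunes the combination search with `discarding_combination`: once a combination is known to be invalid, every superset of it can be skipped without re-testing. A self-contained sketch of that pruning idea (the predicate and item values here are made up for illustration):

```python
from itertools import combinations

def prune_search(items, is_bad, kmax):
    """Enumerate combinations of size 1..kmax, skipping any combination
    that contains an already-discarded (bad) sub-combination."""
    discarded = []   # sets of items known to be bad
    surviving = []
    for k in range(1, kmax + 1):
        for combo in combinations(items, k):
            s = set(combo)
            if any(d.issubset(s) for d in discarded):
                continue  # a bad sub-combination already rules this out
            if is_bad(s):
                discarded.append(s)
            else:
                surviving.append(combo)
    return surviving

# Toy predicate: any combination containing item 0 is "bad".
print(prune_search([0, 1, 2], lambda s: 0 in s, 2))
# -> [(1,), (2,), (1, 2)]
```

In the real method, `is_bad` corresponds to a strip deletion that breaks the mesh topology, and the pruning avoids re-deleting strips for every superset of a known-bad combination.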
def projection(self, kmax = 1):
"""Projection of a coarse quad mesh to the closest two-colourable sub-spaces.
Parameters
----------
kmax : int, optional
The maximum number of strips deleted per combination. Defaults to 1.
Returns
-------
results : dict
A mapping from each tested combination to its result. If the combination is valid, the result is a tuple of the two-colourable mesh, the two-colourable network, and the network vertex colors.
References
----------
.. [1] Oval et al., *Topology Finding of Two-Colourable Quad-Mesh Patterns in Structural Design*. Submitted.
"""
mesh = self.quad_mesh
# # result for input mesh
# vertices, edges = mesh.strip_graph()
# if is_adjacency_two_colorable(adjacency_from_edges(edges)) is not None:
# self.results = True
# return True
results = {}
# guarantee valid kmax
n = mesh.number_of_strips()
if kmax < 1 or kmax > n:
kmax = n
# start iteration
k = 0
while k < kmax:
k += 1
to_continue = False
# test all combinations of (n k) strips
for combination in itertools.combinations(mesh.strips(), k):
# check results from potential previous sub-combinations
for previous_combination in results:
if are_items_in_list(previous_combination, combination):
# if a sub-combination yielded an invalid topology do not pursue
if results[previous_combination] == 'invalid shape topology':
results[combination] = 'already invalid shape topology'
break
elif type(results[previous_combination]) == tuple:
results[combination] = 'already two-colourable'
break
if combination in results:
continue
if len(collateral_strip_deletions(mesh, combination)) > 0:
to_continue = True
continue
if len(total_boundary_deletions(mesh, combination)) > 0:
results[combination] = 'invalid shape topology'
continue
# delete strips in mesh and check validity
copy_mesh = mesh.copy()
copy_mesh.collect_strips()
delete_strips(copy_mesh, combination, preserve_boundaries=True)
topological_validity = copy_mesh.is_manifold() and copy_mesh.euler() == mesh.euler()
if not topological_validity:
results[combination] = 'invalid shape topology'
# delete strip vertices in network and check colourability
else:
vertices, edges = copy_mesh.strip_graph()
two_colourability = is_adjacency_two_colorable(adjacency_from_edges(edges))
if not two_colourability:
results[combination] = 'not two-colourable'
to_continue = True
else:
results[combination] = (copy_mesh, (vertices, edges), two_colourability)
if not to_continue:
break
self.results = results
return self.results
def projection_0(self, kmax = 1):
"""Projection of a coarse quad mesh to the closest two-colourable sub-spaces.
Parameters
----------
kmax : int, optional
The maximum number of strips deleted per combination. Defaults to 1.
Returns
-------
results : dict
A mapping from each tested combination to its result. If the combination is valid, the result is a tuple of the two-colourable mesh, the two-colourable network, and the network vertex colors.
References
----------
.. [1] Oval et al., *Topology Finding of Two-Colourable Quad-Mesh Patterns in Structural Design*. Submitted.
"""
mesh = self.quad_mesh
# result for input mesh
vertices, edges = mesh.strip_graph()
if is_adjacency_two_colorable(adjacency_from_edges(edges)) is not None:
self.results = True
return True
results = {}
# guarantee valid kmax
n = mesh.number_of_strips()
if kmax < 1 or kmax > n:
kmax = n
t0 = time.time()
t1 = - float('inf')
t2 = - float('inf')
total_valid = 0
# start iteration
k = 0
discarding_combination = []
discarding_combination_type = {}
while k < kmax:
k += 1
to_continue = False
at_least_one_valid_k = False
# test all combinations of (n k) strips
for combination in itertools.combinations(mesh.strips(), k):
set_combi = set(combination)
# check results from potential previous sub-combinations
for disc_comb in discarding_combination:
if disc_comb.issubset(set_combi):
#if are_items_in_list(previous_combination, combination):
# if a sub-combination yielded an invalid topology do not pursue
if discarding_combination_type[tuple(disc_comb)] == 'invalid shape topology':
results[combination] = 'already invalid shape topology'
break
elif discarding_combination_type[tuple(disc_comb)] == 'two-colourable':
results[combination] = 'already two-colourable'
break
if len(collateral_strip_deletions(mesh, combination)) > 0:
results[combination] = 'collateral deletions'
if len(total_boundary_deletions(mesh, combination)) > 0:
results[combination] = 'invalid shape topology'
discarding_combination.append(set(combination))
discarding_combination_type[tuple(combination)] = 'invalid shape topology'
if combination in results:
continue
# delete strips in mesh and check validity
copy_mesh = mesh.copy()
#copy_mesh.collect_strips()
delete_strips(copy_mesh, combination, preserve_boundaries=True)
topological_validity = copy_mesh.is_manifold() and copy_mesh.euler() == mesh.euler()
if not topological_validity:
results[combination] = 'invalid shape topology'
discarding_combination.append(set(combination))
discarding_combination_type[tuple(combination)] = 'invalid shape topology'
# delete strip vertices in network and check colourability
else:
#vertices, edges = copy_mesh.strip_graph()
new_vertices = {vkey: xyz for vkey, xyz in vertices.items() if vkey not in combination}
new_edges = [(u, v) for u, v in edges if u not in combination and v not in combination]
two_colourability = is_adjacency_two_colorable(adjacency_from_edges(new_edges))
if not two_colourability:
results[combination] = 'not two-colourable'
to_continue = True
else:
results[combination] = (copy_mesh, (new_vertices, new_edges), two_colourability)
discarding_combination.append(set(combination))
discarding_combination_type[tuple(combination)] = 'two-colourable'
at_least_one_valid_k = True
total_valid += 1
if t1 < 0:
t1 = time.time()
if t2 < 0 and total_valid > 0 and not at_least_one_valid_k:
t2 = time.time()
if not to_continue:
break
t3 = time.time()
print(len(discarding_combination))
print(t3 - t0)
self.results = results
self.times = (t1 - t0, t2 - t0, t3 - t0)
# --------------------------------------------------------------------------
# results
# --------------------------------------------------------------------------
def get_results(self):
return self.results
def strip_deletions_yielding_two_colourability(self):
out = []
for combination, result in self.get_results().items():
if type(result) == tuple:
out.append(combination)
return out
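All of the projection variants above ultimately delegate to compas_singular's `is_adjacency_two_colorable`. A minimal stand-in showing what a two-colourability (bipartiteness) test of the strip graph amounts to — this BFS version is a sketch of the idea, not the library's implementation:

```python
from collections import deque

def two_color(adjacency):
    """Try to 2-color a graph given as {node: [neighbors]}.
    Returns {node: 0 or 1} if the graph is bipartite, else None."""
    colors = {}
    for start in adjacency:
        if start in colors:
            continue
        colors[start] = 0
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nbr in adjacency[node]:
                if nbr not in colors:
                    colors[nbr] = 1 - colors[node]
                    queue.append(nbr)
                elif colors[nbr] == colors[node]:
                    return None  # odd cycle: not two-colorable
    return colors

# A square (4-cycle) is bipartite; a triangle is not.
square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(two_color(square) is not None)   # True
print(two_color(triangle))             # None
```

Deleting strips removes vertices (and their edges) from the strip graph, which is why a mesh that is not two-colourable can become so after a deletion: removing a vertex can break every odd cycle through it.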
# ==============================================================================
# Main
# ==============================================================================
if __name__ == '__main__':
import compas
from compas_plotters.meshplotter import MeshPlotter
vertices = [[1.9, 11.2, 0.0], [9.7, 9.0, 0.0], [4.3, 4.7, 0.0], [3.8, 13.2, 0.0], [1.9, 13.2, 0.0], [4.7, 2.2, 0.0], [5.7, 9.4, 0.0], [9.1, 6.4, 0.0], [14.2, 5.2, 0.0], [14.2, 2.2, 0.0], [14.2, 13.2, 0.0], [1.9, 2.2, 0.0], [4.1, 10.9, 0.0], [11.5, 5.0, 0.0], [11.4, 2.2, 0.0], [5.7, 6.7, 0.0], [14.2, 10.2, 0.0], [1.9, 4.2, 0.0], [11.4, 13.2, 0.0], [11.7, 10.6, 0.0]]
faces = [[7, 15, 2, 13], [15, 6, 12, 2], [6, 1, 19, 12], [1, 7, 13, 19], [8, 16, 19, 13], [16, 10, 18, 19], [18, 3, 12, 19], [3, 4, 0, 12], [0, 17, 2, 12], [17, 11, 5, 2], [5, 14, 13, 2], [14, 9, 8, 13]]
vertices_1 = [[-332.0, -22.0, 0.0], [-332.0, -19.0, 0.0], [-332.0, -5.0, 0.0], [-332.0, -2.0, 0.0], [-329.0, -22.0, 0.0], [-329.0, -19.0, 0.0], [-329.0, -5.0, 0.0], [-329.0, -2.0, 0.0], [-324.0, -15.0, 0.0], [-324.0, -9.0, 0.0], [-318.0, -15.0, 0.0], [-318.0, -9.0, 0.0], [-312.0, -22.0, 0.0], [-312.0, -19.0, 0.0], [-312.0, -5.0, 0.0], [-312.0, -2.0, 0.0], [-305.0, -15.0, 0.0], [-305.0, -9.0, 0.0], [-299.0, -15.0, 0.0], [-299.0, -9.0, 0.0], [-295.0, -22.0, 0.0], [-295.0, -19.0, 0.0], [-295.0, -5.0, 0.0], [-295.0, -2.0, 0.0], [-292.0, -22.0, 0.0], [-292.0, -19.0, 0.0], [-292.0, -5.0, 0.0], [-292.0, -2.0, 0.0]]
faces_1 = [[16, 17, 14, 13], [14, 17, 19, 22], [21, 22, 19, 18], [21, 18, 16, 13], [8, 9, 6, 5], [6, 9, 11, 14], [13, 14, 11, 10], [13, 10, 8, 5], [4, 5, 1, 0], [5, 6, 2, 1], [6, 7, 3, 2], [14, 15, 7, 6], [22, 23, 15, 14], [12, 13, 5, 4], [20, 21, 13, 12], [26, 27, 23, 22], [25, 26, 22, 21], [24, 25, 21, 20]]
mesh = QuadMesh.from_vertices_and_faces(vertices, faces)
# plotter = MeshPlotter(mesh, figsize=(5.0, 5.0))
# plotter.draw_vertices(text='key')
# plotter.draw_edges()
# plotter.draw_faces()
# plotter.show()
mesh.collect_strips()
projection = TwoColourableProjection(mesh)
for i in projection.projection(kmax=5):
print(projection.strip_deletions_yielding_two_colourability())
# File: infoblox_netmri/api/broker/v3_8_0/discovery_statuses_broker.py
# Repo: infobloxopen/infoblox_netmri (Apache-2.0)

from ..broker import Broker
class DiscoveryStatusesBroker(Broker):
controller = "discovery_statuses"
def check_withdrawal_license(self, **kwargs):
"""Check Management Server and Managed Devices when device unlicensed with SDC license
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIDs: The list of devices to be checked
:type DeviceIDs: Array
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return cascading_impact: Warning message for user
:rtype cascading_impact: String
"""
return self.api_request(self._get_method_fullname("check_withdrawal_license"), kwargs)
def search(self, **kwargs):
"""Searches the current discovery statuses based on view
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` current
:param view: Defined filters for displaying discovery statuses data. Valid values are "current", "non_reached", "problems", "ok".
:type view: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param NetworkID: The NetworkID maps to one or more collectors that are identified by DataSourceIDs.
:type NetworkID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_DeviceName: The operator to apply to the field DeviceName. DeviceName: The NetMRI name of the device; this will be either the same as DeviceSysName or DeviceDNSName, depending on your NetMRI configuration.
:type op_DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_DeviceName: If op_DeviceName is specified, this value will be compared to the value in DeviceName using the specified operator. The value in this input will be treated as an explicit constant value. This field must be specified if op_DeviceName is specified.
:type val_c_DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IPAddress: The operator to apply to ip address fields like Device.DeviceIPDotted and ifAddr.IfIPDotted. DeviceIPDotted : The management IP address of the device, in dotted (or colon-delimited for IPv6) format. IfIPDotted: The IP address configured on the device in dotted (or colon-delimited for IPv6) format.
:type op_IPAddress: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IPAddress: If op_IPAddress is specified, this value will be compared to the value in IPAddress using the specified operator. The value in this input will be treated as an explicit constant value. This field must be specified if op_IPAddress is specified.
:type val_c_IPAddress: String
| ``api version min:`` 2.10
| ``api version max:`` None
| ``required:`` False
| ``default:`` False
:param management_ip_only_ind: If op_IPAddress is used, all IP address fields are searched by default. Set the value in this input to true to search only management IP addresses.
:type management_ip_only_ind: Boolean
| ``api version min:`` 3.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param select: The list of fields to return for each discovery status. Valid values are DeviceID, DeviceIPDotted, VirtualNetworkID, VirtualNetworkName, VirtualNetworkMemberArtificialInd, DeviceIPNumeric, DeviceName, DeviceType, DeviceUniqueKey, Status, StatusDetail, SPMLicensedInd, SAMLicensedInd, LicenseOverride, FirstSeen, DeviceFirstOccurrenceTime, LastSeen, FingerPrintEnabled, SDNCollectionEnabled, SNMPCollectionEnabled, CLICollectionEnabled, ConfigCollectionEnabled, CLICredentialMessage, CLICredentialStatus, CLICredentialTimestamp, GlobalConfigCollectionEnabled, GlobalConfigCollectionMessage, ConfigCollectionMessage, ConfigCollectionStatus, ConfigCollectionStatusSort, ConfigCollectionTimestamp, DeviceGroupMessage, DeviceGroupStatus, DeviceGroupTimestamp, ExistsMessage, ExistsTimestamp, FingerPrintMessage, FingerPrintStatus, FingerPrintTimestamp, SDNCollectionMessage, SDNCollectionStatus, SDNCollectionTimestamp, LastAction, LastTimestamp, ReachableMessage, ReachableStatus, ReachableTimestamp, SNMPCollectionMessage, SNMPCollectionStatus, SNMPCollectionTimestamp, SNMPCredentialMessage, SNMPCredentialStatus, SNMPCredentialTimestamp, RuleListAnalysisMessage, RuleListAnalysisStatus, RuleListAnalysisTimestamp, UnitID, AccessInd, ForwardingInd, StartPortControlBlackoutSchedule, PortControlBlackoutDuration, StartBlackoutSchedule, BlackoutDuration. If empty or omitted, all fields will be returned.
:type select: Array
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The index of the first row to display
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 25
:param limit: The index of the last row to display. Default value is 25
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param order: The sort order for the output.
:type order: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param xml_filter: A SetFilter XML structure to further refine the search. The SetFilter will be applied AFTER any search query or field values, but before any limit options. The limit will not be enforced when this filter is used, and paging is not supported.
:type xml_filter: String
**Outputs**
"""
return self.api_request(self._get_method_fullname("search"), kwargs)
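Every broker method above follows the same dispatch pattern: compose the endpoint name from the class-level `controller` and delegate to `api_request`. A self-contained sketch of that pattern (the transport callable and class shapes here are simplified assumptions for illustration, not the real infoblox_netmri base `Broker`):

```python
class SketchBroker:
    controller = None

    def __init__(self, transport):
        # transport is a callable(method_fullname, params_dict) -> response
        self._transport = transport

    def _get_method_fullname(self, method):
        return "%s/%s" % (self.controller, method)

    def api_request(self, fullname, params):
        return self._transport(fullname, params)


class SketchDiscoveryStatusesBroker(SketchBroker):
    controller = "discovery_statuses"

    def search(self, **kwargs):
        return self.api_request(self._get_method_fullname("search"), kwargs)


# Fake transport that records what would be sent to the NetMRI API.
calls = []
broker = SketchDiscoveryStatusesBroker(
    lambda name, params: calls.append((name, params)) or {"ok": True})
result = broker.search(op_DeviceName="like", val_c_DeviceName="core-")
print(calls[0][0])   # discovery_statuses/search
print(result)        # {'ok': True}
```

Because every keyword argument is forwarded untouched, the docstrings above (op_*, val_c_*, select, start, limit, order, xml_filter) effectively document the wire-level query parameters rather than Python-level parameters.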
def static(self, **kwargs):
"""Displays the current discovery statuses
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` current
:param view: Defined filters for displaying discovery statuses data. Valid values are "current", "licensed", "non_reached", "problems", "ok"
:type view: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The index of the first row to display
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 25
:param limit: The index of the last row to display. Default value is 25
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param query: Search string
:type query: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only. Specifying a UnitID will pull the most recent data from a specific collector while not specifying one will pull data directly from the OC.
:type UnitID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return AccessInd: The indicator of collection of the filtering access information
:rtype AccessInd: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return BlackoutDuration: The blackout duration in minutes
:rtype BlackoutDuration: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICollectionEnabled: CLI collection enabled
:rtype CLICollectionEnabled: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialMessage: A message detailing the current CLI credential status
:rtype CLICredentialMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialStatus: Indicates the status of SSH discovery attempt of the device
:rtype CLICredentialStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialTimestamp: The date/time of the last update to the CLI credential status
:rtype CLICredentialTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionEnabled: Whether configuration polling is enabled for this device
:rtype ConfigCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionMessage: A message detailing the current configuration file collection status
:rtype ConfigCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionStatus: Indicates whether configuration files have been successfully collected from the device
:rtype ConfigCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionStatusSort: Config collection status sorted
:rtype ConfigCollectionStatusSort: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionTimestamp: The date/time of the last update to the configuration collection status
:rtype ConfigCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceFirstOccurrenceTime: The date/time that this device was first seen on the network
:rtype DeviceFirstOccurrenceTime: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupMessage: A message detailing the current device group assignment status
:rtype DeviceGroupMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupStatus: Indicates whether the device has been successfully assigned to a device group
:rtype DeviceGroupStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupTimestamp: The date/time of the last update to the device group assignment status
:rtype DeviceGroupTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: Internal system identifier for the discovery status
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPDotted: The management IP address of the device, in dotted (or colon-delimited for IPv6) format
:rtype DeviceIPDotted: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPNumeric: The numerical value of the device IP address
:rtype DeviceIPNumeric: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceName: The name of the device; this will be either the same as DeviceSysName or DeviceDNSName, depending on your configuration
:rtype DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceType: The determined device type
:rtype DeviceType: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceUniqueKey: Unique key that allows detecting duplicates across different Virtual Networks
:rtype DeviceUniqueKey: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ExistsMessage: A message detailing how the device was determined to exist
:rtype ExistsMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ExistsTimestamp: The date/time of the last device existence update
:rtype ExistsTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintEnabled: Whether fingerprinting is enabled for this device
:rtype FingerPrintEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintMessage: A message detailing the current fingerprint status
:rtype FingerPrintMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintStatus: Indicates whether device fingerprinting has been successfully executed for the device
:rtype FingerPrintStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintTimestamp: The date/time of the last update to the fingerprint status
:rtype FingerPrintTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FirstSeen: The date and time this record was first seen
:rtype FirstSeen: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ForwardingInd: The indicator of collection of forwarding information
:rtype ForwardingInd: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return GlobalConfigCollectionEnabled: Whether configuration collection is enabled globally
:rtype GlobalConfigCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return GlobalConfigCollectionMessage: Global configuration collection details message
:rtype GlobalConfigCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return InBlackout: In Blackout?
:rtype InBlackout: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return InPortControlBlackout: In Port Control Blackout?
:rtype InPortControlBlackout: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastAction: The last action (from the set in this model) executed against this device
:rtype LastAction: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastSeen: The timestamp of when data was last polled from this device
:rtype LastSeen: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastTimestamp: The date/time of the last action
:rtype LastTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LicenseOverride: Licensing setting. Requested unlicensed: 0, requested licensed: 1, automatic: 2
:rtype LicenseOverride: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return PortControlBlackoutDuration: Port Control Blackout in minutes
:rtype PortControlBlackoutDuration: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ReachableMessage: A message detailing the current reachability status
:rtype ReachableMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ReachableStatus: Indicates whether the device is reachable according to its ACL/rule list
:rtype ReachableStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ReachableTimestamp: The date/time of the last update to the reachable status
:rtype ReachableTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisMessage: A message detailing the current ACL/rule list analysis status
:rtype RuleListAnalysisMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisStatus: Indicates whether ACLs/rule lists have been successfully collected from the device
:rtype RuleListAnalysisStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisTimestamp: The date/time of the last update to the rule list status
:rtype RuleListAnalysisTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SAMLicensedInd: Licensed for Security Device Controller
:rtype SAMLicensedInd: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionEnabled: SDN collection enabled?
:rtype SDNCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionMessage: SDN collection message
:rtype SDNCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionStatus: Indicates whether SDN data has been successfully collected for the device
:rtype SDNCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionTimestamp: The date/time of the last update to the SDN collection status
:rtype SDNCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionEnabled: SNMP collection enabled?
:rtype SNMPCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionMessage: A message detailing the current SNMP collection status
:rtype SNMPCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionStatus: Indicates whether SNMP data has been successfully collected from the device
:rtype SNMPCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionTimestamp: The date/time of the last update to the SNMP collection status
:rtype SNMPCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialMessage: A message detailing the current SNMP credential status
:rtype SNMPCredentialMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialStatus: Indicates the status of SNMP discovery attempt of the device
:rtype SNMPCredentialStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialTimestamp: The date/time of the last update to the SNMP credential status
:rtype SNMPCredentialTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SPMLicensedInd: Licensed for Switch Port Manager
:rtype SPMLicensedInd: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return StartBlackoutSchedule: The blackout start time in cron format
:rtype StartBlackoutSchedule: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return StartPortControlBlackoutSchedule: Port Control Blackout in cron format
:rtype StartPortControlBlackoutSchedule: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: License Status
:rtype Status: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return StatusDetail: Licenses applied
:rtype StatusDetail: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return UnitID: The internal identifier of the collector on which the device is configured
:rtype UnitID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkID: The internal identifier for the network which the device is associated to
:rtype VirtualNetworkID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkMemberArtificialInd: Indicates whether this is an artificial VNM
:rtype VirtualNetworkMemberArtificialInd: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkName: The name of the Network View
:rtype VirtualNetworkName: String
"""
return self.api_request(self._get_method_fullname("static"), kwargs)
def telnet_queue(self, **kwargs):
"""Displays the telnet queue
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The index of the first row to display
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 25
:param limit: The maximum number of rows from the start to display
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: The id of the device
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPDotted: The IP address of the device, in dotted (or colon-delimited for IPv6) format
:rtype DeviceIPDotted: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPNumeric: The numerical value of the device IP address
:rtype DeviceIPNumeric: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceName: The name of the device
:rtype DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceType: The determined device type
:rtype DeviceType: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastAction: The last action executed against this device
:rtype LastAction: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastTimestamp: The date/time of the last action
:rtype LastTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NextAttempt: The date/time of the next attempt
:rtype NextAttempt: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: The status of the request
:rtype Status: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkID: The internal identifier for the network which the device is associated to
:rtype VirtualNetworkID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkName: The name of the Network View
:rtype VirtualNetworkName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return total: The total number of rows in the response
:rtype total: Integer
"""
return self.api_request(self._get_method_fullname("telnet_queue"), kwargs)
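# A minimal sketch of paging through a queue method such as telnet_queue
# using the documented start/limit parameters and the "total" field in the
# response. StubBroker and its canned rows are illustrative assumptions
# standing in for a live broker, not the real client.

```python
class StubBroker:
    """Stand-in that mimics the broker interface: api_request(method, params) -> dict."""
    def __init__(self, rows):
        self._rows = rows

    def _get_method_fullname(self, name):
        return "discovery_status/" + name

    def api_request(self, method, params):
        # Honour the documented defaults: start=0, limit=25.
        start = params.get("start", 0)
        limit = params.get("limit", 25)
        return {"total": len(self._rows),
                "rows": self._rows[start:start + limit]}

    def telnet_queue(self, **kwargs):
        return self.api_request(self._get_method_fullname("telnet_queue"), kwargs)

broker = StubBroker([{"DeviceID": i, "Status": "Pending"} for i in range(60)])
devices = []
start = 0
while True:
    page = broker.telnet_queue(start=start, limit=25)
    devices.extend(page["rows"])
    start += 25
    if start >= page["total"]:
        break
print(len(devices))  # collects every queued row across three pages
```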
def ssh_queue(self, **kwargs):
"""Displays the ssh queue
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The index of the first row to display
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 25
:param limit: The maximum number of rows from the start to display
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: The id of the device
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPDotted: The IP address of the device, in dotted (or colon-delimited for IPv6) format
:rtype DeviceIPDotted: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPNumeric: The numerical value of the device IP address
:rtype DeviceIPNumeric: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceName: The name of the device
:rtype DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceType: The determined device type
:rtype DeviceType: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastAction: The last action executed against this device
:rtype LastAction: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastTimestamp: The date/time of the last action
:rtype LastTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NextAttempt: The date/time of the next attempt
:rtype NextAttempt: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: The status of the request
:rtype Status: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkID: The internal identifier for the network which the device is associated to
:rtype VirtualNetworkID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkName: The name of the Network View
:rtype VirtualNetworkName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return total: The total number of rows in the response
:rtype total: Integer
"""
return self.api_request(self._get_method_fullname("ssh_queue"), kwargs)
def snmp_queue(self, **kwargs):
"""Displays the snmp queue
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The index of the first row to display
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 25
:param limit: The maximum number of rows from the start to display
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: The id of the device
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPDotted: The IP address of the device, in dotted (or colon-delimited for IPv6) format
:rtype DeviceIPDotted: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPNumeric: The numerical value of the device IP address
:rtype DeviceIPNumeric: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceName: The name of the device
:rtype DeviceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceType: The determined device type
:rtype DeviceType: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastAction: The last action executed against this device
:rtype LastAction: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastTimestamp: The date/time of the last action
:rtype LastTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NextAttempt: The date/time of the next attempt
:rtype NextAttempt: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: The status of the request
:rtype Status: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkID: The internal identifier for the network which the device is associated to
:rtype VirtualNetworkID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return VirtualNetworkName: The name of the Network View
:rtype VirtualNetworkName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return total: The total number of rows in the response
:rtype total: Integer
"""
return self.api_request(self._get_method_fullname("snmp_queue"), kwargs)
def summary(self, **kwargs):
"""Summary data for discovery statuses
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NumIdentified: The number of identified IP addresses
:rtype NumIdentified: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NumReached: The number of devices that have been touched by NetMRI
:rtype NumReached: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NumClassified: The number of fully discovered devices
:rtype NumClassified: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NumReachedNotClassified: The number of devices touched by NetMRI but not classified
:rtype NumReachedNotClassified: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NumProcessed: The number of fully processed devices
:rtype NumProcessed: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Licensed: The number of licensed devices
:rtype Licensed: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Networked: The number of networked devices
:rtype Networked: Integer
"""
return self.api_request(self._get_method_fullname("summary"), kwargs)
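# Sketch: turning a summary() response into a readable discovery funnel.
# The counts below are made up for illustration; only the field names come
# from the output list above.

```python
# Hypothetical summary() response; every number here is invented.
summary = {"NumIdentified": 120, "NumReached": 100, "NumClassified": 80,
           "NumReachedNotClassified": 20, "NumProcessed": 75,
           "Licensed": 70, "Networked": 90}

# Print each stage as a count and a percentage of identified IP addresses.
for stage in ("NumIdentified", "NumReached", "NumClassified", "NumProcessed"):
    pct = 100.0 * summary[stage] / summary["NumIdentified"]
    print(f"{stage:<16} {summary[stage]:>4} ({pct:5.1f}%)")
```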
def discover_next(self, **kwargs):
"""Set the given device ids to be discovered next
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIDs: An array of device ids to be discovered next
:type DeviceIDs: Array
**Outputs**
"""
return self.api_request(self._get_method_fullname("discover_next"), kwargs)
def discover_now(self, **kwargs):
"""Set the device given by "DeviceID" to be discovered now, or check the status of a previously started discovery process. If the "id" parameter is not set, the method starts a new Discover Now process and returns its id for tracking the process status. If the "id" parameter is set, the method returns a chunk of log output from the corresponding discovery process; the number of bytes to read is defined by the "read" parameter.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param id: The id of the previously started discovery process required to retrieve its status
:type id: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param DeviceID: ID of the device to re-discover now, if known
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param ip_address: Device IP address to be discovered now
:type ip_address: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param read: The number of bytes to read from the discover now output
:type read: Integer
| ``api version min:`` 2.1.1
| ``api version max:`` None
| ``required:`` False
| ``default:`` False
:param ignore_history_ind: If true, NetMRI will not check for previous instances of this IP address, and therefore will not associate this IP address with the associated device history.
:type ignore_history_ind: Boolean
| ``api version min:`` 2.4
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param mac_address: Device MAC address to be discovered now
:type mac_address: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return id: ID for further tracking the process status
:rtype id: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return read: The number of bytes to read
:rtype read: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return output: Chunk of log output of the corresponding discovery process
:rtype output: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return status: Status of the remaining output data: 0 - no data, 1 - data is available
:rtype status: Integer
"""
return self.api_request(self._get_method_fullname("discover_now"), kwargs)
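# Sketch of the two-phase discover_now contract: start a process (no "id"),
# then poll with the returned id until the status flag reports 0 (no data
# remaining). FakeBroker only simulates that contract; its names, the canned
# log text, and the offset-based use of "read" are assumptions for
# illustration, not documented API behaviour.

```python
class FakeBroker:
    """Simulates the id/read/output/status fields documented above."""
    def __init__(self, log):
        self._log = log

    def discover_now(self, **kwargs):
        if "id" not in kwargs:
            return {"id": "job-1"}          # new Discover Now process started
        read = kwargs.get("read", 0)
        chunk = self._log[read:read + 16]   # next chunk of log output
        done = read + len(chunk) >= len(self._log)
        return {"id": kwargs["id"], "read": read + len(chunk),
                "output": chunk, "status": 0 if done else 1}

broker = FakeBroker("interface scan ok\nsnmp collection ok\n")
job = broker.discover_now(DeviceID=42)      # kick off discovery
log, offset = "", 0
while True:
    resp = broker.discover_now(id=job["id"], read=offset)
    log += resp["output"]
    offset = resp["read"]
    if resp["status"] == 0:   # 0 - no data remaining, 1 - data available
        break
print(log)
```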
def abort_discovery(self, **kwargs):
"""Abort an outstanding discover now session
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: ID of the discover now request to abort
:type id: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("abort_discovery"), kwargs)
def override_licenses(self, **kwargs):
"""Overrides a license for all or some devices
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIDs: The list of devices to override licenses
:type DeviceIDs: Array
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param Licensed: The state of the device license override. Valid values are 0 (unlicensed), 1 (licensed), 2 (automatic)
:type Licensed: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SwitchPortManagement: The state of the Switch Port Management license override. Valid values are on or off if the license exists; an empty string is passed if it is not present.
:type SwitchPortManagement: String
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param AccessProvisioning: The state of the Security Device Controller license override. Valid values are on or off if the license exists; an empty string is passed if it is not present.
:type AccessProvisioning: String
**Outputs**
"""
return self.api_request(self._get_method_fullname("override_licenses"), kwargs)
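# Sketch: assembling override_licenses arguments. The 0/1/2 values for
# "Licensed" come straight from the parameter description above;
# LICENSE_STATES and build_override are local conveniences, not part of
# the API.

```python
LICENSE_STATES = {"unlicensed": 0, "licensed": 1, "automatic": 2}

def build_override(device_ids, state, spm="", sdc=""):
    # Empty strings mean "not present", per the parameter descriptions.
    return {"DeviceIDs": device_ids,
            "Licensed": LICENSE_STATES[state],
            "SwitchPortManagement": spm,
            "AccessProvisioning": sdc}

args = build_override([101, 102], "licensed", spm="on")
# broker.override_licenses(**args)  # against a live broker
print(args["Licensed"])
```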
def unmanage(self, **kwargs):
"""Set one or more specified devices to unmanaged
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIDs: The list of devices to set to unmanaged
:type DeviceIDs: Array
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param UnitID: ID of the collector to send the request to, OC only
:type UnitID: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("unmanage"), kwargs)
def update_snmp(self, **kwargs):
"""Update the SNMP read credential for a device
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device to be updated
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param SNMPVersion: The SNMP Version
:type SNMPVersion: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SNMPRead: The SNMP read community
:type SNMPRead: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SNMPAuthPW: The SNMP auth password for SNMPv3
:type SNMPAuthPW: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SNMPAuthProto: The SNMP auth protocol for SNMPv3
:type SNMPAuthProto: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SNMPPrivPW: The SNMP priv password for SNMPv3
:type SNMPPrivPW: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param SNMPPrivProto: The SNMP priv protocol for SNMPv3
:type SNMPPrivProto: String
**Outputs**
"""
return self.api_request(self._get_method_fullname("update_snmp"), kwargs)
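# Sketch: assembling update_snmp arguments for SNMPv2c versus SNMPv3. The
# parameter names are taken from the docstring above; the version-to-fields
# split and all credential values are assumptions about typical usage, not
# documented behaviour.

```python
def snmp_update_args(device_id, version, **creds):
    args = {"DeviceID": device_id, "SNMPVersion": version}
    if version == 3:
        # SNMPv3 adds auth/priv credentials on top of the username (SNMPRead).
        for key in ("SNMPRead", "SNMPAuthPW", "SNMPAuthProto",
                    "SNMPPrivPW", "SNMPPrivProto"):
            if key in creds:
                args[key] = creds[key]
    else:
        # v1/v2c only carry a read community string.
        args["SNMPRead"] = creds.get("SNMPRead", "public")
    return args

v2 = snmp_update_args(7, 2, SNMPRead="mycommunity")
v3 = snmp_update_args(7, 3, SNMPRead="admin", SNMPAuthPW="example-auth",
                      SNMPAuthProto="SHA", SNMPPrivPW="example-priv",
                      SNMPPrivProto="AES")
# broker.update_snmp(**v2)  # against a live broker
```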
def update_cli_credentials(self, **kwargs):
"""Update the CLI credential for a device
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device to be updated
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.SSH.Username: SSH username
:type config.SSH.Username: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.SSH.Password: SSH password
:type config.SSH.Password: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.Telnet.Username: Telnet username
:type config.Telnet.Username: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.Telnet.Password: Telnet password
:type config.Telnet.Password: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.HTTP.Username: HTTP username
:type config.HTTP.Username: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.HTTP.Password: HTTP password
:type config.HTTP.Password: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param config.EnablePassword: Enable password
:type config.EnablePassword: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 22
:param config.SSH.Port: CLI SSH Port
:type config.SSH.Port: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 23
:param config.Telnet.Port: CLI Telnet Port
:type config.Telnet.Port: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("update_cli_credentials"), kwargs)
def cli_guesses(self, **kwargs):
"""Displays the status of all CLI credential guesses for a device
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param filename: Filename for exported data
:type filename: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: The id of the device
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Protocol: Protocol
:rtype Protocol: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Username: Username
:rtype Username: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Password: Password
:rtype Password: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return EnablePassword: Enable Password
:rtype EnablePassword: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: Status
:rtype Status: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Timestamp: Timestamp
:rtype Timestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Port: Port
:rtype Port: Integer
"""
return self.api_request(self._get_method_fullname("cli_guesses"), kwargs)
def snmp_guesses(self, **kwargs):
"""Displays the status of all devices SNMP guesses
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param filename: Filename for exported data
:type filename: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: The id of the device
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Protocol: Protocol
:rtype Protocol: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Username: Username
:rtype Username: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Community: The SNMP community string
:rtype Community: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPAuthPW: The SNMPv3 authentication protocol password
:rtype SNMPAuthPW: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPAuthProto: The SNMPv3 authentication protocol to use with this credential
:rtype SNMPAuthProto: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPPrivPW: The SNMPv3 privacy protocol password
:rtype SNMPPrivPW: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPPrivProto: The SNMPv3 privacy protocol to use with this credential
:rtype SNMPPrivProto: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Status: Status
:rtype Status: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return Timestamp: Timestamp
:rtype Timestamp: DateTime
"""
return self.api_request(self._get_method_fullname("snmp_guesses"), kwargs)
def show_opsec_certificate(self, **kwargs):
"""Return OPSEC device certificate
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device
:type DeviceID: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("show_opsec_certificate"), kwargs)
def opsec_guesses(self, **kwargs):
"""Displays the status of all devices OPSEC guesses
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device
:type DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param filename: Filename for exported data
:type filename: String
**Outputs**
"""
return self.api_request(self._get_method_fullname("opsec_guesses"), kwargs)
def status_management(self, **kwargs):
"""Returns a list of discovery statuses of the device with given id
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceID: The id of the device to get management status
:type DeviceID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return AccessInd: The indicator of collection of the filtering access information
:rtype AccessInd: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialMessage: A message detailing the current CLI credential status
:rtype CLICredentialMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialStatus: Indicates the status of SSH discovery attempt of the device
:rtype CLICredentialStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return CLICredentialTimestamp: The date/time of the last update to the CLI credential status
:rtype CLICredentialTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionEnabled: Whether configuration polling is enabled for this device
:rtype ConfigCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionMessage: A message detailing the current configuration file collection status
:rtype ConfigCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionStatus: Indicates whether configuration files have been successfully collected from the device
:rtype ConfigCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ConfigCollectionTimestamp: The date/time of the last update to the configuration collection status
:rtype ConfigCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DataSourceName: The name of this data source
:rtype DataSourceName: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceFirstOccurrenceTime: The date/time that this device was first seen on the network
:rtype DeviceFirstOccurrenceTime: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupMessage: A message detailing the current device group assignment status
:rtype DeviceGroupMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupStatus: Indicates whether the device has been successfully assigned to a device group
:rtype DeviceGroupStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceGroupTimestamp: The date/time of the last update to the device group assignment status
:rtype DeviceGroupTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceID: Internal system identifier for the discovery status
:rtype DeviceID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceIPDotted: The management IP address of the device, in dotted (or colon-delimited for IPv6) format
:rtype DeviceIPDotted: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DeviceType: The determined device type
:rtype DeviceType: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return DisplayCollector: Display collector
:rtype DisplayCollector: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ExistsMessage: A message detailing the method by which the device is decided to exist
:rtype ExistsMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
       :return ExistsTimestamp: The date/time of the last device existence update
:rtype ExistsTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintEnabled: Whether fingerprinting is enabled for this device
:rtype FingerPrintEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintMessage: A message detailing the current fingerprint status
:rtype FingerPrintMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintStatus: Indicates whether device fingerprinting has been successfully executed for the device
:rtype FingerPrintStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FingerPrintTimestamp: The date/time of the last update to the fingerprint status
:rtype FingerPrintTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return FirstSeen: The date and time this record was first seen
:rtype FirstSeen: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ForwardingInd: The indicator of collection of forwarding information
:rtype ForwardingInd: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastAction: The last action (from the set in this model) executed against this device
:rtype LastAction: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastSeen: The timestamp of when data was last polled from this device
:rtype LastSeen: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LastTimestamp: The date/time of the last action
:rtype LastTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LicenseOverride: Licensing setting. Requested unlicensed: 0, requested licensed: 1, automatic: 2
:rtype LicenseOverride: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return LicenseStatus: Licenses applied
:rtype LicenseStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NetBIOSMessage: Message detailing current accessibility via NetBIOS
:rtype NetBIOSMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
       :return NetBIOSStatus: Indicates whether device is accessible via NetBIOS
:rtype NetBIOSStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return NetBIOSTimestamp: The date/time of the last successful NetBIOS discovery attempt
:rtype NetBIOSTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return PingMessage: Message detailing current pingable status
:rtype PingMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return PingStatus: Indicates whether device is pingable
:rtype PingStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return PingTimestamp: The date/time of the last successful ping
:rtype PingTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ReachableMessage: A message detailing the current ACL/rule list analysis status
:rtype ReachableMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
       :return ReachableStatus: Indicates whether device is reachable according to its ACL/rule list
:rtype ReachableStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ReachableTimestamp: The date/time of the last update to the reachable status
:rtype ReachableTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisMessage: A message detailing the current ACL/rule list analysis status
:rtype RuleListAnalysisMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisStatus: Indicates whether ACLs/rule lists have been successfully collected from the device
:rtype RuleListAnalysisStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return RuleListAnalysisTimestamp: The date/time of the last update to the rule list status
:rtype RuleListAnalysisTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SAMLicensedInd: Licensed for Security Device Controller
:rtype SAMLicensedInd: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionMessage: SDN collection message
:rtype SDNCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionStatus: Indicates whether SDN data has been successfully collected for the device
:rtype SDNCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SDNCollectionTimestamp: The date/time of the last update to the SDN collection status
:rtype SDNCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
       :return SNMPCollectionEnabled: Whether SNMP collection is enabled for this device
:rtype SNMPCollectionEnabled: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionMessage: A message detailing the current SNMP collection status
:rtype SNMPCollectionMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionStatus: Indicates whether SNMP data has been successfully collected from the device
:rtype SNMPCollectionStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCollectionTimestamp: The date/time of the last update to the SNMP collection status
:rtype SNMPCollectionTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialMessage: A message detailing the current SNMP credential status
:rtype SNMPCredentialMessage: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialStatus: Indicates the status of SNMP discovery attempt of the device
:rtype SNMPCredentialStatus: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SNMPCredentialTimestamp: The date/time of the last update to the SNMP credential status
:rtype SNMPCredentialTimestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return SPMLicensedInd: Licensed for Switch Port Manager
:rtype SPMLicensedInd: Boolean
"""
return self.api_request(self._get_method_fullname("status_management"), kwargs)
def setup_summary(self, **kwargs):
"""Discovery Wizard setup summary information
**Inputs**
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return automatic_discovery_enabled: Automatic discovery enabled
:rtype automatic_discovery_enabled: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return ranges_reasonably_small: Ranges reasonably small
:rtype ranges_reasonably_small: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return seed_exists: Seed exists
:rtype seed_exists: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return cli_credentials_exist: CLI credentials exist
:rtype cli_credentials_exist: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return snmp_credentials_exist: SNMP credentials exist
:rtype snmp_credentials_exist: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return at_least_one_device_found: At least one device found
:rtype at_least_one_device_found: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return at_least_one_device_fingerprinted: At least one device fingerprinted
:rtype at_least_one_device_fingerprinted: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return at_least_one_device_fully_discovered: At least one device fully discovered
:rtype at_least_one_device_fully_discovered: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return snmp_collected_from_one_device: SNMP collected from one device
:rtype snmp_collected_from_one_device: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return sdn_collected_from_one_device: SDN collected from one device
:rtype sdn_collected_from_one_device: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return config_collected_from_one_device: Config collected from one device
:rtype config_collected_from_one_device: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return snmp_collection_enabled: SNMP collection enabled
:rtype snmp_collection_enabled: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return config_collection_enabled: Config collection enabled
:rtype config_collection_enabled: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return finger_print_enabled: Fingerprint enabled
:rtype finger_print_enabled: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return sdn_collection_enabled: SDN collection enabled
:rtype sdn_collection_enabled: Boolean
"""
return self.api_request(self._get_method_fullname("setup_summary"), kwargs)
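All of the broker methods above share one dispatch pattern: keyword arguments are forwarded verbatim to `api_request` under the controller-qualified method name. A minimal, self-contained sketch of that pattern (the `StubBroker` class, its `controller` value, and the echoed response are hypothetical, for illustration only; the real broker issues an HTTP request here):

```python
class StubBroker:
    """Hypothetical stand-in for the generated broker class above."""

    controller = "discovery_statuses"  # assumed controller name

    def _get_method_fullname(self, method):
        # Mirrors the real broker: controller-qualified method name.
        return "{}/{}".format(self.controller, method)

    def api_request(self, method_fullname, params):
        # The real client performs an HTTP call here; we just echo the routing.
        return {"method": method_fullname, "params": params}

    def cli_guesses(self, **kwargs):
        return self.api_request(self._get_method_fullname("cli_guesses"), kwargs)


broker = StubBroker()
result = broker.cli_guesses(DeviceID=42)
print(result["method"])              # discovery_statuses/cli_guesses
print(result["params"]["DeviceID"])  # 42
```

The only per-method difference in the generated code is the method name string; every input listed in a docstring travels as a plain keyword argument.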
import tensorflow as tf
from tensorflow.keras import Model, Sequential
from tensorflow.keras.layers import Conv1D, Conv2D, Input, Add
from tensorflow.keras.layers import Activation, Lambda, Dense, Dropout
from tensorflow.keras.layers import BatchNormalization, Flatten
from tensorflow.keras.layers import AveragePooling2D, AveragePooling1D
from tensorflow.keras.layers import SeparableConv2D, Concatenate
from tensorflow.keras.layers import DepthwiseConv2D
from tensorflow.keras.regularizers import l2
from tensorflow import keras
class DilatedBlock(Model):
"""
Creates a single causal dilated convolution layer
|-> [gate] -| |-> 1x1 conv -> skip output
| |-> (*) -|
input -|-> [filter] -| |-> 1x1 conv -|
| |-> (+) -> dense output
|------------------------------------|
"""
def __init__(self, dilation, output_width, residual_channels,
dilation_channels, skip_channels, use_biases, regularizer,
last_layer, **kwargs):
super(DilatedBlock, self).__init__(**kwargs)
self.output_width = output_width
self.conv_filter = Conv1D(dilation_channels,
2,
dilation_rate=dilation,
padding='causal',
activation='tanh',
use_bias=use_biases,
kernel_regularizer=l2(l=regularizer),
bias_regularizer=l2(l=regularizer))
self.conv_gate = Conv1D(dilation_channels,
2,
dilation_rate=dilation,
padding='causal',
activation='sigmoid',
use_bias=use_biases,
kernel_regularizer=l2(l=regularizer),
bias_regularizer=l2(l=regularizer))
self.transformed = Conv1D(residual_channels,
1,
padding='same',
use_bias=use_biases,
kernel_regularizer=l2(l=regularizer),
bias_regularizer=l2(l=regularizer))
self.skip_contribution = Conv1D(skip_channels,
1,
padding='same',
use_bias=use_biases,
kernel_regularizer=l2(l=regularizer),
bias_regularizer=l2(l=regularizer))
self.last_layer = last_layer
    def call(self, inputs, training=None):
        # Gated activation unit: a tanh filter modulated by a sigmoid gate.
        # [b, sample, dilation_channels]
        filters = self.conv_filter(inputs)
        gates = self.conv_gate(inputs)
        out = filters * gates
        # The 1x1 conv to produce the skip output, cropped to the output width.
        skip_cut = tf.shape(out)[1] - self.output_width
        out_skip = tf.slice(out, [0, skip_cut, 0], [-1, -1, -1])
        skip_contribution = self.skip_contribution(out_skip)
        if self.last_layer:
            return skip_contribution
        # The 1x1 conv to produce the residual output.
        transformed = self.transformed(out)
        return skip_contribution, inputs + transformed
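The filter/gate pair in `DilatedBlock` is WaveNet's gated activation unit, `z = tanh(W_f * x) * sigmoid(W_g * x)`. A dependency-free sketch of the element-wise part, with toy numbers standing in for the two convolutions' outputs (no learned weights involved):

```python
import math


def gated_activation(filter_out, gate_out):
    """Element-wise tanh(filter) * sigmoid(gate), as in DilatedBlock.call."""
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    return [math.tanh(f) * sigmoid(g) for f, g in zip(filter_out, gate_out)]


# A gate near +inf passes tanh through; a gate near -inf suppresses the value.
out = gated_activation([0.5, -1.2, 2.0], [10.0, 0.0, -10.0])
print([round(v, 3) for v in out])  # [0.462, -0.417, 0.0]
```

The sigmoid gate acts as a learned, per-sample on/off switch over the tanh-bounded filter response, which is why each dilation needs both convolutions.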
class WaveNet(Model):
'''Implements the WaveNet network for generative audio.
Usage (with the architecture as in the DeepMind paper):
dilations = [2**i for i in range(N)] * M
filter_width = 2 # Convolutions just use 2 samples.
residual_channels = 16 # Not specified in the paper.
dilation_channels = 32 # Not specified in the paper.
skip_channels = 16 # Not specified in the paper.
'''
def __init__(self, batch_size, dilations, filter_width, signal_length,
residual_channels, dilation_channels, skip_channels,
quantization_channels, use_biases, regularizer):
'''Initializes the WaveNet model.
Args:
batch_size: How many audio files are supplied per batch
(recommended: 1).
dilations: A list with the dilation factor for each layer.
filter_width: The samples that are included in each convolution,
after dilating.
signal_length: The length of input wave.
residual_channels: How many filters to learn for the residual.
dilation_channels: How many filters to learn for the dilated
convolution.
skip_channels: How many filters to learn that contribute to the
quantized softmax output.
quantization_channels: How many amplitude values to use for audio
quantization and the corresponding one-hot encoding.
Default: 256 (8-bit quantization).
use_biases: Whether to add a bias layer to each convolution.
Default: False.
            regularizer: Regularization weight.
TODO:
scalar_input: Whether to use the quantized waveform directly as
input to the network instead of one-hot encoding it.
Default: False.
initial_filter_width: The width of the initial filter of the
convolution applied to the scalar input. This is only relevant
if scalar_input=True.
global_condition_channels: Number of channels in (embedding
size) of global conditioning vector. None indicates there is
no global conditioning.
global_condition_cardinality: Number of mutually exclusive
categories to be embedded in global condition embedding. If
not None, then this implies that global_condition tensor
specifies an integer selecting which of the N global condition
categories, where N = global_condition_cardinality. If None,
then the global_condition tensor is regarded as a vector which
must have dimension global_condition_channels.
'''
super(WaveNet, self).__init__()
self.batch_size = batch_size
self.dilations = dilations
self.filter_width = filter_width
self.residual_channels = residual_channels
self.dilation_channels = dilation_channels
self.skip_channels = skip_channels
self.quantization_channels = quantization_channels
self.use_biases = use_biases
self.regularizer = regularizer
self.signal_length = signal_length
self.receptive_field = WaveNet.calculate_receptive_field(
self.filter_width, self.dilations)
self.output_width = WaveNet.calculate_output_width(
self.signal_length, self.receptive_field)
self.pre_block = self._build_preprocess_block()
self.residual_blocks = self._build_residual_blocks()
self.post_block = self._build_postprocess_block()
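The `quantization_channels` argument (default 256) matches the 8-bit quantization the docstring mentions; the original WaveNet pairs it with mu-law companding before one-hot encoding. That companding step is not part of this file, so the sketch below is purely illustrative of what a 256-way quantizer does:

```python
import math


def mu_law_encode(x, quantization_channels=256):
    """Map x in [-1, 1] to an integer bin via mu-law companding (illustrative)."""
    mu = quantization_channels - 1
    magnitude = math.log1p(mu * abs(x)) / math.log1p(mu)
    signal = math.copysign(magnitude, x)      # companded value in [-1, 1]
    return int((signal + 1) / 2 * mu + 0.5)   # bin index in [0, mu]


print(mu_law_encode(0.0))   # 128 (mid-scale)
print(mu_law_encode(1.0))   # 255
print(mu_law_encode(-1.0))  # 0
```

Companding allocates more bins near zero, where audio amplitudes concentrate, so the final softmax over `quantization_channels` classes spends its capacity where it matters.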
@staticmethod
def calculate_receptive_field(filter_width, dilations):
return (filter_width - 1) * sum(dilations) + filter_width
@staticmethod
def calculate_output_width(signal_length, receptive_field):
return signal_length - receptive_field + 1
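The two static helpers above fix the network's geometry before any layer is built. Their arithmetic can be reproduced without TensorFlow; here is a small sketch for the canonical stack `dilations = [2**i for i in range(N)] * M` from the class docstring (N=4, M=2 chosen for illustration):

```python
def receptive_field(filter_width, dilations):
    # Same formula as WaveNet.calculate_receptive_field above.
    return (filter_width - 1) * sum(dilations) + filter_width


def output_width(signal_length, field):
    # Same formula as WaveNet.calculate_output_width above.
    return signal_length - field + 1


# Two stacks of dilations 1, 2, 4, 8 with 2-sample filters:
dilations = [2 ** i for i in range(4)] * 2
field = receptive_field(2, dilations)
print(field)                      # (2 - 1) * 30 + 2 = 32
print(output_width(1000, field))  # 1000 - 32 + 1 = 969
```

Doubling the dilations grows the receptive field exponentially with depth while the parameter count grows only linearly, which is the core trick of the architecture.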
def _build_preprocess_block(self):
pre_block = Sequential()
pre_block.add(
Conv1D(self.residual_channels,
self.filter_width,
padding='causal',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)))
return pre_block
def _build_residual_blocks(self):
outputs = []
# Add all defined dilation layers.
inputs = Input(shape=(self.signal_length, self.residual_channels))
current_layer = inputs
for i, dilation in enumerate(self.dilations):
if i == len(self.dilations) - 1:
output = DilatedBlock(dilation, self.output_width,
self.residual_channels,
self.dilation_channels,
self.skip_channels, self.use_biases,
self.regularizer, True)(current_layer)
else:
output, current_layer = DilatedBlock(
dilation, self.output_width, self.residual_channels,
self.dilation_channels, self.skip_channels,
self.use_biases, self.regularizer, False)(current_layer)
outputs.append(output)
outputs = Add()(outputs)
return Model(inputs, outputs)
def _build_postprocess_block(self):
post_block = Sequential()
post_block.add(Activation('relu'))
post_block.add(
Conv1D(self.skip_channels,
1,
padding='same',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer),
activation='relu'))
post_block.add(
Conv1D(self.quantization_channels,
1,
padding='same',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)))
# post_block.add(Activation('softmax'))
return post_block
def call(self, inputs, training=None):
x = self.pre_block(inputs, training=training)
x = self.residual_blocks(x, training=training)
x = self.post_block(x, training=training)
return x
class EEGWaveNetv1(Model):
'''
Implements the WaveNet network for EEG classification.
    The shape of inputs must be [batch_size, data_channels, signal_length, 1].
    Set the TensorFlow data format to channels-last.
'''
def __init__(self, signal_length, data_channels, dilations, filter_width,
residual_channels, dilation_channels, skip_channels,
use_biases, regularizer):
'''
Initializes the EEGWaveNet model.
Args:
            signal_length: The length (number of samples) of the raw data.
            data_channels: The number of channels in the raw data.
dilations: A list with the dilation factor for each layer.
filter_width: The samples that are included in each convolution,
after dilating.
residual_channels: How many filters to learn for the residual.
dilation_channels: How many filters to learn for the dilated
convolution.
skip_channels: How many filters to learn that contribute to the
quantized softmax output.
use_biases: Whether to add a bias layer to each convolution.
Default: False.
            regularizer: Regularization weight.
'''
super(EEGWaveNetv1, self).__init__()
self.signal_length = signal_length
self.data_channels = data_channels
self.dilations = dilations
self.filter_width = filter_width
self.residual_channels = residual_channels
self.dilation_channels = dilation_channels
self.skip_channels = skip_channels
self.use_biases = use_biases
self.regularizer = regularizer
self.pre_block = self._build_preprocess_block()
self.residual_blocks = self._build_residual_blocks()
self.post_block = self._build_postprocess_block()
def _build_preprocess_block(self):
# [batch_size, data_channels, signal_length, 1]
pre_block = Sequential(
[
Input((self.data_channels, self.signal_length, 1)),
Conv2D(self.residual_channels, (1, self.filter_width),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
# [batch_size, data_channels, signal_length, residual_channels]
Conv2D(self.residual_channels, (self.data_channels, 1),
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(axis=-1),
Activation('elu'),
# [batch_size, 1, signal_length, residual_channels]
Lambda(tf.squeeze, arguments=dict(axis=1))
],
name='preprocess_block')
# [batch_size, signal_length, residual_channels]
return pre_block
def _build_residual_blocks(self):
def single_block(inputs):
# [b, sample, residual_channels]
filters = Conv1D(self.dilation_channels,
self.filter_width,
dilation_rate=dilation,
padding='causal',
activation='tanh',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
gates = Conv1D(self.dilation_channels,
self.filter_width,
dilation_rate=dilation,
padding='causal',
activation='sigmoid',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
out = filters * gates
skip_contribution = Conv1D(
self.skip_channels,
1,
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
if last_layer:
return skip_contribution, None
else:
transformed = Conv1D(
self.residual_channels,
1,
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
return skip_contribution, inputs + transformed
outputs = []
# Add all defined dilation layers.
# [batch_size, signal_length, residual_channels]
inputs = Input(shape=(self.signal_length, self.residual_channels))
current_layer = inputs
for i, dilation in enumerate(self.dilations):
last_layer = (i == (len(self.dilations) - 1))
output, current_layer = single_block(current_layer)
outputs.append(output)
outputs = Add()(outputs)
# [batch_size, signal_length, skip_channels]
return Model(inputs, outputs, name='residual_blocks')
def _build_postprocess_block(self):
post_block = Sequential([
Input((self.signal_length, self.skip_channels)),
BatchNormalization(),
Activation('elu'),
Dropout(0.5),
Conv1D(16,
1,
padding='same',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
Dropout(0.5),
AveragePooling1D(pool_size=4),
Conv1D(8,
3,
padding='valid',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
Dropout(0.5),
AveragePooling1D(pool_size=4),
Conv1D(4,
1,
padding='same',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
Flatten(),
Dense(2,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))
])
return post_block
def call(self, inputs, training=None):
x = self.pre_block(inputs, training=training)
x = self.residual_blocks(x, training=training)
x = self.post_block(x, training=training)
return x
class EEGWaveNetv2(Model):
'''
Implements the WaveNet network for EEG classification.
The shape of inputs must be [batch_size, data_channels, signal_length, 1].
Set the TensorFlow data format to channels-last.
'''
def __init__(self, signal_length, data_channels, dilations, filter_width,
residual_channels, dilation_channels, skip_channels,
use_biases, regularizer):
'''
Initializes the EEGWaveNet model.
Args:
signal_length: Length of the raw data signal (number of samples).
data_channels: Number of channels in the raw data.
dilations: A list with the dilation factor for each layer.
filter_width: The samples that are included in each convolution,
after dilating.
residual_channels: How many filters to learn for the residual.
dilation_channels: How many filters to learn for the dilated
convolution.
skip_channels: How many filters to learn that contribute to the
quantized softmax output.
use_biases: Whether to add a bias term to each convolution.
regularizer: L2 regularization weight.
'''
super(EEGWaveNetv2, self).__init__()
self.signal_length = signal_length
self.data_channels = data_channels
self.dilations = dilations
self.filter_width = filter_width
self.residual_channels = residual_channels
self.dilation_channels = dilation_channels
self.skip_channels = skip_channels
self.use_biases = use_biases
self.regularizer = regularizer
self.pre_block = self._build_preprocess_block()
self.residual_blocks = self._build_residual_blocks()
self.post_block = self._build_postprocess_block()
def _build_preprocess_block(self):
pre_block = Sequential(
[ # [batch_size, data_channels, signal_length, 1]
Input((self.data_channels, self.signal_length, 1)),
Conv2D(self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
# Dropout(0.2),
# [batch_size, data_channels, signal_length, residual_channels]
],
name='preprocess_block')
# Output: [batch_size, data_channels, signal_length, residual_channels]
return pre_block
def _build_residual_blocks(self):
def single_block(inputs):
# inputs: [batch_size, data_channels, signal_length, residual_channels]
filters = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='tanh',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
gates = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='sigmoid',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
out = filters * gates
skip_contribution = Conv2D(
self.skip_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
if last_layer:
return skip_contribution, None
else:
transformed = Conv2D(
self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
return skip_contribution, inputs + transformed
outputs = []
# Add all defined dilation layers.
# [batch_size, data_channels, signal_length, residual_channels]
inputs = Input(shape=(self.data_channels, self.signal_length,
self.residual_channels))
current_layer = inputs
for i, dilation in enumerate(self.dilations):
last_layer = (i == (len(self.dilations) - 1))
output, current_layer = single_block(current_layer)
outputs.append(output)
outputs = Add()(outputs)
# Add() sums the skip branches: [batch_size, data_channels, signal_length, skip_channels]
return Model(inputs, outputs, name='residual_blocks')
def _build_postprocess_block(self):
post_block = Sequential([
Input(
(self.data_channels, self.signal_length, self.skip_channels)),
BatchNormalization(),
Activation('elu'),
# Dropout(0.2),
Conv2D(self.skip_channels * 2, (1, 1),
padding='same',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
# Dropout(0.2),
AveragePooling2D(pool_size=(1, 8)),
Conv2D(self.skip_channels * 4, (self.data_channels, 3),
padding='valid',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
# Dropout(0.2),
Flatten(),
Dense(200,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
Dropout(0.2),
Dense(2,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))
])
return post_block
def call(self, inputs, training=None):
x = self.pre_block(inputs, training=training)
x = self.residual_blocks(x, training=training)
x = self.post_block(x, training=training)
return x
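Each residual block above multiplies a tanh "filter" branch by a sigmoid "gate" branch — the WaveNet gated activation unit. As a hedged scalar sketch (standard library only; the function name is mine, not part of the model code):

```python
import math

def gated_activation(filter_preact, gate_preact):
    """WaveNet gating: tanh(filter) scaled by sigmoid(gate) in [0, 1]."""
    return math.tanh(filter_preact) * (1.0 / (1.0 + math.exp(-gate_preact)))

# A strongly negative gate pre-activation shuts the unit off,
# while a strongly positive one passes tanh(filter) through almost unchanged.
off = gated_activation(1.0, -10.0)
on = gated_activation(1.0, 10.0)
```

In the layers above the same product is computed elementwise over whole feature maps (`out = filters * gates`); the sigmoid branch learns how much of each tanh response to let into the skip and residual paths.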
class EEGWaveNetv3(Model):
'''
Implements the WaveNet network for EEG classification.
The shape of inputs must be [batch_size, data_channels, signal_length, 1].
Set the TensorFlow data format to channels-last.
'''
def __init__(self, signal_length, data_channels, dilations, filter_width,
residual_channels, dilation_channels, skip_channels,
use_biases, regularizer):
'''
Initializes the EEGWaveNet model.
Args:
signal_length: Length of the raw data signal (number of samples).
data_channels: Number of channels in the raw data.
dilations: A list with the dilation factor for each layer.
filter_width: The samples that are included in each convolution,
after dilating.
residual_channels: How many filters to learn for the residual.
dilation_channels: How many filters to learn for the dilated
convolution.
skip_channels: How many filters to learn that contribute to the
quantized softmax output.
use_biases: Whether to add a bias term to each convolution.
regularizer: L2 regularization weight.
'''
super(EEGWaveNetv3, self).__init__()
self.signal_length = signal_length
self.data_channels = data_channels
self.dilations = dilations
self.filter_width = filter_width
self.residual_channels = residual_channels
self.dilation_channels = dilation_channels
self.skip_channels = skip_channels
self.use_biases = use_biases
self.regularizer = regularizer
self.pre_block = self._build_preprocess_block()
self.residual_blocks = self._build_residual_blocks()
self.post_block = self._build_postprocess_block()
def _build_preprocess_block(self):
pre_block = Sequential(
[ # [batch_size, data_channels, signal_length, 1]
Input((self.data_channels, self.signal_length, 1)),
Conv2D(self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
Dropout(0.2),
# [batch_size, data_channels, signal_length, residual_channels]
],
name='preprocess_block')
# Output: [batch_size, data_channels, signal_length, residual_channels]
return pre_block
def _build_residual_blocks(self):
def single_block(inputs):
# inputs: [batch_size, data_channels, signal_length, residual_channels]
filters = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='tanh',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
gates = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='sigmoid',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
out = filters * gates
skip_contribution = Conv2D(
self.skip_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
if last_layer:
return skip_contribution, None
else:
transformed = Conv2D(
self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
return skip_contribution, inputs + transformed
outputs = []
# Add all defined dilation layers.
# [batch_size, data_channels, signal_length, residual_channels]
inputs = Input(shape=(self.data_channels, self.signal_length,
self.residual_channels))
current_layer = inputs
for i, dilation in enumerate(self.dilations):
last_layer = (i == (len(self.dilations) - 1))
output, current_layer = single_block(current_layer)
outputs.append(output)
outputs = Concatenate()(outputs)
# [batch_size, data_channels, signal_length, skip_channels * len(dilations)]
return Model(inputs, outputs, name='residual_blocks')
def _build_postprocess_block(self):
post_block = Sequential([
Input((self.data_channels, self.signal_length,
self.skip_channels * len(self.dilations))),
BatchNormalization(),
Activation('elu'),
Dropout(0.2),
DepthwiseConv2D((self.data_channels, 1),
use_bias=self.use_biases,
depthwise_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
AveragePooling2D((1, 4)),
Dropout(0.2),
SeparableConv2D(self.skip_channels * len(self.dilations) * 2,
(1, 3),
padding='valid',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
AveragePooling2D((1, 8)),
Dropout(0.2),
Flatten(),
# Dense(200,
# kernel_regularizer=l2(l=self.regularizer),
# bias_regularizer=l2(l=self.regularizer)),
# BatchNormalization(),
# Activation('elu'),
# Dropout(0.2),
Dense(2,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))
])
return post_block
def call(self, inputs, training=None):
x = self.pre_block(inputs, training=training)
x = self.residual_blocks(x, training=training)
x = self.post_block(x, training=training)
return x
class EEGWaveNetv4(Model):
'''
Implements the WaveNet network for EEG classification.
The shape of inputs must be [batch_size, data_channels, signal_length, 1].
Set the TensorFlow data format to channels-last.
'''
def __init__(self, signal_length, data_channels, dilations, filter_width,
residual_channels, dilation_channels, skip_channels,
use_biases, regularizer):
'''
Initializes the EEGWaveNet model.
Args:
signal_length: Length of the raw data signal (number of samples).
data_channels: Number of channels in the raw data.
dilations: A list with the dilation factor for each layer.
filter_width: The samples that are included in each convolution,
after dilating.
residual_channels: How many filters to learn for the residual.
dilation_channels: How many filters to learn for the dilated
convolution.
skip_channels: How many filters to learn that contribute to the
quantized softmax output.
use_biases: Whether to add a bias term to each convolution.
regularizer: L2 regularization weight.
'''
super(EEGWaveNetv4, self).__init__()
self.signal_length = signal_length
self.data_channels = data_channels
self.dilations = dilations
self.filter_width = filter_width
self.residual_channels = residual_channels
self.dilation_channels = dilation_channels
self.skip_channels = skip_channels
self.use_biases = use_biases
self.regularizer = regularizer
self.pre_block = self._build_preprocess_block()
self.residual_blocks = self._build_residual_blocks()
self.post_block = self._build_postprocess_block()
def _build_preprocess_block(self):
pre_block = Sequential(
[ # [batch_size, data_channels, signal_length, 1]
Input((self.data_channels, self.signal_length, 1)),
Conv2D(self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
Dropout(0.5),
# [batch_size, data_channels, signal_length, residual_channels]
],
name='preprocess_block')
# Output: [batch_size, data_channels, signal_length, residual_channels]
return pre_block
def _build_residual_blocks(self):
def single_block(inputs):
# inputs: [batch_size, data_channels, signal_length, residual_channels]
filters = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='tanh',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
gates = Conv2D(self.dilation_channels, (1, self.filter_width),
dilation_rate=dilation,
padding='same',
activation='sigmoid',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(inputs)
out = filters * gates
skip_contribution = Conv2D(
self.skip_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
if last_layer:
return skip_contribution, None
else:
transformed = Conv2D(
self.residual_channels, (1, 1),
padding='same',
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))(out)
return skip_contribution, inputs + transformed
outputs = []
# Add all defined dilation layers.
# [batch_size, data_channels, signal_length, residual_channels]
inputs = Input(shape=(self.data_channels, self.signal_length,
self.residual_channels))
current_layer = inputs
for i, dilation in enumerate(self.dilations):
last_layer = (i == (len(self.dilations) - 1))
output, current_layer = single_block(current_layer)
outputs.append(output)
outputs = Concatenate()(outputs)
# [batch_size, data_channels, signal_length, skip_channels * len(dilations)]
return Model(inputs, outputs, name='residual_blocks')
def _build_postprocess_block(self):
post_block = Sequential([
Input((self.data_channels, self.signal_length,
self.skip_channels * len(self.dilations))),
BatchNormalization(),
Activation('elu'),
Dropout(0.5),
DepthwiseConv2D((self.data_channels, 1),
use_bias=self.use_biases,
depthwise_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
AveragePooling2D((1, 4)),
Dropout(0.5),
SeparableConv2D(self.skip_channels * len(self.dilations) * 2,
(1, 7),
padding='valid',
strides=1,
use_bias=self.use_biases,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer)),
BatchNormalization(),
Activation('elu'),
AveragePooling2D((1, 8)),
Dropout(0.5),
Flatten(),
# Dense(200,
# kernel_regularizer=l2(l=self.regularizer),
# bias_regularizer=l2(l=self.regularizer)),
# BatchNormalization(),
# Activation('elu'),
# Dropout(0.2),
Dense(2,
kernel_regularizer=l2(l=self.regularizer),
bias_regularizer=l2(l=self.regularizer))
])
return post_block
def call(self, inputs, training=None):
x = self.pre_block(inputs, training=training)
x = self.residual_blocks(x, training=training)
x = self.post_block(x, training=training)
return x
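The docstrings describe `dilations` and `filter_width` but never state the temporal context they buy. As a hedged, standalone sketch (plain Python, no TensorFlow; the helper name is my own), the receptive field of a stack of dilated convolutions grows by `(filter_width - 1) * dilation` samples per layer:

```python
def receptive_field(filter_width, dilations):
    """Temporal receptive field of stacked dilated convolutions.

    Each layer with dilation d and kernel width k widens the
    receptive field by (k - 1) * d samples; a single sample is
    the base case with no layers at all.
    """
    return 1 + sum((filter_width - 1) * d for d in dilations)

# A typical WaveNet-style schedule doubles the dilation per layer:
# 1 + (1 + 2 + 4 + 8 + 16) = 32 samples of context.
print(receptive_field(2, [1, 2, 4, 8, 16]))
```

This is why the `dilations` list is usually a doubling sequence: the receptive field grows exponentially with depth while the parameter count grows only linearly.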
# File: tests/test_ses_pages.py (repo: jasonjorge/infi.asi, license: BSD-3-Clause)
import binascii
import unittest
from infi.asi.cdb.diagnostic.ses_pages import get_ses_page, get_ses_page_data
# Information received from Sanmina enclosure, firmware 0509:
RECEIVE_DIAGNOSTICS_RESULT_PAGE_00 = "00 00 00 09 00 01 02 04 05 07 0a 0e 80"
RECEIVE_DIAGNOSTICS_RESULT_PAGE_01 = (
"00 00 00 01 12 00 08 25 50 00 93 d0 01 3c 40 00"
"4e 45 57 49 53 59 53 20 4e 44 53 2d 34 36 30 30"
"2d 4a 44 20 20 20 20 20 30 35 30 39 05 17 3c 00"
"10 9e 01 00 10 02 02 00 10 03 04 00 10 04 06 00"
"10 06 01 00 10 07 02 00 10 18 06 00 10 41 72 72"
"61 79 20 44 65 76 20 53 6c 6f 74 20 20 34 36 30"
"30 20 45 6e 63 6c 6f 73 75 72 65 20 20 50 6f 77"
"65 72 20 53 75 70 70 6c 79 20 20 20 20 43 6f 6f"
"6c 69 6e 67 20 46 61 6e 20 20 20 20 20 54 65 6d"
"70 20 53 65 6e 73 6f 72 20 20 20 20 20 42 75 7a"
"7a 65 72 20 20 20 20 20 20 20 20 20 20 45 53 20"
"50 72 6f 63 65 73 73 6f 72 20 20 20 20 53 41 53"
"20 45 78 70 61 6e 64 65 72 20 20 20 20 "
)
RECEIVE_DIAGNOSTICS_RESULT_PAGE_01_ALT = (
"01 00 00 cd 00 00 00 01 12 00 08 25 50 00 93 d0"
"01 45 40 00 4e 45 57 49 53 59 53 20 4e 44 53 2d"
"34 36 30 30 2d 4a 44 20 20 20 20 20 30 35 30 39"
"01 17 3c 00 10 9e 01 00 10 02 02 00 10 03 04 00"
"10 04 06 00 10 06 01 00 10 07 02 00 10 18 06 00"
"10 41 72 72 61 79 20 44 65 76 20 53 6c 6f 74 20"
"20 34 36 30 30 20 45 6e 63 6c 6f 73 75 72 65 20"
"20 50 6f 77 65 72 20 53 75 70 70 6c 79 20 20 20"
"20 43 6f 6f 6c 69 6e 67 20 46 61 6e 20 20 20 20"
"20 54 65 6d 70 20 53 65 6e 73 6f 72 20 20 20 20"
"20 42 75 7a 7a 65 72 20 20 20 20 20 20 20 20 20"
"20 45 53 20 50 72 6f 63 65 73 73 6f 72 20 20 20"
"20 53 41 53 20 45 78 70 61 6e 64 65 72 20 20 20"
"20"
)
RECEIVE_DIAGNOSTICS_RESULT_PAGE_02 = (
"00 00 00 01 00 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 00 00 00 00 01 00 00 00"
"00 00 00 20 01 00 00 20 01 00 00 20 00 00 00 00"
"01 02 bc 22 01 02 c1 22 01 02 cb 22 01 02 c3 22"
"00 00 00 00 01 00 34 00 01 00 33 00 01 00 36 00"
"01 00 37 00 01 00 2b 00 01 00 2a 00 20 00 00 00"
"21 00 00 00 00 00 00 00 01 00 01 00 01 00 00 00"
"00 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00"
"01 00 00 00 01 00 00 00 01 00 00 00 "
)
RECEIVE_DIAGNOSTICS_RESULT_PAGE_04 = (
"04 00 00 4c 00 00 1c 7d 01 00 00 04 00 00 01 00"
"01 01 00 00 01 05 00 00 00 00 00 00 00 00 00 00"
"00 00 00 00 0a 30 30 40 0a 30 30 41 ff ff ff 00"
"00 00 00 00 00 00 00 00 00 01 00 00 02 00 02 00"
"01 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00"
)
RECEIVE_DIAGNOSTICS_RESULT_PAGE_07 = (
"07 00 09 20 00 00 00 01 00 00 00 00 00 00 00 0c"
"53 4c 4f 54 20 20 31 20 30 30 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 32 20 30 31 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 33 20 30 32 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 34 20 30 33 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 35 20 30 34 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 36 20 30 35 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 37 20 30 36 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 38 20 30 37 20 20 00 00 00 0c"
"53 4c 4f 54 20 20 39 20 30 38 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 30 20 30 39 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 31 20 30 41 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 32 20 30 42 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 33 20 31 30 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 34 20 31 31 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 35 20 31 32 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 36 20 31 33 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 37 20 31 34 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 38 20 31 35 20 20 00 00 00 0c"
"53 4c 4f 54 20 31 39 20 31 36 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 30 20 31 37 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 31 20 31 38 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 32 20 31 39 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 33 20 31 41 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 34 20 31 42 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 35 20 32 30 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 36 20 32 31 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 37 20 32 32 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 38 20 32 33 20 20 00 00 00 0c"
"53 4c 4f 54 20 32 39 20 32 34 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 30 20 32 35 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 31 20 32 36 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 32 20 32 37 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 33 20 32 38 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 34 20 32 39 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 35 20 32 41 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 36 20 32 42 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 37 20 33 30 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 38 20 33 31 20 20 00 00 00 0c"
"53 4c 4f 54 20 33 39 20 33 32 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 30 20 33 33 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 31 20 33 34 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 32 20 33 35 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 33 20 33 36 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 34 20 33 37 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 35 20 33 38 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 36 20 33 39 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 37 20 33 41 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 38 20 33 42 20 20 00 00 00 0c"
"53 4c 4f 54 20 34 39 20 34 30 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 30 20 34 31 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 31 20 34 32 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 32 20 34 33 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 33 20 34 34 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 34 20 34 35 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 35 20 34 36 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 36 20 34 37 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 37 20 34 38 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 38 20 34 39 20 20 00 00 00 0c"
"53 4c 4f 54 20 35 39 20 34 41 20 20 00 00 00 0c"
"53 4c 4f 54 20 36 30 20 34 42 20 20 00 00 00 00"
"00 00 00 a8 20 20 20 20 20 20 20 20 20 20 20 20"
"20 20 20 20 20 20 20 20 20 20 20 20 20 20 20 20"
"20 20 20 20 20 20 20 20 20 20 20 20 20 20 20 20"
"20 20 20 20 20 20 20 20 20 20 20 20 20 20 20 20"
"54 43 41 2d 30 30 33 34 31 2d 30 31 2d 42 2d 41"
"32 20 20 20 4d 58 45 33 34 30 30 30 33 33 51 52"
"42 30 30 33 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 4d 58 45 33 34 30 30 30 32 41 49 52"
"41 30 44 43 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 4d 58 45 33 34 30 30 30 32 38 56 52"
"41 30 45 36 30 30 30 42 30 30 30 31 00 00 00 00"
"00 00 00 5c 50 43 4d 20 41 20 20 20 50 57 52 2d"
"30 30 30 35 39 2d 30 31 2d 42 30 30 54 48 44 45"
"4c 30 30 30 31 33 56 52 45 30 34 33 41 31 30 30"
"30 30 30 30 54 44 50 53 2d 31 32 30 30 41 42 20"
"41 30 30 30 41 42 4d 54 31 31 31 33 30 30 30 36"
"36 36 30 30 53 34 46 30 30 30 30 30 30 31 30 39"
"00 00 00 5c 50 43 4d 20 42 20 20 20 50 57 52 2d"
"30 30 30 35 39 2d 30 31 2d 42 30 30 54 48 44 45"
"4c 30 30 30 31 33 56 52 45 30 33 46 41 31 30 30"
"30 30 30 30 54 44 50 53 2d 31 32 30 30 41 42 20"
"41 30 30 30 41 42 4d 54 31 31 31 33 30 30 30 36"
"36 32 30 30 53 34 46 30 30 30 30 30 30 31 30 39"
"00 00 00 00 00 00 00 28 50 43 4d 41 46 41 4e 31"
"50 57 52 2d 30 30 30 35 39 2d 30 31 2d 42 30 30"
"54 48 44 45 4c 30 30 30 31 33 56 52 45 30 34 33"
"00 00 00 28 50 43 4d 41 46 41 4e 32 50 57 52 2d"
"30 30 30 35 39 2d 30 31 2d 42 30 30 54 48 44 45"
"4c 30 30 30 31 33 56 52 45 30 34 33 00 00 00 28"
"50 43 4d 42 46 41 4e 31 50 57 52 2d 30 30 30 35"
"39 2d 30 31 2d 42 30 30 54 48 44 45 4c 30 30 30"
"31 33 56 52 45 30 33 46 00 00 00 28 50 43 4d 42"
"46 41 4e 32 50 57 52 2d 30 30 30 35 39 2d 30 31"
"2d 42 30 30 54 48 44 45 4c 30 30 30 31 33 56 52"
"45 30 33 46 00 00 00 00 00 00 00 2c 50 43 4d 20"
"41 20 20 20 50 57 52 2d 30 30 30 35 39 2d 30 31"
"2d 42 30 30 20 20 20 20 54 48 44 45 4c 30 30 30"
"31 33 56 52 45 30 34 33 00 00 00 2c 50 43 4d 20"
"42 20 20 20 50 57 52 2d 30 30 30 35 39 2d 30 31"
"2d 42 30 30 20 20 20 20 54 48 44 45 4c 30 30 30"
"31 33 56 52 45 30 33 46 00 00 00 2c 49 4f 4d 20"
"41 20 20 20 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 20 20 20 20 4d 58 45 33 34 30 30 30"
"32 41 49 52 41 30 44 43 00 00 00 2c 49 4f 4d 20"
"42 20 20 20 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 20 20 20 20 4d 58 45 33 34 30 30 30"
"32 38 56 52 41 30 45 36 00 00 00 2c 42 73 42 6f"
"61 72 64 31 54 43 41 2d 30 30 33 34 31 2d 30 31"
"2d 42 2d 41 32 20 20 20 4d 58 45 33 34 30 30 30"
"33 33 51 52 42 30 30 33 00 00 00 2c 42 73 42 6f"
"61 72 64 32 54 43 41 2d 30 30 33 34 31 2d 30 31"
"2d 42 2d 41 32 20 20 20 4d 58 45 33 34 30 30 30"
"33 33 51 52 42 30 30 33 00 00 00 00 00 00 00 2c"
"42 73 42 6f 61 72 64 20 54 43 41 2d 30 30 33 34"
"31 2d 30 31 2d 42 2d 41 32 20 20 20 4d 58 45 33"
"34 30 30 30 33 33 51 52 42 30 30 33 00 00 00 00"
"00 00 00 30 49 4f 4d 20 41 20 20 20 54 43 41 2d"
"30 30 33 34 30 2d 30 31 2d 42 20 20 4d 58 45 33"
"34 30 30 30 32 41 49 52 41 30 44 43 30 35 2e 30"
"39 2e 30 30 00 00 00 30 49 4f 4d 20 42 20 20 20"
"54 43 41 2d 30 30 33 34 30 2d 30 31 2d 42 20 20"
"4d 58 45 33 34 30 30 30 32 38 56 52 41 30 45 36"
"30 35 2e 30 39 2e 30 30 00 00 00 00 00 00 00 38"
"53 58 50 5f 41 5f 50 20 54 43 41 2d 30 30 33 34"
"30 2d 30 31 2d 42 20 20 4d 58 45 33 34 30 30 30"
"32 41 49 52 41 30 44 43 30 35 2e 30 39 2e 30 30"
"30 35 2e 30 39 2e 30 30 00 00 00 38 53 58 50 5f"
"41 5f 53 31 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 4d 58 45 33 34 30 30 30 32 41 49 52"
"41 30 44 43 30 35 2e 30 39 2e 30 30 30 35 2e 30"
"39 2e 30 30 00 00 00 38 53 58 50 5f 41 5f 53 32"
"54 43 41 2d 30 30 33 34 30 2d 30 31 2d 42 20 20"
"4d 58 45 33 34 30 30 30 32 41 49 52 41 30 44 43"
"30 35 2e 30 39 2e 30 30 30 35 2e 30 39 2e 30 30"
"00 00 00 38 53 58 50 5f 42 5f 50 20 54 43 41 2d"
"30 30 33 34 30 2d 30 31 2d 42 20 20 4d 58 45 33"
"34 30 30 30 32 38 56 52 41 30 45 36 30 35 2e 30"
"39 2e 30 30 30 35 2e 30 39 2e 30 30 00 00 00 38"
"53 58 50 5f 42 5f 53 31 54 43 41 2d 30 30 33 34"
"30 2d 30 31 2d 42 20 20 4d 58 45 33 34 30 30 30"
"32 38 56 52 41 30 45 36 30 35 2e 30 39 2e 30 30"
"30 35 2e 30 39 2e 30 30 00 00 00 38 53 58 50 5f"
"42 5f 53 32 54 43 41 2d 30 30 33 34 30 2d 30 31"
"2d 42 20 20 4d 58 45 33 34 30 30 30 32 38 56 52"
"41 30 45 36 30 35 2e 30 39 2e 30 30 30 35 2e 30"
"39 2e 30 30"
)
RECEIVE_DIAGNOSTICS_RESULT_PAGE_80 = (
"80 00 00 80 ff ff ff ff 05 00 ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff"
"01 00 00 00 "
)
def hex_to_bin(s):
return binascii.unhexlify(s.replace(' ', ''))
class TestSESPages(unittest.TestCase):
def test_page_00(self):
from infi.asi.cdb.diagnostic.ses_pages import DIAGNOSTIC_PAGE_SUPPORTED_PAGES
self.unpack_page(DIAGNOSTIC_PAGE_SUPPORTED_PAGES, RECEIVE_DIAGNOSTICS_RESULT_PAGE_00, None)
def test_page_01(self):
self.get_conf_page()
def test_page_01_alt(self):
from infi.asi.cdb.diagnostic.ses_pages import DIAGNOSTIC_PAGE_CONFIGURATION
self.unpack_page(DIAGNOSTIC_PAGE_CONFIGURATION, RECEIVE_DIAGNOSTICS_RESULT_PAGE_01_ALT, None)
def test_page_02(self):
from infi.asi.cdb.diagnostic.ses_pages import DIAGNOSTIC_PAGE_ENCLOSURE_STATUS
self.unpack_page(DIAGNOSTIC_PAGE_ENCLOSURE_STATUS, RECEIVE_DIAGNOSTICS_RESULT_PAGE_02, self.get_conf_page())
def test_page_07(self):
from infi.asi.cdb.diagnostic.ses_pages import DIAGNOSTIC_PAGE_ELEMENT_DESCRIPTOR
self.unpack_page(DIAGNOSTIC_PAGE_ELEMENT_DESCRIPTOR, RECEIVE_DIAGNOSTICS_RESULT_PAGE_07, self.get_conf_page())
def get_conf_page(self):
from infi.asi.cdb.diagnostic.ses_pages import DIAGNOSTIC_PAGE_CONFIGURATION
return self.unpack_page(DIAGNOSTIC_PAGE_CONFIGURATION, RECEIVE_DIAGNOSTICS_RESULT_PAGE_01, None)
def unpack_page(self, page, packed_data, conf_page):
page_class = get_ses_page(page)
data_class = get_ses_page_data(page)
data = data_class(conf_page=conf_page)
data.unpack(hex_to_bin(packed_data))
return data | 48.910072 | 118 | 0.60859 | 3,623 | 13,597 | 2.248413 | 0.038918 | 0.205745 | 0.180457 | 0.113921 | 0.859808 | 0.824454 | 0.821262 | 0.802603 | 0.792536 | 0.789344 | 0 | 0.683675 | 0.34993 | 13,597 | 278 | 119 | 48.910072 | 0.237923 | 0.004339 | 0 | 0.201581 | 0 | 0 | 0.744995 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.031621 | false | 0 | 0.031621 | 0.003953 | 0.079051 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
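As a hedged illustration of the raw layout exercised above (inferred only from the hex literals here, not from the `infi.asi` parsers), the Supported Diagnostic Pages payload is a 4-byte header — page code, a reserved byte, a 2-byte big-endian length — followed by one byte per supported page code:

```python
import binascii

def parse_supported_pages(hex_str):
    # Assumed layout: [page_code, reserved, length_hi, length_lo, codes...]
    raw = binascii.unhexlify(hex_str.replace(' ', ''))
    page_code = raw[0]
    length = int.from_bytes(raw[2:4], 'big')  # bytes the header says follow
    return page_code, sorted(raw[4:4 + length])

code, supported = parse_supported_pages(
    "00 00 00 09 00 01 02 04 05 07 0a 0e 80")
print(hex(code), [hex(p) for p in supported])
```

Read this way, page 00 above advertises nine pages (00, 01, 02, 04, 05, 07, 0a, 0e, 80) — consistent with the page-01/02/04/07/80 fixtures defined in this file.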
# File: tests/mock-data-inzending-voor-toezicht-sparql-output.py (repo: lblod/berichtencentrum-sync-with-kalliope-service, license: MIT)
# TODO: document is this valid data, or meant to trigger an error?
[
{
"inzending": {
"value": "http://data.lblod.info/submissions/2bc74060-9149-11ea-9b69-8543f48a35b0"
},
"bestuurseenheid": {
"value": "http://data.lblod.info/id/bestuurseenheden/974816591f269bb7d74aa1720922651529f3d3b2a787f5c60b73e5a0384950a4"
},
"decisionTypeLabel": {
"value": "Budget"
},
"sessionDate": {
"value": "2020-05-05T11:32:52.976Z"
},
"inzendingUuid": {
"value": "2bc74060-9149-11ea-9b69-8543f48a35b0"
},
"decisionType": {
"value": "https://data.vlaanderen.be/id/concept/BesluitType/40831a2c-771d-4b41-9720-0399998f1873"
}
},
{
"inzending": {
"value": "http://data.lblod.info/submissions/3bc74060-9149-11ea-9b69-8543f48a35b0"
},
"bestuurseenheid": {
"value": "http://data.lblod.info/id/bestuurseenheden/974816591f269bb7d74aa1720922651529f3d3b2a787f5c60b73e5a0384950a4"
},
"decisionTypeLabel": {
"value": "Budget"
},
"sessionDate": {
"value": "2020-06-06T11:32:52.976Z"
},
"inzendingUuid": {
"value": "3bc74060-9149-11ea-9b69-8543f48a35b0"
},
"decisionType": {
"value": "https://data.vlaanderen.be/id/concept/BesluitType/40831a2c-771d-4b41-9720-0399998f1873"
}
}
]
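The fixture above mirrors the shape of SPARQL JSON query results, where every binding is wrapped in a `{"value": ...}` object. A hedged sketch of how such rows are typically flattened before use in tests (the helper name is mine, not from the service):

```python
def flatten_bindings(rows):
    """Collapse SPARQL-style {'var': {'value': ...}} rows to plain dicts."""
    return [{k: v['value'] for k, v in row.items()} for row in rows]

# One row in the same shape as the fixture above.
rows = [{
    "inzendingUuid": {"value": "2bc74060-9149-11ea-9b69-8543f48a35b0"},
    "decisionTypeLabel": {"value": "Budget"},
}]
flat = flatten_bindings(rows)
print(flat[0]["decisionTypeLabel"])  # Budget
```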
| 33.090909 | 129 | 0.578297 | 118 | 1,456 | 7.135593 | 0.432203 | 0.042755 | 0.061758 | 0.085511 | 0.923991 | 0.824228 | 0.824228 | 0.724466 | 0.724466 | 0.724466 | 0 | 0.262911 | 0.268544 | 1,456 | 43 | 130 | 33.860465 | 0.5277 | 0.043956 | 0 | 0.428571 | 0 | 0.047619 | 0.628777 | 0.086331 | 0 | 0 | 0 | 0.023256 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
742bd800561b1273840ab9c2b20d365a447c5e43 | 5,524 | py | Python | tests/pdf2text/test_integration.py | gowikel/savanamed | 97c3b507cff1bca6bf0f6750e1bc9934e8fe6033 | [
"MIT"
] | null | null | null | tests/pdf2text/test_integration.py | gowikel/savanamed | 97c3b507cff1bca6bf0f6750e1bc9934e8fe6033 | [
"MIT"
] | null | null | null | tests/pdf2text/test_integration.py | gowikel/savanamed | 97c3b507cff1bca6bf0f6750e1bc9934e8fe6033 | [
"MIT"
] | null | null | null | from io import BytesIO
def test_post_without_data_returns_an_error(client):
response = client.post('/api/v1/pdf2text/')
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert 'converted_data' in json.keys()
assert json['msg'] == 'There is no data'
assert json['converted_data'] == ''
def test_post_with_pdf_data_returns_text(client, pdffile):
data = {
'data': (pdffile, 'data.pdf'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert 'converted_data' in json.keys()
assert json['msg'] == 'OK'
assert ('This is a 59-year-old, right-handed woman '
'with a history of hypertension, '
'schizophrenia, and a') in json['converted_data']
def test_post_detects_pdfs_with_wrong_extension(client, pdffile):
data = {
'data': (pdffile, 'data.gif'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert 'converted_data' in json.keys()
assert json['msg'] == 'OK'
assert ('This is a 59-year-old, right-handed woman '
'with a history of hypertension, '
'schizophrenia, and a') in json['converted_data']
def test_post_returns_empty_with_no_pdf_files(client):
plain_text_file = BytesIO(b'Umm, no PDF here.')
data = {
'data': (plain_text_file, 'data.txt'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert 'converted_data' in json.keys()
assert json['msg'] == 'Invalid mimetype: text/plain'
assert json['converted_data'] == ''
def test_post_returns_an_anonymized_text(client, pdffile):
data = {
'data': (pdffile, 'data.gif'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert 'converted_data' in json.keys()
assert json['msg'] == 'OK'
assert 'MD:' not in json['converted_data']
assert 'Debra Jones' not in json['converted_data']
assert 'DATE:' not in json['converted_data']
assert '12/01/10' not in json['converted_data']
assert 'RE:' not in json['converted_data']
assert 'DOB:' not in json['converted_data']
assert '12/01/65' not in json['converted_data']
def test_document_is_stored_on_db(client, db, pdffile):
data = {
'data': (pdffile, 'data.pdf'),
}
client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
assert db.patient_documents.count_documents({}) == 1
def test_a_repeated_document_is_only_stored_once(client, db, pdffile):
pdfdata = pdffile.read()
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
assert db.patient_documents.count_documents({}) == 1
def test_msg_informs_when_a_repeated_document_is_set(client, pdffile):
pdfdata = pdffile.read()
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'msg' in json.keys()
assert json['msg'] == ('OK. The document was already stored '
'in the database')
def test_converted_data_is_returned_filled_with_duplicated_documents(client,
pdffile):
pdfdata = pdffile.read()
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
data = {
'data': (BytesIO(pdfdata), 'data.pdf'),
}
response = client.post('/api/v1/pdf2text/',
content_type='multipart/form-data',
data=data)
status_code = response.status_code
json = response.json
assert status_code == 200
assert 'converted_data' in json.keys()
assert ('This is a 59-year-old, right-handed woman '
'with a history of hypertension, '
'schizophrenia, and a') in json['converted_data']
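The duplicate-upload tests above wrap the raw bytes in a fresh `BytesIO` for each POST because a file-like object is consumed on read. A minimal illustration of why re-wrapping is needed (the payload bytes are a stand-in, not a real PDF):

```python
from io import BytesIO

payload = b"%PDF-1.4 fake"  # hypothetical stand-in bytes, not a real PDF
stream = BytesIO(payload)

first = stream.read()   # consumes the stream
second = stream.read()  # nothing left unless seek(0) is called

fresh = BytesIO(payload)  # a new object restores the full payload
print(len(first), len(second), len(fresh.read()))
```

Reusing the same `BytesIO` for the second POST would upload an empty body, which is why each request builds `data` with a new `BytesIO(pdfdata)`.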
| 31.930636 | 78 | 0.58798 | 654 | 5,524 | 4.787462 | 0.155963 | 0.091983 | 0.065155 | 0.05749 | 0.844778 | 0.84382 | 0.773874 | 0.737464 | 0.717023 | 0.717023 | 0 | 0.016439 | 0.284214 | 5,524 | 172 | 79 | 32.116279 | 0.775417 | 0 | 0 | 0.746479 | 0 | 0 | 0.230811 | 0 | 0 | 0 | 0 | 0 | 0.274648 | 1 | 0.06338 | false | 0 | 0.007042 | 0 | 0.070423 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
74413d9b839c78472c96fff337c892eab1e76ad5 | 559 | py | Python | block_1/task_3rd.py | erdyneevzt/stepik_python | 014fb618426dbee7f76b317c539d7d0363c87d15 | [
"MIT"
] | null | null | null | block_1/task_3rd.py | erdyneevzt/stepik_python | 014fb618426dbee7f76b317c539d7d0363c87d15 | [
"MIT"
] | null | null | null | block_1/task_3rd.py | erdyneevzt/stepik_python | 014fb618426dbee7f76b317c539d7d0363c87d15 | [
"MIT"
] | null | null | null | num_1 = float(input())
num_2 = float(input())
oper = input()
if oper == "+":
print(num_1+num_2)
elif oper == "-":
print(num_1 - num_2)
elif oper == "/":
if num_2 == 0:
print("Division by 0!")
else:
print(num_1/num_2)
elif oper == "*":
print(num_1 * num_2)
elif oper == "mod":
if num_2 == 0:
print("Division by 0!")
else:
print(num_1 % num_2)
elif oper == "pow":
print(num_1**num_2)
elif oper == "div":
if num_2 == 0:
print("Division by 0!")
else:
print(num_1//num_2)
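The branch ladder above can be condensed into a dictionary dispatch. A sketch under that idea (the `calculate` function name and its return-a-message-on-zero behavior are illustrative choices, not from the original):

```python
def calculate(a, b, op):
    # Map operator tokens to thunks; mirrors the if/elif chain above.
    ops = {
        "+": lambda: a + b,
        "-": lambda: a - b,
        "*": lambda: a * b,
        "/": lambda: a / b,
        "mod": lambda: a % b,
        "pow": lambda: a ** b,
        "div": lambda: a // b,
    }
    # Guard the three operations that divide by the second operand.
    if op in ("/", "mod", "div") and b == 0:
        return "Division by 0!"
    return ops[op]()

print(calculate(7.0, 2.0, "div"))
```

Dictionary dispatch keeps the operator table in one place, so adding an operation is a one-line change instead of a new `elif` branch.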
| 19.964286 | 30 | 0.520572 | 90 | 559 | 3.022222 | 0.188889 | 0.161765 | 0.231618 | 0.308824 | 0.8125 | 0.8125 | 0.8125 | 0.735294 | 0.720588 | 0.720588 | 0 | 0.063291 | 0.293381 | 559 | 27 | 31 | 20.703704 | 0.625316 | 0 | 0 | 0.346154 | 0 | 0 | 0.093023 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.384615 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
747114dbc4d368faa9bdbbdead7caef6cc0f314f | 357 | py | Python | chainer_compiler/__init__.py | vermashresth/chainer-compiler | 5f5ad365d14398d6ae0214fa012eb10360db8e7e | [
"MIT"
] | 116 | 2019-01-25T03:54:44.000Z | 2022-03-08T00:11:14.000Z | chainer_compiler/__init__.py | vermashresth/chainer-compiler | 5f5ad365d14398d6ae0214fa012eb10360db8e7e | [
"MIT"
] | 431 | 2019-01-25T10:18:44.000Z | 2020-06-17T05:28:55.000Z | chainer_compiler/__init__.py | vermashresth/chainer-compiler | 5f5ad365d14398d6ae0214fa012eb10360db8e7e | [
"MIT"
] | 26 | 2019-01-25T07:21:09.000Z | 2021-11-26T04:24:35.000Z | from chainer_compiler.chainer_compiler import compile # noqa
from chainer_compiler.chainer_compiler import compile_onnx # noqa
from chainer_compiler.chainer_compiler import export # noqa
from chainer_compiler.chainer_compiler import use_unified_memory_allocator # noqa
from chainer_compiler.chainer_compiler import use_chainerx_shared_allocator # noqa
| 59.5 | 83 | 0.87395 | 47 | 357 | 6.276596 | 0.297872 | 0.508475 | 0.322034 | 0.440678 | 0.8 | 0.8 | 0.8 | 0.318644 | 0 | 0 | 0 | 0 | 0.098039 | 357 | 5 | 84 | 71.4 | 0.916149 | 0.067227 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
77dc90876fec5ff3a9a033f9aebcc840f5629040 | 2,328 | py | Python | test/fstrings/empty2.py | kylebarron/MagicPython | da6fa0793e2c85d3bf7709ff1d4f65ccf468db11 | [
"MIT"
] | 1,482 | 2015-10-16T21:59:32.000Z | 2022-03-30T11:44:40.000Z | test/fstrings/empty2.py | kylebarron/MagicPython | da6fa0793e2c85d3bf7709ff1d4f65ccf468db11 | [
"MIT"
] | 226 | 2015-10-15T15:53:44.000Z | 2022-03-25T03:08:27.000Z | test/fstrings/empty2.py | kylebarron/MagicPython | da6fa0793e2c85d3bf7709ff1d4f65ccf468db11 | [
"MIT"
] | 129 | 2015-10-20T02:41:49.000Z | 2022-03-22T01:44:36.000Z | rf"{} { }"
rf"""{}
{ }
"""
rf : source.python, storage.type.string.python, string.interpolated.python, string.regexp.quoted.single.python
" : punctuation.definition.string.begin.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
{ : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
} : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
: source.python, string.interpolated.python, string.regexp.quoted.single.python
{ : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
: invalid.illegal.brace.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
} : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
" : punctuation.definition.string.end.python, source.python, string.interpolated.python, string.regexp.quoted.single.python
rf : source.python, storage.type.string.python, string.interpolated.python, string.regexp.quoted.multi.python
""" : punctuation.definition.string.begin.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
{ : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
} : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
{ : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
: invalid.illegal.brace.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
} : constant.character.format.placeholder.other.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
""" : punctuation.definition.string.end.python, source.python, string.interpolated.python, string.regexp.quoted.multi.python
| 86.222222 | 145 | 0.741409 | 258 | 2,328 | 6.689922 | 0.089147 | 0.236385 | 0.236385 | 0.295481 | 0.997683 | 0.997683 | 0.997683 | 0.997683 | 0.997683 | 0.997683 | 0 | 0 | 0.135739 | 2,328 | 26 | 146 | 89.538462 | 0.857853 | 0 | 0 | 0.266667 | 0 | 0.4 | 0.010101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
bb09f47d1412d5ce246a08062e976e36e94bdb0b | 4,425 | py | Python | atsapi/dbinfo.py | kcphysics/ATSAPI | 4a14133d9143d5cb1ecfc434dc49e37a72f94db4 | [
"MIT"
] | null | null | null | atsapi/dbinfo.py | kcphysics/ATSAPI | 4a14133d9143d5cb1ecfc434dc49e37a72f94db4 | [
"MIT"
] | 4 | 2020-08-30T03:52:50.000Z | 2020-09-03T02:51:01.000Z | atsapi/dbinfo.py | kcphysics/ATSAPI | 4a14133d9143d5cb1ecfc434dc49e37a72f94db4 | [
"MIT"
] | null | null | null | import json
from aiohttp import web
async def get_markets(request:web.Request):
"""
---
description: Gets a list of objects that have markets associated with them
tags:
- Information
parameters:
- in: query
name: organization
type: string
description: Organization to bound to, default is all
required: false
default: None
- in: query
name: withinfo
type: int
default: 1
description: 1 -> Gives information, 0 -> Gives only names
required: false
- in: query
name: output
default: csv
type: string
description: Determines whether to give csv or JSON output
required: false
produces:
- text/plain
responses:
"200":
description: Successfully fetched market information, even if the result is empty
"400":
description: Client side error
"""
params = request.query
output = params.get('output', 'csv')
org = params.get('organization')
withinfo = int(params.get('withinfo', 1))
if org and org != "None":
markets = [v.tojson() for _, v in request.app['atsdb'].items() if v.market > 0 and org.lower() in v.empire.lower()]
else:
markets = [v.tojson() for _, v in request.app['atsdb'].items() if v.market > 0]
if not withinfo:
return web.Response(body="\n".join(v['name'] for v in markets), content_type="text/plain")
if output == "json":
return web.json_response(markets)  # json_response serializes; pre-dumping double-encodes
else:
rstring = "{0:<25s}\t{1:15s}\t{2:10s}\t{3:10s}\n".format(
"Name of Object",
"Empire",
"Type",
"Market"
)
for mark in markets:
rstring += "{name:<25s}\t{empire:15s}\t{type:10s}\t{market:6d}\n".format(**mark)
return web.Response(body=rstring, content_type="text/plain")
async def get_objs_by_org(request:web.Request):
"""
---
description: Gets a list of objects that have markets associated with them
tags:
- Information
parameters:
- in: query
name: organization
type: string
description: Organization to bound to, default is all
required: false
default: None
- in: query
name: withinfo
type: int
default: 1
description: 1 -> Gives information, 0 -> Gives only names
required: false
- in: query
name: type
default: all
type: string
description: Can be all, planets, stations
required: false
- in: query
name: output
default: csv
type: string
description: Determines whether to give csv or JSON output
required: false
produces:
- text/plain
responses:
"200":
description: Successfully fetched market information, even if the result is empty
"400":
description: Client side error
"""
params = request.query
output = params.get('output', 'csv')
otype = params.get('type', 'all')
org = params.get('organization')
withinfo = int(params.get('withinfo', 1))
if org and org != "None":
markets = [v.tojson() for _, v in request.app['atsdb'].items() if org.lower() in v.empire.lower()]
else:
markets = [v.tojson() for _, v in request.app['atsdb'].items()]
if otype != "all":
markets = [x for x in markets if otype.lower() in x['type'].lower()]
if not withinfo:
return web.Response(body="\n".join(v['name'] for v in markets), content_type="text/plain")
if output == "json":
return web.json_response(markets)
else:
rstring = "{0:<25s}\t{1:15s}\t{2:10s}\t{3:10s}\n".format(
"Name of Object",
"Empire",
"Type",
"Market"
)
for mark in markets:
rstring += "{name:<25s}\t{empire:15s}\t{type:10s}\t{market:6d}\n".format(**mark)
return web.Response(body=rstring, content_type="text/plain")
| 35.119048 | 123 | 0.528814 | 501 | 4,425 | 4.642715 | 0.211577 | 0.021066 | 0.033104 | 0.029235 | 0.908856 | 0.908856 | 0.908856 | 0.908856 | 0.908856 | 0.908856 | 0 | 0.021164 | 0.359322 | 4,425 | 125 | 124 | 35.4 | 0.799295 | 0 | 0 | 0.705882 | 0 | 0.078431 | 0.182318 | 0.081539 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.039216 | 0 | 0.156863 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bb50f667527491c994dd85039d48552847f4553a | 116 | py | Python | QuickDemos/fuzzysearch.py | Crocster/CMPT-120L-201-22S | 7de05b152dba9975b7a5e56f1f1432cdb325d74e | [
"MIT"
] | 1 | 2022-02-19T17:57:57.000Z | 2022-02-19T17:57:57.000Z | QuickDemos/fuzzysearch.py | Crocster/CMPT-120L-201-22S | 7de05b152dba9975b7a5e56f1f1432cdb325d74e | [
"MIT"
] | null | null | null | QuickDemos/fuzzysearch.py | Crocster/CMPT-120L-201-22S | 7de05b152dba9975b7a5e56f1f1432cdb325d74e | [
"MIT"
] | 24 | 2022-02-03T01:37:44.000Z | 2022-02-17T00:00:10.000Z | from fuzzywuzzy import fuzz
# from fuzzywuzzy import process
print(fuzz.ratio("this is a test", "this is a test!")) | 29 | 54 | 0.75 | 19 | 116 | 4.578947 | 0.578947 | 0.321839 | 0.45977 | 0.252874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146552 | 116 | 4 | 54 | 29 | 0.878788 | 0.258621 | 0 | 0 | 0 | 0 | 0.341176 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 8 |
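`fuzz.ratio` is a Levenshtein-based similarity score. A rough stdlib-only stand-in uses `difflib` (Ratcliff/Obershelp matching rather than Levenshtein, so scores can differ slightly from fuzzywuzzy on some inputs):

```python
from difflib import SequenceMatcher

def ratio(a, b):
    # Similarity as an integer percentage, roughly comparable to fuzz.ratio.
    # difflib computes 2*matches/total_length, not edit distance.
    return round(SequenceMatcher(None, a, b).ratio() * 100)

print(ratio("this is a test", "this is a test!"))
```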
24d64f9ea31bd7c1a95e51ea4b26d686640f465e | 52,128 | py | Python | ornstein_auto_encoder/build_network.py | Young-won/ornstein_auto_encoders | 7ebe15f79f56cb75f4948dc7cf6d4633ccec84ec | [
"MIT"
] | 1 | 2021-06-08T10:28:32.000Z | 2021-06-08T10:28:32.000Z | ornstein_auto_encoder/build_network.py | Young-won/ornstein_auto_encoders | 7ebe15f79f56cb75f4948dc7cf6d4633ccec84ec | [
"MIT"
] | null | null | null | ornstein_auto_encoder/build_network.py | Young-won/ornstein_auto_encoders | 7ebe15f79f56cb75f4948dc7cf6d4633ccec84ec | [
"MIT"
] | null | null | null | import time
import json
import sys
import os
import abc
import numpy as np
import pandas as pd
from functools import partial
import keras
import keras.backend as k
from keras.models import Model, load_model, model_from_yaml
from keras.layers import Input, Concatenate, Conv2D, Lambda, Dense, Add, Average, Multiply
from keras.engine.training_utils import is_sequence, iter_sequence_infinite, should_run_validation
from keras.utils.data_utils import Sequence, OrderedEnqueuer, GeneratorEnqueuer
from keras.utils.generic_utils import Progbar, to_list, unpack_singleton
from keras.utils import multi_gpu_model
from keras.utils import to_categorical
import keras.callbacks as cbks
import tensorflow as tf
from . import loss_and_metric
from .tensorboard_utils import *
from .ops import *
from ._build_base_network import *
#####################################################################################################################
# ProductSpaceOAE_GAN Network with HSIC
#####################################################################################################################
class ProductSpaceOAEHSIC_GAN(WAE_GAN):
def __init__(self, log, path_info, network_info, n_label, is_profiling=False):
super(ProductSpaceOAEHSIC_GAN, self).__init__(log, path_info, network_info, n_label, is_profiling=is_profiling)
self.metrics_names = ['main_loss', 'reconstruction', 'penalty_e', 'penalty_b', 'penalty_hsic',
'discriminator_loss',
]
self.TB = ProductSpaceOAETensorBoardWrapper_GAN
self.b_sd = float(network_info['model_info']['b_sd'])
self.lambda_b = float(network_info['model_info']['lambda_b'])
self.lambda_hsic = float(network_info['model_info']['lambda_hsic'])
try: self.e_weight = float(network_info['model_info']['e_weight'])
except: self.e_weight = 1.
try: self.e_train = not ('false' == network_info['model_info']['e_train'].strip().lower())
except: self.e_train = True
try: self.b_train = not ('false' == network_info['model_info']['b_train'].strip().lower())
except: self.b_train = True
try: self.feature_b = 'true' == network_info['model_info']['feature_b'].strip().lower()
except: self.feature_b = False
try: self.reset_e = 'true' == network_info['model_info']['reset_e'].strip().lower()
except: self.reset_e = False
def build_model(self, model_yaml_dir=None, verbose=0):
"""
verbose
0: Not show any model
1: Show AE, Discriminator model
2: Show all models
"""
# Load Models : encoder, decoder, discriminator
if model_yaml_dir == None: model_yaml_dir = os.path.join(self.model_save_dir, self.path_info['model_info']['model_architecture'])
self._encoder_base_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_base'], verbose=verbose==2)
self._encoder_b_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_b'], verbose=verbose==2)
self._encoder_e_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_e'], verbose=verbose==2)
self.decoder_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_decoder'], verbose=verbose==2)
self._discriminator_e_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_discriminator'], verbose=verbose==2)
self.save_models = {"encoder_base":self._encoder_base_model,
"encoder_b":self._encoder_b_model,
"encoder_e":self._encoder_e_model,
"decoder":self.decoder_model,
"discriminator":self._discriminator_e_model
}
self._discriminator_e_model.name = 'discriminator_e'
# build blocks
self.image_shape = self._encoder_base_model.input_shape[1:]
if self.feature_b: self.feature_shape = self._encoder_b_model.input_shape[1:]
self.b_z_dim = self._encoder_b_model.output_shape[-1]
self.e_z_dim = self._encoder_e_model.output_shape[-1]
real_image = Input(shape=self.image_shape, name='real_image_input', dtype='float32')
cls_info = Input(shape=(1,), name='class_info_input', dtype='int32')
prior_e_noise = Input(shape=(self.e_z_dim,), name='prior_e_input', dtype='float32')
prior_b_noise = Input(shape=(self.b_z_dim,), name='prior_b_input', dtype='float32')
if self.feature_b: feature_for_b = Input(shape=self.feature_shape, name='feature_b_input', dtype='float32')
# Encoder base
last_h = self._encoder_base_model([real_image])
# B_i ~ Q_B|X=x^i
if self.feature_b: b_j_given_x_j = self._encoder_b_model([feature_for_b])
else: b_j_given_x_j = self._encoder_b_model([last_h])
sample_b, b_given_x = Lambda(get_b, name='get_b_given_x')([b_j_given_x_j, cls_info])
if self.feature_b: self.encoder_b_model = Model([feature_for_b, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
else: self.encoder_b_model = Model([real_image, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
# E^i_j ~Q_E_0|X_0,B=X^i_j,B_i
e_given_x_b = self._encoder_e_model([last_h, sample_b])
if self.feature_b: self.encoder_e_model = Model([real_image, feature_for_b, cls_info], [e_given_x_b], name='encoder_e_model')
else: self.encoder_e_model = Model([real_image, cls_info], [e_given_x_b], name='encoder_e_model')
# Z^i_j = (B_i, E^i_j)
b_input = Input(shape=(self.b_z_dim,), name='estimated_b_input', dtype='float32')
noise_input = Input(shape=(self.e_z_dim,), name='noise_input', dtype='float32')
if self.e_weight != 1.: noise_weighted = Lambda(lambda x : self.e_weight*x, name='noise_weighted')(noise_input)
else: noise_weighted = noise_input
latent = Concatenate(axis=1, name='concat_latent')([b_input, noise_weighted])
self.z_encoder_model = Model([b_input, noise_input], [latent], name='encoder_z_model')
# Build connections
if self.feature_b:
sample_b, b_given_x, b_j_given_x_j = self.encoder_b_model([feature_for_b, cls_info])
e_given_x_b = self.encoder_e_model([real_image, feature_for_b, cls_info])
else:
sample_b, b_given_x, b_j_given_x_j = self.encoder_b_model([real_image, cls_info])
e_given_x_b = self.encoder_e_model([real_image, cls_info])
fake_latent = self.z_encoder_model([sample_b, e_given_x_b])
recon_image = self.decoder_model(fake_latent)
if self.feature_b: self.ae_model = Model(inputs=[real_image, feature_for_b, cls_info], outputs=[recon_image], name='ae_model')
else: self.ae_model = Model(inputs=[real_image, cls_info], outputs=[recon_image], name='ae_model')
if verbose==2:
self.log.info('Auto-Encoder model')
self.ae_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
# GAN model
p_e = self._discriminator_e_model(prior_e_noise)
q_e = self._discriminator_e_model(e_given_x_b)
output = Concatenate(name='mlp_concat')([p_e, q_e]) ## TODO : fix..
if self.feature_b: self.gan_model = Model(inputs=[real_image, feature_for_b, cls_info, prior_e_noise], outputs=[output], name='GAN_model')
else: self.gan_model = Model(inputs=[real_image, cls_info, prior_e_noise], outputs=[output], name='GAN_model')
# WAE model
recon_error = Lambda(mean_reconstruction_l2sq_loss, name='mean_recon_error')([real_image, recon_image])
# recon_error = Lambda(mean_reconstruction_l2sq_loss_e, name='mean_recon_error')([real_image, recon_image, cls_info])
penalty_e = Lambda(get_qz_trick_loss, name='penalty_e')(q_e)
penalty_b = Lambda(get_b_penalty_loss, name='penalty_b',
arguments={'sigma':self.b_sd, 'zdim':self.b_z_dim, 'kernel':'RBF', 'p_z':'normal'})([prior_b_noise, b_given_x])
penalty_hsic = Lambda(get_hsic, name="penalty_hsic")([e_given_x_b, sample_b])
if self.feature_b: self.main_model = Model(inputs=[real_image, feature_for_b, cls_info, prior_b_noise],
outputs=[recon_error, penalty_e, penalty_b, penalty_hsic], name='main_model')
else: self.main_model = Model(inputs=[real_image, cls_info, prior_b_noise],
outputs=[recon_error, penalty_e, penalty_b, penalty_hsic], name='main_model')
# Blur information
prior_latent = Input(shape=(self.b_z_dim + self.e_z_dim,), name='prior_z_input', dtype='float32')
self.blurr_model = get_compute_blurriness_model(self.image_shape)
gen_image = self.decoder_model(prior_latent)
gen_sharpness = self.blurr_model(gen_image)
self.gen_blurr_model = Model(inputs=[prior_latent], outputs=[gen_sharpness], name='gen_blurr_model')
if verbose==2:
self.log.info('Generative sample blurr model')
self.gen_blurr_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
# cluster information
ssw, ssb, n_points_mean, n_l = Lambda(self._get_cluster_information_by_class_index,
name='get_cluster_information_by_class_index')([b_j_given_x_j, cls_info])
if self.feature_b:
self.cluster_info_model = Model(inputs=[feature_for_b, cls_info],
outputs=[ssw, ssb, n_points_mean, n_l], name='get_cluster_info')
else:
self.cluster_info_model = Model(inputs=[real_image, cls_info], outputs=[ssw, ssb, n_points_mean, n_l], name='get_cluster_info')
if verbose==2:
self.log.info('Cluster information model')
self.cluster_info_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
try:
self.parallel_main_model = multi_gpu_model(self.main_model, gpus=self.number_of_gpu)
self.parallel_gan_model = multi_gpu_model(self.gan_model, gpus=self.number_of_gpu)
self.log.info("Training using multiple GPUs")
except ValueError:
self.parallel_main_model = self.main_model
self.parallel_gan_model = self.gan_model
self.log.info("Training using single GPU or CPU")
self.train_models = {'discriminator':self.gan_model, 'main':self.main_model}
self.parallel_train_models = {'discriminator':self.parallel_gan_model, 'main':self.parallel_main_model}
self.train_models_lr = {'discriminator':{'lr':float(self.network_info['model_info']['lr_e_adv']),
'decay':float(self.network_info['model_info']['lr_e_adv_decay'])},
'main':{'lr':float(self.network_info['model_info']['lr_e']),
'decay':float(self.network_info['model_info']['lr_e_decay'])}}
if verbose:
self.log.info('Main model')
self.main_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Discriminator model')
self.gan_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
def model_compile(self, verbose=0):
self.log.info('Start models compile.')
if self.network_info['model_info']['optimizer'] =='adam':
optimizer_e = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['main']['lr'],
beta_1=float(self.network_info['model_info']['lr_e_beta1']))
optimizer_e_adv = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['discriminator']['lr'],
beta_1=float(self.network_info['model_info']['lr_e_adv_beta1']))
else:
optimizer_e = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['main']['lr'])
optimizer_e_adv = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['discriminator']['lr'])
if self.reset_e:
self.reset_weights(self._encoder_base_model)
self.reset_weights(self._encoder_e_model)
self.reset_weights(self._discriminator_e_model)
# GAN model compile
self._encoder_b_model.trainable = False
self._encoder_e_model.trainable = False
self._encoder_base_model.trainable = False
self.decoder_model.trainable = False
self._discriminator_e_model.trainable = self.e_train
self.parallel_gan_model.compile(loss=getattr(loss_and_metric, self.network_info['model_info']['discriminator_loss']),
optimizer=optimizer_e_adv, options=self.run_options, run_metadata=self.run_metadata)
# WAE model compile
self.decoder_model.trainable = True
self._encoder_b_model.trainable = self.b_train
self._encoder_e_model.trainable = self.e_train
self._encoder_base_model.trainable = self.e_train
self._discriminator_e_model.trainable = False
self.parallel_main_model.compile(loss={'mean_recon_error':getattr(loss_and_metric, self.network_info['model_info']['main_loss']),
'penalty_e':getattr(loss_and_metric, self.network_info['model_info']['penalty_e']),
'penalty_b':getattr(loss_and_metric, self.network_info['model_info']['penalty_b']),
'penalty_hsic':getattr(loss_and_metric, self.network_info['model_info']['penalty_b']),
},
loss_weights=[1., self.lambda_e, self.lambda_b, self.lambda_hsic],
optimizer=optimizer_e, options=self.run_options, run_metadata=self.run_metadata)
if verbose:
for name, model in self.parallel_train_models.items():
self.log.info('%s model' % name)
model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Model compile done.')
def save(self, filepath, is_compile=True, overwrite=True, include_optimizer=True):
model_path = self.path_info['model_info']['weight']
for name, model in self.save_models.items():
model.save("%s/%s_%s" % (filepath, name, model_path), overwrite=overwrite, include_optimizer=include_optimizer)
self.log.debug('Save model at %s' % filepath)
def load(self, filepath, verbose=0):
model_path = self.path_info['model_info']['weight']
loss_list = [self.network_info['model_info']['main_loss'],
self.network_info['model_info']['penalty_e'],
self.network_info['model_info']['discriminator_loss']]
load_dict = dict([(loss_name, getattr(loss_and_metric, loss_name)) for loss_name in loss_list])
load_dict['SelfAttention2D'] = SelfAttention2D
load_dict['get_qz_trick_loss'] = get_qz_trick_loss
load_dict['get_qz_trick_with_weight_loss'] = get_qz_trick_with_weight_loss
load_dict['get_hsic'] = get_hsic
load_dict['mmd_penalty'] = mmd_penalty
load_dict['get_b'] = get_b
load_dict['get_b_estimation_var'] = get_b_estimation_var
load_dict['get_b_penalty_loss'] = get_b_penalty_loss
load_dict['mean_reconstruction_l2sq_loss'] = mean_reconstruction_l2sq_loss
load_dict['get_class_mean_by_class_index'] = get_class_mean_by_class_index
load_dict['concat_with_uniform_sample'] = concat_with_uniform_sample
load_dict['get_batch_covariance'] = get_batch_covariance
load_dict['get_mutual_information_from_gaussian_sample'] = get_mutual_information_from_gaussian_sample
# TODO : fix save & load
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_base", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path), overwrite=False)
self._encoder_base_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_b", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path), overwrite=False)
self._encoder_b_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_e", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path), overwrite=False)
self._encoder_e_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "decoder", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "decoder", model_path), overwrite=False)
self.decoder_model.load_weights("%s/tmp_%s_%s" % (filepath, "decoder", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "decoder", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "discriminator", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "discriminator", model_path), overwrite=False)
self._discriminator_e_model.load_weights("%s/tmp_%s_%s" % (filepath, "discriminator", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "discriminator", model_path))
self._discriminator_e_model.name = 'discriminator_e'
# build blocks
self.image_shape = self._encoder_base_model.input_shape[1:]
if self.feature_b: self.feature_shape = self._encoder_b_model.input_shape[1:]
self.b_z_dim = self._encoder_b_model.output_shape[-1]
self.e_z_dim = self._encoder_e_model.output_shape[-1]
real_image = Input(shape=self.image_shape, name='real_image_input', dtype='float32')
cls_info = Input(shape=(1,), name='class_info_input', dtype='int32')
prior_e_noise = Input(shape=(self.e_z_dim,), name='prior_e_input', dtype='float32')
prior_b_noise = Input(shape=(self.b_z_dim,), name='prior_b_input', dtype='float32')
if self.feature_b: feature_for_b = Input(shape=self.feature_shape, name='feature_b_input', dtype='float32')
# Encoder base
last_h = self._encoder_base_model([real_image])
# B_i ~ Q_B|X=x^i
if self.feature_b: b_j_given_x_j = self._encoder_b_model([feature_for_b])
else: b_j_given_x_j = self._encoder_b_model([last_h])
sample_b, b_given_x = Lambda(get_b, name='get_b_given_x')([b_j_given_x_j, cls_info])
if self.feature_b: self.encoder_b_model = Model([feature_for_b, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
else: self.encoder_b_model = Model([real_image, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
# E^i_j ~Q_E_0|X_0,B=X^i_j,B_i
e_given_x_b = self._encoder_e_model([last_h, sample_b])
if self.feature_b: self.encoder_e_model = Model([real_image, feature_for_b, cls_info], [e_given_x_b], name='encoder_e_model')
else: self.encoder_e_model = Model([real_image, cls_info], [e_given_x_b], name='encoder_e_model')
# Z^i_j = (B_i, E^i_j)
b_input = Input(shape=(self.b_z_dim,), name='estimated_b_input', dtype='float32')
noise_input = Input(shape=(self.e_z_dim,), name='noise_input', dtype='float32')
if self.e_weight != 1.: noise_weighted = Lambda(lambda x : self.e_weight*x, name='noise_weighted')(noise_input)
else: noise_weighted = noise_input
latent = Concatenate(axis=1, name='concat_latent')([b_input, noise_weighted])
self.z_encoder_model = Model([b_input, noise_input], [latent], name='encoder_z_model')
# Build connections
if self.feature_b:
sample_b, b_given_x, b_j_given_x_j = self.encoder_b_model([feature_for_b, cls_info])
e_given_x_b = self.encoder_e_model([real_image, feature_for_b, cls_info])
else:
sample_b, b_given_x, b_j_given_x_j = self.encoder_b_model([real_image, cls_info])
e_given_x_b = self.encoder_e_model([real_image, cls_info])
fake_latent = self.z_encoder_model([sample_b, e_given_x_b])
recon_image = self.decoder_model(fake_latent)
if self.feature_b: self.ae_model = Model(inputs=[real_image, feature_for_b, cls_info], outputs=[recon_image], name='ae_model')
else: self.ae_model = Model(inputs=[real_image, cls_info], outputs=[recon_image], name='ae_model')
if verbose==2:
self.log.info('Auto-Encoder model')
self.ae_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
# GAN model
p_e = self._discriminator_e_model(prior_e_noise)
q_e = self._discriminator_e_model(e_given_x_b)
output = Concatenate(name='mlp_concat')([p_e, q_e]) ## TODO : fix..
if self.feature_b: self.gan_model = Model(inputs=[real_image, feature_for_b, cls_info, prior_e_noise], outputs=[output], name='GAN_model')
else: self.gan_model = Model(inputs=[real_image, cls_info, prior_e_noise], outputs=[output], name='GAN_model')
# WAE model
recon_error = Lambda(mean_reconstruction_l2sq_loss, name='mean_recon_error')([real_image, recon_image])
penalty_e = Lambda(get_qz_trick_loss, name='penalty_e')(q_e)
penalty_b = Lambda(get_b_penalty_loss, name='penalty_b',
arguments={'sigma':self.b_sd, 'zdim':self.b_z_dim, 'kernel':'RBF', 'p_z':'normal'})([prior_b_noise, b_given_x])
penalty_hsic = Lambda(get_hsic, name="penalty_hsic")([e_given_x_b, sample_b])
if self.feature_b: self.main_model = Model(inputs=[real_image, feature_for_b, cls_info, prior_b_noise],
outputs=[recon_error, penalty_e, penalty_b, penalty_hsic], name='main_model')
else: self.main_model = Model(inputs=[real_image, cls_info, prior_b_noise],
outputs=[recon_error, penalty_e, penalty_b, penalty_hsic], name='main_model')
# Blur information
prior_latent = Input(shape=(self.b_z_dim + self.e_z_dim,), name='prior_z_input', dtype='float32')
self.blurr_model = get_compute_blurriness_model(self.image_shape)
gen_image = self.decoder_model(prior_latent)
gen_sharpness = self.blurr_model(gen_image)
self.gen_blurr_model = Model(inputs=[prior_latent], outputs=[gen_sharpness], name='gen_blurr_model')
# cluster information
ssw, ssb, n_points_mean, n_l = Lambda(self._get_cluster_information_by_class_index,
name='get_cluster_information_by_class_index')([b_j_given_x_j, cls_info])
if self.feature_b:
self.cluster_info_model = Model(inputs=[feature_for_b, cls_info],
outputs=[ssw, ssb, n_points_mean, n_l], name='get_cluster_info')
else:
self.cluster_info_model = Model(inputs=[real_image, cls_info], outputs=[ssw, ssb, n_points_mean, n_l], name='get_cluster_info')
self.model_compile()
self.log.info('Loaded WAE model')
self.main_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Loaded Discriminator model: GAN')
self.gan_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
def discriminator_sampler(self, x, y):
e_noise = self.noise_sampler(y.shape[0], self.e_z_dim, self.e_sd)
if self.feature_b:
return [x[0], x[1], y[:, np.newaxis], e_noise], [np.zeros([x[0].shape[0],2], dtype='float32')]
else:
return [x, y[:, np.newaxis], e_noise], [np.zeros([x.shape[0],2], dtype='float32')]
def main_sampler(self, x, y):
b_noise = self.noise_sampler(y.shape[0], self.b_z_dim, self.b_sd) #, dist='spherical_uniform')
if self.feature_b:
return [x[0], x[1], y[:,np.newaxis], b_noise], [np.zeros(x[0].shape[0], dtype='float32')]*4
else:
return [x, y[:,np.newaxis], b_noise], [np.zeros(x.shape[0], dtype='float32')]*4
def train_on_batch(self, x, y, sample_weight=None, class_weight=None, reset_metrics=True):
wx, wy = self.main_sampler(x, y)
main_outs = self.parallel_main_model.train_on_batch(wx, wy,
sample_weight=sample_weight, class_weight=class_weight, reset_metrics=reset_metrics)
dx, dy = self.discriminator_sampler(x, y)
if self.e_train: d_outs = self.parallel_gan_model.train_on_batch(dx, dy,
sample_weight=sample_weight, class_weight=class_weight, reset_metrics=reset_metrics)
else: d_outs = 0
return (main_outs +
[d_outs]
)
def test_on_batch(self, x, y, sample_weight=None, reset_metrics=True):
wx, wy = self.main_sampler(x, y)
main_outs = self.parallel_main_model.test_on_batch(wx, wy,
sample_weight=sample_weight, reset_metrics = reset_metrics)
dx, dy = self.discriminator_sampler(x, y)
if self.e_train: d_outs = self.parallel_gan_model.test_on_batch(dx, dy,
sample_weight=sample_weight, reset_metrics = reset_metrics)
else: d_outs = 0
return (main_outs +
[d_outs]
)
def get_reference_images(self, generator):
batches = [generator[i] for i in range(4)]
self.fixed_classes = np.concatenate([batch[1] for batch in batches])
if self.feature_b:
self.fixed_feature = np.concatenate([batch[0][1] for batch in batches])
return np.concatenate([batch[0][0] for batch in batches])
else:
return np.concatenate([batch[0] for batch in batches])
def on_train_begin(self, x):
self.fixed_images = x[self.fixed_classes == self.fixed_classes[0]]
if self.feature_b: self.fixed_feature = self.fixed_feature[self.fixed_classes == self.fixed_classes[0]]
self.fixed_classes = self.fixed_classes[self.fixed_classes == self.fixed_classes[0]]
real_image_blurriness = self.blurr_model.predict_on_batch(x)
self.fixed_noise = self.noise_sampler(x.shape[0], self.e_z_dim, self.e_sd)
self.log.info("Real image's sharpness = %.5f" % np.min(real_image_blurriness))
def on_epoch_end(self, epoch):
for name in self.train_models_lr.keys():
if self.train_models_lr[name]['decay'] > 0.:
self.train_models_lr[name]['lr'] = self._update_lr(epoch, lr=self.train_models_lr[name]['lr'],
decay=self.train_models_lr[name]['decay'])
k.set_value(self.parallel_train_models[name].optimizer.lr, self.train_models_lr[name]['lr'])
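The `on_epoch_end` hook above anneals each model's learning rate through `self._update_lr`, which is defined elsewhere in this file. As a hedged standalone sketch, a classic Keras-style time-based decay rule of the kind such hooks usually apply (the exact formula used by `_update_lr` is an assumption here, not the repo's verified code):

```python
# Hedged sketch: time-based learning-rate decay. The real _update_lr lives
# elsewhere in this file; this formula is an assumption for illustration.
def time_based_decay(lr0, decay, epoch):
    """Return the decayed learning rate lr0 / (1 + decay * epoch)."""
    return lr0 / (1.0 + decay * epoch)
```

With `decay=0` the rate stays constant, which matches the `decay > 0.` guard in `on_epoch_end` that skips the update entirely.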
#####################################################################################################################
# ProductSpaceOAE using fixed b and HSIC GAN Network
#####################################################################################################################
class ProductSpaceOAEFixedBHSIC_GAN(WAE_GAN):
def __init__(self, log, path_info, network_info, n_label, is_profiling=False):
super(ProductSpaceOAEFixedBHSIC_GAN, self).__init__(log, path_info, network_info, n_label, is_profiling=is_profiling)
self.metrics_names = ['main_loss', 'reconstruction', 'penalty_e', 'penalty_hsic',
'discriminator_loss',
]
self.TB = ProductSpaceOAEFixedBTensorBoardWrapper_GAN
self.b_sd = float(network_info['model_info']['b_sd'])
self.lambda_hsic = float(network_info['model_info']['lambda_hsic'])
try: self.e_weight = float(network_info['model_info']['e_weight'])
except: self.e_weight = 1.
try: self.e_train = not ('false' == network_info['model_info']['e_train'].strip().lower())
except: self.e_train = True
try: self.reset_e = 'true' == network_info['model_info']['reset_e'].strip().lower()
except: self.reset_e = False
try: self.feature_b = 'true' == network_info['model_info']['feature_b'].strip().lower()
except: self.feature_b = False
try: self.fixed_b_path = network_info['training_info']['fixed_b_path'].strip()
except: raise ValueError("Need to set fixed_b_path")
def build_model(self, model_yaml_dir=None, verbose=0):
"""
verbose
0: Not show any model
1: Show AE, Discriminator model
2: Show all models
"""
# Load Models : encoder, decoder, discriminator
if model_yaml_dir is None: model_yaml_dir = os.path.join(self.model_save_dir, self.path_info['model_info']['model_architecture'])
self._encoder_base_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_base'], verbose=verbose==2)
self._encoder_b_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_b'], verbose=verbose==2)
self._encoder_e_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_encoder_e'], verbose=verbose==2)
self.decoder_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_decoder'], verbose=verbose==2)
self._discriminator_e_model = self._load_model(model_yaml_dir+self.path_info['model_info']['model_discriminator'], verbose=verbose==2)
self.save_models = {"encoder_base":self._encoder_base_model,
"encoder_b":self._encoder_b_model,
"encoder_e":self._encoder_e_model,
"decoder":self.decoder_model,
"discriminator":self._discriminator_e_model
}
self._discriminator_e_model.name = 'discriminator_e'
# build blocks
self.image_shape = self._encoder_base_model.input_shape[1:]
if self.feature_b: self.feature_shape = self._encoder_b_model.input_shape[1:]
self.b_z_dim = self._encoder_b_model.output_shape[-1]
self.e_z_dim = self._encoder_e_model.output_shape[-1]
real_image = Input(shape=self.image_shape, name='real_image_input', dtype='float32')
cls_info = Input(shape=(1,), name='class_info_input', dtype='int32')
prior_e_noise = Input(shape=(self.e_z_dim,), name='prior_e_input', dtype='float32')
prior_b_noise = Input(shape=(self.b_z_dim,), name='prior_b_input', dtype='float32')
b_input = Input(shape=(self.b_z_dim,), name='b_input', dtype='float32')
if self.feature_b: feature_for_b = Input(shape=self.feature_shape, name='feature_b_input', dtype='float32')
# Encoder base
last_h = self._encoder_base_model([real_image])
# B_i ~ Q_B|X=x^i
if self.feature_b: b_j_given_x_j = self._encoder_b_model([feature_for_b])
else: b_j_given_x_j = self._encoder_b_model([last_h])
sample_b, b_given_x = Lambda(get_b, name='get_b_given_x')([b_j_given_x_j, cls_info])
if self.feature_b: self.encoder_b_model = Model([feature_for_b, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
else: self.encoder_b_model = Model([real_image, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
# E^i_j ~Q_E_0|X_0,B=X^i_j,B_i
e_given_x_b = self._encoder_e_model([last_h, b_input])
self.encoder_e_model = Model([real_image, b_input], [e_given_x_b], name='encoder_e_model')
# Z^i_j = (B_i, E^i_j)
noise_input = Input(shape=(self.e_z_dim,), name='noise_input', dtype='float32')
if self.e_weight != 1.: noise_weighted = Lambda(lambda x : self.e_weight*x, name='noise_weighted')(noise_input)
else: noise_weighted = noise_input
latent = Concatenate(axis=1, name='concat_latent')([b_input, noise_weighted])
self.z_encoder_model = Model([b_input, noise_input], [latent], name='encoder_z_model')
# Build connections
e_given_x_b = self.encoder_e_model([real_image, b_input])
fake_latent = self.z_encoder_model([b_input, e_given_x_b])
recon_image = self.decoder_model(fake_latent)
self.ae_model = Model(inputs=[real_image, b_input], outputs=[recon_image], name='ae_model')
if verbose==2:
self.log.info('Auto-Encoder model')
self.ae_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
# GAN model
p_e = self._discriminator_e_model(prior_e_noise)
q_e = self._discriminator_e_model(e_given_x_b)
output = Concatenate(name='mlp_concat')([p_e, q_e]) ## TODO : fix..
self.gan_model = Model(inputs=[real_image, b_input, prior_e_noise], outputs=[output], name='GAN_model')
# WAE model
recon_error = Lambda(mean_reconstruction_l2sq_loss, name='mean_recon_error')([real_image, recon_image])
penalty_e = Lambda(get_qz_trick_loss, name='penalty_e')(q_e)
penalty_hsic = Lambda(get_hsic, name="penalty_hsic")([e_given_x_b, b_input])
self.main_model = Model(inputs=[real_image, b_input, cls_info],
outputs=[recon_error, penalty_e, penalty_hsic], name='main_model')
# Blur information
prior_latent = Input(shape=(self.b_z_dim + self.e_z_dim,), name='prior_z_input', dtype='float32')
self.blurr_model = get_compute_blurriness_model(self.image_shape)
gen_image = self.decoder_model(prior_latent)
gen_sharpness = self.blurr_model(gen_image)
self.gen_blurr_model = Model(inputs=[prior_latent], outputs=[gen_sharpness], name='gen_blurr_model')
if verbose==2:
self.log.info('Generative sample blurr model')
self.gen_blurr_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
try:
self.parallel_main_model = multi_gpu_model(self.main_model, gpus=self.number_of_gpu)
self.parallel_gan_model = multi_gpu_model(self.gan_model, gpus=self.number_of_gpu)
self.log.info("Training using multiple GPUs")
except ValueError:
self.parallel_main_model = self.main_model
self.parallel_gan_model = self.gan_model
self.log.info("Training using single GPU or CPU")
self.train_models = {'discriminator':self.gan_model, 'main':self.main_model}
self.parallel_train_models = {'discriminator':self.parallel_gan_model, 'main':self.parallel_main_model}
self.train_models_lr = {'discriminator':{'lr':float(self.network_info['model_info']['lr_e_adv']),
'decay':float(self.network_info['model_info']['lr_e_adv_decay'])},
'main':{'lr':float(self.network_info['model_info']['lr_e']),
'decay':float(self.network_info['model_info']['lr_e_decay'])}}
if verbose:
self.log.info('Main model')
self.main_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Discriminator model')
self.gan_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
def model_compile(self, verbose=0):
self.log.info('Start models compile.')
if self.network_info['model_info']['optimizer'] =='adam':
optimizer_e = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['main']['lr'],
beta_1=float(self.network_info['model_info']['lr_e_beta1']))
optimizer_e_adv = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['discriminator']['lr'],
beta_1=float(self.network_info['model_info']['lr_e_adv_beta1']))
else:
optimizer_e = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['main']['lr'])
optimizer_e_adv = getattr(keras.optimizers,
self.network_info['model_info']['optimizer'])(lr=self.train_models_lr['discriminator']['lr'])
if self.reset_e:
self.reset_weights(self._encoder_base_model)
self.reset_weights(self._encoder_e_model)
self.reset_weights(self._discriminator_e_model)
# GAN model compile
self._encoder_b_model.trainable = False
self._encoder_e_model.trainable = False
self._encoder_base_model.trainable = False
self.decoder_model.trainable = False
self._discriminator_e_model.trainable = self.e_train
self.parallel_gan_model.compile(loss=getattr(loss_and_metric, self.network_info['model_info']['discriminator_loss']),
optimizer=optimizer_e_adv, options=self.run_options, run_metadata=self.run_metadata)
# WAE model compile
self.decoder_model.trainable = True
self._encoder_b_model.trainable = False
self._encoder_e_model.trainable = self.e_train
self._encoder_base_model.trainable = self.e_train
self._discriminator_e_model.trainable = False
self.parallel_main_model.compile(loss={'mean_recon_error':getattr(loss_and_metric, self.network_info['model_info']['main_loss']),
'penalty_e':getattr(loss_and_metric, self.network_info['model_info']['penalty_e']),
'penalty_hsic':getattr(loss_and_metric, self.network_info['model_info']['penalty_b']),
},
loss_weights=[1., self.lambda_e, self.lambda_hsic],
optimizer=optimizer_e, options=self.run_options, run_metadata=self.run_metadata)
if verbose:
for name, model in self.parallel_train_models.items():
self.log.info('%s model' % name)
model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Model compile done.')
def save(self, filepath, is_compile=True, overwrite=True, include_optimizer=True):
model_path = self.path_info['model_info']['weight']
for name, model in self.save_models.items():
model.save("%s/%s_%s" % (filepath, name, model_path), overwrite=overwrite, include_optimizer=include_optimizer)
self.log.debug('Save model at %s' % filepath)
def load(self, filepath, verbose=0):
model_path = self.path_info['model_info']['weight']
loss_list = [self.network_info['model_info']['main_loss'],
self.network_info['model_info']['penalty_e'],
self.network_info['model_info']['discriminator_loss']]
load_dict = dict([(loss_name, getattr(loss_and_metric, loss_name)) for loss_name in loss_list])
load_dict['SelfAttention2D'] = SelfAttention2D
load_dict['get_qz_trick_loss'] = get_qz_trick_loss
load_dict['get_qz_trick_with_weight_loss'] = get_qz_trick_with_weight_loss
load_dict['get_entropy_loss_with_logits'] = get_entropy_loss_with_logits
load_dict['mmd_penalty'] = mmd_penalty
load_dict['get_b'] = get_b
load_dict['get_b_estimation_var'] = get_b_estimation_var
load_dict['get_b_penalty_loss'] = get_b_penalty_loss
load_dict['mean_reconstruction_l2sq_loss'] = mean_reconstruction_l2sq_loss
load_dict['get_class_mean_by_class_index'] = get_class_mean_by_class_index
load_dict['concat_with_uniform_sample'] = concat_with_uniform_sample
load_dict['get_batch_covariance'] = get_batch_covariance
load_dict['get_mutual_information_from_gaussian_sample'] = get_mutual_information_from_gaussian_sample
# TODO : fix save & load
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_base", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path), overwrite=False)
self._encoder_base_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_base", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_b", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path), overwrite=False)
self._encoder_b_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_b", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "encoder_e", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path), overwrite=False)
self._encoder_e_model.load_weights("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "encoder_e", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "decoder", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "decoder", model_path), overwrite=False)
self.decoder_model.load_weights("%s/tmp_%s_%s" % (filepath, "decoder", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "decoder", model_path))
tmp_model = load_model("%s/%s_%s" % (filepath, "discriminator", model_path), custom_objects=load_dict)
tmp_model.save_weights("%s/tmp_%s_%s" % (filepath, "discriminator", model_path), overwrite=False)
self._discriminator_e_model.load_weights("%s/tmp_%s_%s" % (filepath, "discriminator", model_path), by_name=True)
os.remove("%s/tmp_%s_%s" % (filepath, "discriminator", model_path))
self._discriminator_e_model.name = 'discriminator_e'
# build blocks
self.image_shape = self._encoder_base_model.input_shape[1:]
if self.feature_b: self.feature_shape = self._encoder_b_model.input_shape[1:]
self.b_z_dim = self._encoder_b_model.output_shape[-1]
self.e_z_dim = self._encoder_e_model.output_shape[-1]
real_image = Input(shape=self.image_shape, name='real_image_input', dtype='float32')
cls_info = Input(shape=(1,), name='class_info_input', dtype='int32')
prior_e_noise = Input(shape=(self.e_z_dim,), name='prior_e_input', dtype='float32')
prior_b_noise = Input(shape=(self.b_z_dim,), name='prior_b_input', dtype='float32')
b_input = Input(shape=(self.b_z_dim,), name='b_input', dtype='float32')
if self.feature_b: feature_for_b = Input(shape=self.feature_shape, name='feature_b_input', dtype='float32')
# Encoder base
last_h = self._encoder_base_model([real_image])
# B_i ~ Q_B|X=x^i
if self.feature_b: b_j_given_x_j = self._encoder_b_model([feature_for_b])
else: b_j_given_x_j = self._encoder_b_model([last_h])
sample_b, b_given_x = Lambda(get_b, name='get_b_given_x')([b_j_given_x_j, cls_info])
if self.feature_b: self.encoder_b_model = Model([feature_for_b, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
else: self.encoder_b_model = Model([real_image, cls_info], [sample_b, b_given_x, b_j_given_x_j], name='encoder_b_model')
# E^i_j ~Q_E_0|X_0,B=X^i_j,B_i
e_given_x_b = self._encoder_e_model([last_h, b_input])
self.encoder_e_model = Model([real_image, b_input], [e_given_x_b], name='encoder_e_model')
# Z^i_j = (B_i, E^i_j)
noise_input = Input(shape=(self.e_z_dim,), name='noise_input', dtype='float32')
if self.e_weight != 1.: noise_weighted = Lambda(lambda x : self.e_weight*x, name='noise_weighted')(noise_input)
else: noise_weighted = noise_input
latent = Concatenate(axis=1, name='concat_latent')([b_input, noise_weighted])
self.z_encoder_model = Model([b_input, noise_input], [latent], name='encoder_z_model')
# Build connections
e_given_x_b = self.encoder_e_model([real_image, b_input])
fake_latent = self.z_encoder_model([b_input, e_given_x_b])
recon_image = self.decoder_model(fake_latent)
self.ae_model = Model(inputs=[real_image, b_input], outputs=[recon_image], name='ae_model')
if verbose==2:
self.log.info('Auto-Encoder model')
self.ae_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
# GAN model
p_e = self._discriminator_e_model(prior_e_noise)
q_e = self._discriminator_e_model(e_given_x_b)
output = Concatenate(name='mlp_concat')([p_e, q_e]) ## TODO : fix..
self.gan_model = Model(inputs=[real_image, b_input, prior_e_noise], outputs=[output], name='GAN_model')
# WAE model
recon_error = Lambda(mean_reconstruction_l2sq_loss, name='mean_recon_error')([real_image, recon_image])
penalty_e = Lambda(get_qz_trick_loss, name='penalty_e')(q_e)
penalty_hsic = Lambda(get_hsic, name="penalty_hsic")([e_given_x_b, b_input])
self.main_model = Model(inputs=[real_image, b_input, cls_info],
outputs=[recon_error, penalty_e, penalty_hsic], name='main_model')
# Blur information
prior_latent = Input(shape=(self.b_z_dim + self.e_z_dim,), name='prior_z_input', dtype='float32')
self.blurr_model = get_compute_blurriness_model(self.image_shape)
gen_image = self.decoder_model(prior_latent)
gen_sharpness = self.blurr_model(gen_image)
self.gen_blurr_model = Model(inputs=[prior_latent], outputs=[gen_sharpness], name='gen_blurr_model')
if verbose==2:
self.log.info('Generative sample blurr model')
self.gen_blurr_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.model_compile()
self.log.info('Loaded WAE model')
self.main_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
self.log.info('Loaded Discriminator model: GAN')
self.gan_model.summary(line_length=200, print_fn=self.log.info)
sys.stdout.flush()
def discriminator_sampler(self, x, y):
e_noise = self.noise_sampler(y.shape[0], self.e_z_dim, self.e_sd)
## TODO: not using feature_b
return [x[0], x[2], e_noise], [np.zeros([y.shape[0],2], dtype='float32')]
def main_sampler(self, x, y):
## TODO: not using feature_b
return [x[0], x[2], y[:,np.newaxis]], [np.zeros(y.shape[0], dtype='float32')]*3
def train_on_batch(self, x, y, sample_weight=None, class_weight=None, reset_metrics=True):
wx, wy = self.main_sampler(x, y)
main_outs = self.parallel_main_model.train_on_batch(wx, wy,
sample_weight=sample_weight, class_weight=class_weight, reset_metrics=reset_metrics)
dx, dy = self.discriminator_sampler(x, y)
if self.e_train: d_outs = self.parallel_gan_model.train_on_batch(dx, dy,
sample_weight=sample_weight, class_weight=class_weight, reset_metrics=reset_metrics)
else: d_outs = 0
return (main_outs + [d_outs]
)
def test_on_batch(self, x, y, sample_weight=None, reset_metrics=True):
wx, wy = self.main_sampler(x, y)
main_outs = self.parallel_main_model.test_on_batch(wx, wy,
sample_weight=sample_weight, reset_metrics = reset_metrics)
dx, dy = self.discriminator_sampler(x, y)
if self.e_train: d_outs = self.parallel_gan_model.test_on_batch(dx, dy,
sample_weight=sample_weight, reset_metrics = reset_metrics)
else: d_outs = 0
return (main_outs + [d_outs]
)
def get_reference_images(self, generator):
## TODO: not using feature_b
batches = [generator[i] for i in range(4)]
self.fixed_classes = np.concatenate([batch[1] for batch in batches])
if self.feature_b:
self.fixed_feature = np.concatenate([batch[0][1] for batch in batches])
return np.concatenate([batch[0][0] for batch in batches])
else:
return np.concatenate([batch[0] for batch in batches])
def on_train_begin(self, x):
self.fixed_images = x[self.fixed_classes == self.fixed_classes[0]]
if self.feature_b: self.fixed_feature = self.fixed_feature[self.fixed_classes == self.fixed_classes[0]]
self.fixed_classes = self.fixed_classes[self.fixed_classes == self.fixed_classes[0]]
real_image_blurriness = self.blurr_model.predict_on_batch(x)
self.fixed_noise = self.noise_sampler(x.shape[0], self.e_z_dim, self.e_sd)
self.log.info("Real image's sharpness = %.5f" % np.min(real_image_blurriness))
def on_epoch_end(self, epoch):
for name in self.train_models_lr.keys():
if self.train_models_lr[name]['decay'] > 0.:
self.train_models_lr[name]['lr'] = self._update_lr(epoch, lr=self.train_models_lr[name]['lr'],
decay=self.train_models_lr[name]['decay'])
k.set_value(self.parallel_train_models[name].optimizer.lr, self.train_models_lr[name]['lr'])
#####################################################################################################################
# main.py (famouskaykay/dontplaywithme, Apache-2.0)
#####################################################################################################################
from pyrogram.types import (
Message,
)
from pyrogram import filters, Client
from pyrogram.types import InlineKeyboardButton, InlineKeyboardMarkup
import requests
import os
import re
API_ID = os.environ.get("API_ID", None)
API_HASH = os.environ.get("API_HASH", None)
TOKEN = os.environ.get("TOKEN", None)
BOT_ID = os.environ.get("BOT_ID", None)
kuki = Client(
"KukiBot",
api_id=API_ID,
api_hash=API_HASH,
bot_token=TOKEN,
)
@kuki.on_message(
filters.text
& filters.reply
& ~filters.bot
& ~filters.edited,
group=2,
)
async def kukiai(client: Client, message: Message):
msg = message.text
chat_id = message.chat.id
Kuki = requests.get(f"https://kuki-api.tk/api/botname/owner/message={msg}").json()
moezilla = f"{Kuki['reply']}"
await client.send_chat_action(message.chat.id, "typing")
await message.reply_text(moezilla)
messageprivate = '''
Hi, I'm Kuki Chat Bot
'''
messagegroup = '''
Hi, I'm Kuki Chat Bot
'''
@kuki.on_message(filters.command("start"))
async def start(_, message):
self = await kuki.get_me()
busername = self.username
if message.chat.type != "private":
await message.reply_text(messagegroup)
return
else:
buttons = [[InlineKeyboardButton("Github", url="https://github.com/famouskaykay"),
]]
await message.reply_text(messageprivate, reply_markup=InlineKeyboardMarkup(buttons))
kuki.run()
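The handler above indexes `Kuki['reply']` directly, so an API response without that key raises `KeyError` and kills the update. A hedged, hypothetical helper (not part of the original file) that fails soft instead:

```python
# Hypothetical helper, not in the original bot: pull the 'reply' field out of
# the chat-API JSON defensively so a malformed response cannot crash the handler.
def extract_reply(payload, fallback="Sorry, I didn't get that."):
    if isinstance(payload, dict) and isinstance(payload.get("reply"), str):
        return payload["reply"]
    return fallback
```

The handler could then call `extract_reply(Kuki)` in place of the bare `Kuki['reply']` lookup.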
#####################################################################################################################
# Hackerearth Set/MelodiuosPassword.py (Siddharth2016/PYTHON3_prog, MIT)
#####################################################################################################################
# MELODIOUS PASSWORD
from sys import stdout
cons = ['b','c','d','f','g','h','j','k','l','m','n','p','q','r','s','t','v','w','x','z']
vow = ['a','e','i','o','u']
n = int(input())
if n==1:
for a in cons:
print(a)
for a in vow:
print(a)
elif n==2:
i=j=0
while i<20:
A = ['']*n
for j in range(0,n,2):
A[j] = cons[i]
for a in vow:
for j in range(1,n,2):
A[j] = a
stdout.write(''.join(A)+"\n")
i += 1
i = 0
while i<5:
A = ['']*n
for j in range(0,n,2):
A[j] = vow[i]
for a in cons:
for j in range(1,n,2):
A[j] = a
stdout.write(''.join(A)+"\n")
i += 1
elif n==3:
i=j=k=0
A = ['']*n
for i in cons:
A[j] = i
for a in vow:
A[j+1] = a
for b in cons:
A[j+2] = b
stdout.write(''.join(A)+"\n")
for i in vow:
A[j] = i
for a in cons:
A[j+1] = a
for b in vow:
A[j+2] = b
stdout.write(''.join(A)+"\n")
elif n==4:
i=j=k=0
A = ['']*n
for i in cons:
A[j] = i
for a in vow:
A[j+1] = a
for b in cons:
A[j+2] = b
for c in vow:
A[j+3] = c
stdout.write(''.join(A)+"\n")
for i in vow:
A[j] = i
for a in cons:
A[j+1] = a
for b in vow:
A[j+2] = b
for c in cons:
A[j+3] = c
stdout.write(''.join(A)+"\n")
elif n==5:
i=j=0
#count = 0
A = ['']*n
for i in cons:
A[j] = i
for a in vow:
A[j+1] = a
for b in cons:
A[j+2] = b
for c in vow:
A[j+3] = c
for d in cons:
A[j+4] = d
stdout.write(''.join(A)+"\n")
#count += 1
for i in vow:
A[j] = i
for a in cons:
A[j+1] = a
for b in vow:
A[j+2] = b
for c in cons:
A[j+3] = c
for d in vow:
A[j+4] = d
stdout.write(''.join(A)+"\n")
#count += 1
#print(count)
elif n==6:
i=j=0
A = ['']*n
for i in cons:
A[j] = i
for a in vow:
A[j+1] = a
for b in cons:
A[j+2] = b
for c in vow:
A[j+3] = c
for d in cons:
A[j+4] = d
for e in vow:
A[j+5] = e
stdout.write(''.join(A)+"\n")
for i in vow:
A[j] = i
for a in cons:
A[j+1] = a
for b in vow:
A[j+2] = b
for c in cons:
A[j+3] = c
for d in vow:
A[j+4] = d
for e in cons:
A[j+5] = e
stdout.write(''.join(A)+"\n")
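The per-length branches above hard-code the alternating consonant/vowel pattern up to n = 6 (note the `A[j+2] = a` slips in the n == 5 branches, which should assign `b`). As a cross-check, the same language can be generated for any n with one recursive helper — a sketch, not a drop-in replacement, since its output ordering differs from the branch-by-branch version:

```python
CONS = "bcdfghjklmnpqrstvwxz"
VOWS = "aeiou"

def melodious(n):
    """Yield every length-n string that strictly alternates between
    consonants and vowels, starting with either letter class."""
    def build(prefix, use_cons):
        if len(prefix) == n:
            yield prefix
            return
        for ch in (CONS if use_cons else VOWS):
            yield from build(prefix + ch, not use_cons)
    yield from build("", True)    # consonant-first words
    yield from build("", False)   # vowel-first words

# Counts are 20^a * 5^b summed over the two starting classes:
print(sum(1 for _ in melodious(1)))  # 25
print(sum(1 for _ in melodious(2)))  # 200
print(sum(1 for _ in melodious(3)))  # 2500
```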
| 24.742647 | 88 | 0.28737 | 496 | 3,365 | 1.949597 | 0.108871 | 0.08273 | 0.1303 | 0.148914 | 0.802482 | 0.780765 | 0.780765 | 0.755946 | 0.755946 | 0.694933 | 0 | 0.037966 | 0.561664 | 3,365 | 135 | 89 | 24.925926 | 0.617627 | 0.017533 | 0 | 0.860656 | 0 | 0 | 0.013636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008197 | 0 | 0.008197 | 0.016393 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
564e2bdf86459f68183da82bd9895add43bd337d | 11,151 | py | Python | tests/test_v1/test_meal.py | matthewacha/BookAMeal | 6af638a4cf71d72dd2a5fa80ba0e908b7ef70cf5 | [
"MIT"
] | null | null | null | tests/test_v1/test_meal.py | matthewacha/BookAMeal | 6af638a4cf71d72dd2a5fa80ba0e908b7ef70cf5 | [
"MIT"
] | null | null | null | tests/test_v1/test_meal.py | matthewacha/BookAMeal | 6af638a4cf71d72dd2a5fa80ba0e908b7ef70cf5 | [
"MIT"
] | 1 | 2018-08-20T11:57:23.000Z | 2018-08-20T11:57:23.000Z | import unittest
import json
import random
from app import APP
def login(tester):
emails=random.choice(['an@gmail.com','me@gmail.com','dou@gmail.com'])
tester.post('api/v1/auth/signup',content_type='application/json',
data =json.dumps( dict(email=emails,
password='lantern')))
login = tester.post('api/v1/auth/login',content_type='application/json',
data =json.dumps( dict(email=emails,
password='lantern')))
return login
class TestMeal(unittest.TestCase):
def setUp(self):
self.tester = APP.test_client(self)
def test_create_meal(self):
"""test that a meal option can be succesfully added"""
login_=login(self.tester)
result = json.loads(login_.data.decode())
response= self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict(access_token = result['token']))
self.assertIn(u'Successfully added meal option', response.data)
self.assertEqual(response.status_code, 201)
def test_unauthorized_create_meal(self):
"""test that a meal option cannot be succesfully added"""
response= self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict(access_token = u'123456'))
self.assertIn(u'Unauthorized access, please login', response.data)
self.assertEqual(response.status_code, 401)
response= self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict())
self.assertIn(u"Token is missing", response.data)
self.assertEqual(response.status_code, 401)
def test_fail_create_meal(self):
"""test that a meal option cannot be added if string price
is passed instead of integer"""
login_=login(self.tester)
resv = json.loads(login_.data.decode())
response= self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries and wine',
price='678addf')),
headers =dict(access_token = resv['token']))
result=json.loads(response.data.decode())
self.assertEqual(result['message'], u"Please put in an integer")
self.assertEqual(response.status_code, 401)
response= self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries and wine',
price='67800')),
headers =dict(access_token = resv['token']))
result=json.loads(response.data.decode())
self.assertEqual(result['message'], u"Please put in an integer")
self.assertEqual(response.status_code, 401)
def test_get_all_meals(self):
"""test that all meal options can be retrieved"""
self.tester.post('api/v1/auth/signup',content_type='application/json',
data =json.dumps( dict(email="jones@gmail.com",
password='lantern')))
login = self.tester.post('api/v1/auth/login',content_type='application/json',
data =json.dumps( dict(email="jones@gmail.com",
password='lantern')))
resv = json.loads(login.data.decode())
self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Friess',
price=4000)),
headers =dict(access_token = resv['token']))
self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Pizza pepper',
price=7000)),
headers =dict(access_token = resv['token']))
response=self.tester.get('api/v1/meals/',headers =dict(access_token = resv['token']))
result=json.loads(response.data.decode())
self.assertEqual(len(result["Meals"]), 2)
def test_update_meal(self):
"""test that a meal option can be updated"""
self.tester.post('api/v1/auth/signup',content_type='application/json',
data =json.dumps( dict(email="amos@gmail.com",
password='lantern')))
login = self.tester.post('api/v1/auth/login',content_type='application/json',
data =json.dumps( dict(email="amos@gmail.com",
password='lantern')))
resv = json.loads(login.data.decode())
self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Yorghut',
price=55000)),
headers =dict(access_token = resv['token']))
self.tester.post('api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fishes',
price=4500)),
headers =dict(access_token = resv['token']))
response=self.tester.put('api/v1/meals/6', data =json.dumps( dict(name='Fries',
price=55000)),
headers =dict(access_token = resv['token']))
result=json.loads(response.data.decode())
self.assertEqual(result['message'], "Successfully edited")
def test_fail_update_meal(self):
"""test that a non existent meal option cannot be updated """
self.tester.post('api/v1/auth/signup',content_type='application/json',
data =json.dumps( dict(email="many@gmail.com",
password='lantern')))
login = self.tester.post('api/v1/auth/login',content_type='application/json',
data =json.dumps( dict(email="many@gmail.com",
password='lantern')))
resv = json.loads(login.data.decode())
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict(access_token = resv['token']))
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Beans',
price=5000)),
headers =dict(access_token = resv['token']))
response=self.tester.put('api/v1/meals/17', data =json.dumps( dict(name='Fries',
price=55000)),
headers =dict(access_token = resv['token']))
self.assertIn(u'Meal option does not exist', response.data)
self.assertEqual(response.status_code, 404)
self.tester.post('api/v1/auth/signup',content_type='application/json',
data =json.dumps( dict(email="andrew@gmail.com",
password='lantern')))
login = self.tester.post('api/v1/auth/login',content_type='application/json',
data =json.dumps( dict(email="andrew@gmail.com",
password='lantern')))
resv = json.loads(login.data.decode())
response=self.tester.put('api/v1/meals/2', data =json.dumps( dict(name='Fries',
price=55000)),
headers =dict(access_token = resv['token']))
self.assertIn(u"Youre not authorized to do this", response.data)
self.assertEqual(response.status_code, 404)
def test_delete_meal(self):
"""test that a meal can be deleted """
login_=login(self.tester)
resv = json.loads(login_.data.decode())
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict(access_token = resv['token']))
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Beans',
price=5000)),
headers =dict(access_token = resv['token']))
response=self.tester.delete('/api/v1/meals/2',
headers =dict(access_token = resv['token']))
self.assertIn(u'Successfully deleted meal', response.data)
self.assertEqual(response.status_code, 200)
def test_fail_to_delete_meal(self):
"""test that a meal option can only be deleted by admin"""
login_=login(self.tester)
resv = json.loads(login_.data.decode())
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Fries',
price=5000)),
headers =dict(access_token = resv['token']))
self.tester.post('/api/v1/meals/', content_type='application/json',
data =json.dumps( dict(name='Beans',
price=5000)),
headers =dict(access_token = resv['token']))
response=self.tester.delete('/api/v1/meals/34',
headers =dict(access_token = resv['token']))
self.assertIn(u'Youre not authorized to do this', response.data)
self.assertEqual(response.status_code, 401)
if __name__=='__main__':
unittest.main()  # pragma: no cover
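Every test above rebuilds the same `content_type`, JSON body, and token header by hand. A small pair of helpers could factor that repetition out — the helper names below are hypothetical, shown only to illustrate the pattern:

```python
import json

def auth_headers(token):
    """Header dict in the shape the meal endpoints expect."""
    return dict(access_token=token)

def json_body(**fields):
    """Serialize keyword arguments into a JSON request body."""
    return json.dumps(dict(fields))

# e.g. tester.post('api/v1/meals/', content_type='application/json',
#                  data=json_body(name='Fries', price=5000),
#                  headers=auth_headers(resv['token']))
print(auth_headers("abc123"))  # {'access_token': 'abc123'}
print(json_body(name="Fries", price=5000))
```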
| 56.604061 | 100 | 0.489104 | 1,094 | 11,151 | 4.900366 | 0.1234 | 0.063421 | 0.067898 | 0.088789 | 0.85637 | 0.848536 | 0.843126 | 0.817758 | 0.787167 | 0.753031 | 0 | 0.021771 | 0.390369 | 11,151 | 196 | 101 | 56.892857 | 0.766843 | 0.038203 | 0 | 0.675 | 0 | 0 | 0.151087 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.0625 | false | 0.0625 | 0.025 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
56a5de78977a445272edb0f1b94c18d43ede2360 | 107 | py | Python | kobra/__init__.py | agaev1/kobra | d4c6ee0883fea53b4667524adac2e5d4cce20be2 | [
"MIT"
] | null | null | null | kobra/__init__.py | agaev1/kobra | d4c6ee0883fea53b4667524adac2e5d4cce20be2 | [
"MIT"
] | null | null | null | kobra/__init__.py | agaev1/kobra | d4c6ee0883fea53b4667524adac2e5d4cce20be2 | [
"MIT"
] | null | null | null | from kobra.functions_general import *
from kobra.enums_currencies import *
from kobra.utils_dates import *
| 26.75 | 37 | 0.831776 | 15 | 107 | 5.733333 | 0.6 | 0.313953 | 0.348837 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11215 | 107 | 3 | 38 | 35.666667 | 0.905263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3b0fb8a7a09bc46c8049399d31873ea5f03c942f | 913 | py | Python | kiki/commons/arg/__init__.py | deuxksy/Queen | d673ebabcd52d557c690edeb77b781d57a5f5e65 | [
"MIT"
] | null | null | null | kiki/commons/arg/__init__.py | deuxksy/Queen | d673ebabcd52d557c690edeb77b781d57a5f5e65 | [
"MIT"
] | null | null | null | kiki/commons/arg/__init__.py | deuxksy/Queen | d673ebabcd52d557c690edeb77b781d57a5f5e65 | [
"MIT"
] | null | null | null | #-*- coding: utf-8 -*-
# def get_args_config():
# try:
# parser = argparse.ArgumentParser(description='default argument')
# parser.add_argument("-config", "--config_path", type=str, help="사용할 config 파일의 위치를 입력해주세요", required=True)
# return parser.parse_args()
# except:
# sys.exit(1)
#
# def get_args():
# try:
# parser = argparse.ArgumentParser(description='default argument')
# parser.add_argument("-sport", "--sport_id", help="oddsbox.meta_cmm_sport.sport_id", required=True)
# return parser.parse_args()
# except:
# sys.exit(1)
#
#
# def get_args_sync_meta():
# try:
# parser = argparse.ArgumentParser(description='default argument')
# parser.add_argument("-sport", "--sport_id", help="oddsbox.meta_cmm_sport.sport_id", required=True)
# return parser.parse_args()
# except:
# sys.exit(1)
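The commented-out parsers above would need `import argparse` and `import sys` before they could run. Below is a working sketch of the `-sport` variant, with `argv` injectable so it can be exercised without touching `sys.argv`; the bare `try/except` wrapper from the commented code is dropped for brevity:

```python
import argparse

def get_args(argv=None):
    parser = argparse.ArgumentParser(description="default argument")
    parser.add_argument("-sport", "--sport_id",
                        help="oddsbox.meta_cmm_sport.sport_id", required=True)
    return parser.parse_args(argv)

args = get_args(["-sport", "7"])
print(args.sport_id)  # 7
```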
| 35.115385 | 116 | 0.624315 | 106 | 913 | 5.179245 | 0.349057 | 0.07286 | 0.087432 | 0.169399 | 0.850638 | 0.850638 | 0.850638 | 0.850638 | 0.850638 | 0.850638 | 0 | 0.005602 | 0.217963 | 913 | 25 | 117 | 36.52 | 0.763305 | 0.945235 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
3b480932bd5949b551dbc09f760ebef5b995d503 | 24,400 | py | Python | lib/VisualizeMars.py | bineet-coderep/Uncertain-Linear-System | 911ff9d66ab97b156e5b51b3583d6b6a584870ac | [
"AFL-1.1"
] | 1 | 2020-11-20T07:38:40.000Z | 2020-11-20T07:38:40.000Z | lib/VisualizeMars.py | bineet-coderep/Uncertain-Linear-System | 911ff9d66ab97b156e5b51b3583d6b6a584870ac | [
"AFL-1.1"
] | null | null | null | lib/VisualizeMars.py | bineet-coderep/Uncertain-Linear-System | 911ff9d66ab97b156e5b51b3583d6b6a584870ac | [
"AFL-1.1"
] | null | null | null | import os,sys
PROJECT_ROOT = os.environ['ULS_ROOT_DIR']
sys.path.append(PROJECT_ROOT)
from Parameters import *
import pickle
import matplotlib.pyplot as plt
import statistics as stat
import numpy as np
import seaborn as sns
import pandas as pd
from matplotlib.patches import Ellipse
import sys,os
import matplotlib
from gurobipy import *
from lib.StarOperations import *
import math
import itertools as it
class VisualizeRS:
'''
Visualize a reachable set
'''
def vizRS(appx,th1,th2,fname="reachSet"):
'''
Visualize the given reachable set
'''
plt.autoscale(enable=True, axis='both', tight=False)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
(X,Y)=VizRS.getPlotsLineFine(appx,th1,th2)
plt.plot(X,Y,'bo')
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizTraj(traj,th1,th2,fname="trajs"):
'''
Visualize a trajectory, i.e. a set of reachable sets
'''
plt.autoscale(enable=True, axis='both', tight=False)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
pltList=VizRS.getPlotsLineFineList(traj,th1,th2)
for pt in pltList:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt)
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizTrajRobotCloud(traj_rbt,traj_cld,th1,th2,fname="trajsRC"):
'''
Visualize a trajectory, i.e. a set of reachable sets
for both robot and cloud
'''
plt.autoscale(enable=True, axis='both', tight=False)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
pltList_rbt=VizRS.getPlotsLineFineList(traj_rbt,th1,th2)
pltList_cld=VizRS.getPlotsLineFineList(traj_cld,th1,th2)
for pt in pltList_cld:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'g.')
for pt in pltList_rbt:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'b.')
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizTrajRobotCloudData(traj_rbt,traj_cld,title,th1=0,th2=1,fname="data_info"):
'''
Visualize a trajectory, i.e. a set of reachable sets
for both robot and cloud
'''
plt.autoscale(enable=True, axis='both', tight=False)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
plt.title(title)
pltList_rbt=VizRS.getPlotsLineFineList(traj_rbt,th1,th2)
pltList_cld=VizRS.getPlotsLineFineList(traj_cld,th1,th2)
for pt in pltList_cld:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'g.')
for pt in pltList_rbt:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'b.')
if True:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizUnsafeSet(unsafelist,pointClouds):
n=len(unsafelist[0][0])
for q in it.combinations(list(range(n)),2):
th1=q[0]
th2=q[1]
#print(th1,th2)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
pltList_unsafe=VizRS.getPlotsLineFineList(unsafelist,th1,th2)
for pt in pltList_unsafe:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'g.')
# Plot the scatter points
for pointCloud in pointClouds:
X=[p[th1] for p in pointCloud]
Y=[p[th2] for p in pointCloud]
plt.scatter(X, Y, c='r',s=100)
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizTrajRobotCloudAllSides(traj_rbt,traj_cld,unsafelist,pointClouds,fname="trajsRC"):
'''
Visualize a trajectory, i.e. a set of reachable sets
for both robot and cloud
'''
plt.autoscale(enable=True, axis='both', tight=False)
n=len(traj_cld[0][0])
for q in it.combinations(list(range(n)),2):
th1=q[0]
th2=q[1]
#print(th1,th2)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
pltList_rbt=VizRS.getPlotsLineFineList(traj_rbt,th1,th2)
pltList_cld=VizRS.getPlotsLineFineList(traj_cld,th1,th2)
pltList_unsafe=VizRS.getPlotsLineFineList(unsafelist,th1,th2)
# Plot the scatter points
for pointCloud in pointClouds:
X=[p[th1] for p in pointCloud]
Y=[p[th2] for p in pointCloud]
plt.scatter(X, Y, c='b',s=2)
for pt in pltList_unsafe:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'r.')
for pt in pltList_cld:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'g.')
for pt in pltList_rbt:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'b.')
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizTrajRobotCloudExpAllSides(traj_rbt,traj_cld,traj_exp,unsafelist,title,fname="trajsRC_Exp"):
'''
Visualize a trajectory, i.e. a set of reachable sets
for both robot and cloud
'''
plt.autoscale(enable=True, axis='both', tight=False)
plt.title(title)
n=len(traj_cld[0][0])
for q in it.combinations(list(range(n)),2):
th1=q[0]
th2=q[1]
#print(th1,th2)
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
pltList_rbt=VizRS.getPlotsLineFineList(traj_rbt,th1,th2)
pltList_cld=VizRS.getPlotsLineFineList(traj_cld,th1,th2)
pltList_expCld=VizRS.getPlotsLineFineList(traj_exp,th1,th2)
pltList_unsafe=VizRS.getPlotsLineFineList(unsafelist,th1,th2)
for pt in pltList_unsafe:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'r.')
for pt in pltList_expCld:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'c.')
for pt in pltList_cld:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'k.')
for pt in pltList_rbt:
Xt=pt[0] # RS X
Yt=pt[1] # RS Y
plt.plot(Xt,Yt,'b.')
if False:
plt.savefig(ROVER_RESULTS+fname)
plt.close()
else:
plt.show()
def vizBloating(star,bloatedStar):
'''
This is just for debugging purposes
'''
plt.autoscale(enable=True, axis='both', tight=False)
n=len(star[0])
for q in it.combinations(list(range(n)),2):
th1=q[0]
th2=q[1]
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
(X,Y)=VizRS.getPlotsLineFine(star,th1,th2)
(X_bloat,Y_bloat)=VizRS.getPlotsLineFine(bloatedStar,th1,th2)
plt.plot(X,Y,'bo')
plt.plot(X_bloat,Y_bloat,'co')
plt.show()
def vizBloating3(robot,cloud,bloatedRobot):
'''
This is just for debugging purposes
'''
plt.autoscale(enable=True, axis='both', tight=False)
n=len(robot[0])
for q in it.combinations(list(range(n)),2):
th1=q[0]
th2=q[1]
plt.xlabel("State "+str(th1))
plt.ylabel("State "+str(th2))
(X_r,Y_r)=VizRS.getPlotsLineFine(robot,th1,th2)
(X_c,Y_c)=VizRS.getPlotsLineFine(cloud,th1,th2)
(X_rb,Y_rb)=VizRS.getPlotsLineFine(bloatedRobot,th1,th2)
plt.plot(X_r,Y_r,'bo')
plt.plot(X_rb,Y_rb,'co')
plt.plot(X_c,Y_c,'ro')
plt.show()
class VisualizeMarsRover:
'''
Visualization APIs related to Mars Rover Data
'''
def getPlotsLineFine(star,theta1,theta2):
'''
Returns the list of points (x,y)
for the reachable set st1
'''
C=star[0]
V=star[1]
P=star[2]
X_list=[]
Y_list=[]
sv=V.shape[0]
aS=V.shape[1]
semiDefFlag=False
model = Model("qp")
model.setParam( 'OutputFlag', False )
# Create Predicate Variables
predVars=[]
for i in range(aS):
name="Pred"+str(i)
predVars.append(model.addVar(-GRB.INFINITY,GRB.INFINITY,name=name,vtype='C'))
#-----------------------
# Axes Variables
X=model.addVar(-GRB.INFINITY,GRB.INFINITY,name="X",vtype='C')
Y=model.addVar(-GRB.INFINITY,GRB.INFINITY,name="Y",vtype='C')
#-------------------------
# Create the Star Constraints
objX=0
for i in range(aS):
objX=objX+(predVars[i]*V[theta1][i])
objX=C[theta1]+objX
objY=0
for i in range(aS):
objY=objY+(predVars[i]*V[theta2][i])
objY=C[theta2]+objY
model.addConstr(X==objX,"X Axis")
model.addConstr(Y==objY,"Y Axis")
#-----------------------------------
# Predicate Constraints
for i in range(aS):
a=P[i][0]
b=P[i][1]
if a==b:
model.addConstr(predVars[i]==a,name)
else:
model.addConstr(predVars[i]>=min(a,b),name+".1")
model.addConstr(predVars[i]<=max(a,b),name+".2")
#-----------------------------------
# Quadrant Specific Constraints
# 1st Quadrant
obj=X+Y
model.setObjective(obj,GRB.MAXIMIZE)
for a in range(900):
an=a/10
if an==90:
model.addConstr(X==0,"Angle")
else:
m=math.tan(math.radians(an))
model.addConstr(Y==m*(X),"Angle")
try:
model.optimize()
#model.write("dump.bas")
status = model.Status
if status==GRB.Status.UNBOUNDED:
print("UNBOUNDED ")
else:
if status == GRB.Status.INF_OR_UNBD or \
status == GRB.Status.INFEASIBLE or \
status == GRB.Status.UNBOUNDED:
pass
#print('**The model cannot be solved because it is infeasible or unbounded**')
else:
xVal=model.getVarByName("X").x
yVal=model.getVarByName("Y").x
X_list.append(xVal)
Y_list.append(yVal)
except:
semiDefFlag=True
if semiDefFlag==True:
print("Shoot!!")
semiDefFlag=False
model.remove(model.getConstrByName("Angle"))
#-----------------------------
#'''
# 2nd Quadrant
obj=X-Y
model.setObjective(obj,GRB.MAXIMIZE)
for a in range(0,-900,-1):
an=a/10
if an==-90:
model.addConstr(X==0,"Angle")
else:
m=math.tan(math.radians(an))
model.addConstr(Y==m*X,"Angle")
try:
model.optimize()
#model.write("dump.bas")
status = model.Status
if status==GRB.Status.UNBOUNDED:
print("UNBOUNDED ")
else:
if status == GRB.Status.INF_OR_UNBD or \
status == GRB.Status.INFEASIBLE or \
status == GRB.Status.UNBOUNDED:
pass
#print('**The model cannot be solved because it is infeasible or unbounded**')
else:
xVal=model.getVarByName("X").x
yVal=model.getVarByName("Y").x
X_list.append(xVal)
Y_list.append(yVal)
except:
semiDefFlag=True
if semiDefFlag==True:
print("Shoot!!")
semiDefFlag=False
model.remove(model.getConstrByName("Angle"))
#-----------------------------
# 3rd Quadrant
obj=-X-Y
model.setObjective(obj,GRB.MAXIMIZE)
for a in range(-900,-1800,-1):
an=a/10
if an==-90:
model.addConstr(X==0,"Angle")
else:
m=math.tan(math.radians(an))
model.addConstr(Y==m*X,"Angle")
try:
model.optimize()
#model.write("dump.bas")
status = model.Status
if status==GRB.Status.UNBOUNDED:
print("UNBOUNDED ")
else:
if status == GRB.Status.INF_OR_UNBD or \
status == GRB.Status.INFEASIBLE or \
status == GRB.Status.UNBOUNDED:
pass
#print('**The model cannot be solved because it is infeasible or unbounded**')
else:
xVal=model.getVarByName("X").x
yVal=model.getVarByName("Y").x
X_list.append(xVal)
Y_list.append(yVal)
except:
semiDefFlag=True
if semiDefFlag==True:
print("Shoot!!")
semiDefFlag=False
model.remove(model.getConstrByName("Angle"))
#-----------------------------
# 4th Quadrant
obj=-X+Y
model.setObjective(obj,GRB.MAXIMIZE)
for a in range(900,1800):
an=a/10
if an==90:
model.addConstr(X==0,"Angle")
else:
m=math.tan(math.radians(an))
model.addConstr(Y==m*X,"Angle")
try:
model.optimize()
#model.write("dump.bas")
status = model.Status
if status==GRB.Status.UNBOUNDED:
print("UNBOUNDED ")
else:
if status == GRB.Status.INF_OR_UNBD or \
status == GRB.Status.INFEASIBLE or \
status == GRB.Status.UNBOUNDED:
pass
#print('**The model cannot be solved because it is infeasible or unbounded**')
else:
xVal=model.getVarByName("X").x
yVal=model.getVarByName("Y").x
X_list.append(xVal)
Y_list.append(yVal)
except:
semiDefFlag=True
if semiDefFlag==True:
print("Shoot!!")
semiDefFlag=False
model.remove(model.getConstrByName("Angle"))
#-----------------------------
#------------------------------
#print(X_list,Y_list)
#exit(0)
#'''
return (X_list,Y_list)
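Because every predicate of the star is just an interval, the linear programs that `getPlotsLineFine` solves have a separable closed form: for a linear objective, each predicate variable sits at whichever bound matches the sign of its coefficient. The sketch below samples a single boundary point by direction (numpy assumed) — it is a solver-free alternative to the angle-ray Gurobi loop above, not a re-implementation of it:

```python
import numpy as np

def support_point(C, V, P, d):
    """Boundary point of the star {C + V @ a : P[i][0] <= a_i <= P[i][1]}
    that maximizes the linear direction d, found without an LP solver."""
    C = np.asarray(C, dtype=float)
    V = np.asarray(V, dtype=float)
    P = np.asarray(P, dtype=float)
    d = np.asarray(d, dtype=float)
    lo = np.minimum(P[:, 0], P[:, 1])
    hi = np.maximum(P[:, 0], P[:, 1])
    coef = d @ V                      # objective weight on each alpha_i
    alpha = np.where(coef > 0, hi, lo)
    return C + V @ alpha

# Unit square centred at the origin: the point maximizing x + y is (1, 1).
pt = support_point([0, 0], np.eye(2), [[-1, 1], [-1, 1]], [1, 1])
print(pt)  # [1. 1.]
```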
def vizMarsPointClouds(obs, terrain_img, fname="PointCloud", xlim=(0,224), ylim=(0,224), title_str="Point Clouds"):
'''
Modified from: https://bitbucket.org/nakanoya/tasknet-icra2021/src/master/Motion_Planning/utils/visualize_utils.py
'''
xmin = xlim[0]
xmax = xlim[1]
ymin = ylim[0]
ymax = ylim[1]
fig = plt.figure()
# point cloud obstacle
plt.scatter(obs[:,0], obs[:,1], c='red', s=2)
linestyle = 'solid'
color = 'b'
ax = fig.gca()
ax.set_xticks(np.arange(xmin,xmax,50))
ax.set_yticks(np.arange(ymin,ymax,50))
ax.set_aspect('equal')
ax.imshow(terrain_img, cmap='gray')
plt.xlim(xmin, xmax)
plt.ylim(ymax, ymin)
plt.grid()
plt.title(title_str)
if True:
plt.savefig(ROVER_RESULTS+"/"+fname, dpi=100, bbox_inches='tight')
plt.close()
else:
plt.show()
def vizMarsTraj(obs, terrain_img, planned_path_xy, fname="Traj", xlim=(0,224), ylim=(0,224), title_str="Point Clouds"):
'''
Modified from: https://bitbucket.org/nakanoya/tasknet-icra2021/src/master/Motion_Planning/utils/visualize_utils.py
'''
xmin = xlim[0]
xmax = xlim[1]
ymin = ylim[0]
ymax = ylim[1]
fig = plt.figure()
# point cloud obstacle
plt.scatter(obs[:,0], obs[:,1], c='red', s=2)
linestyle = 'solid'
color = 'b'
for path_xy in planned_path_xy:
path_x = path_xy[0]
path_y = path_xy[1]
plt.plot(path_x, path_y, c=color, marker='o', linewidth=0.5, markersize=0.5, linestyle=linestyle, alpha=1.0)
# start
#print(planned_path_xy[0][0], planned_path_xy[1][0])
plt.plot(planned_path_xy[0][0], planned_path_xy[0][1], c='g', marker='o', markersize=6)
# end
plt.plot(planned_path_xy[-1][0], planned_path_xy[-1][1], c='y', marker='s', markersize=6)
ax = fig.gca()
ax.set_xticks(np.arange(xmin,xmax,50))
ax.set_yticks(np.arange(ymin,ymax,50))
ax.set_aspect('equal')
ax.imshow(terrain_img, cmap='gray')
plt.xlim(xmin, xmax)
plt.ylim(ymax, ymin)
plt.grid()
plt.title(title_str)
if True:
plt.savefig(ROVER_RESULTS+"/"+fname, dpi=100, bbox_inches='tight')
plt.close()
else:
plt.show()
def vizMarsTrajMPC(obs, terrain_img, path_ref, spline_path, path_mpc, fname="TrajMPC", xlim=(0,224), ylim=(0,224), title_str="Point Clouds"):
'''
Modified from: https://bitbucket.org/nakanoya/tasknet-icra2021/src/master/Motion_Planning/utils/visualize_utils.py
'''
xmin = xlim[0]
xmax = xlim[1]
ymin = ylim[0]
ymax = ylim[1]
fig = plt.figure()
# point cloud obstacle
plt.scatter(obs[:,0], obs[:,1], c='red', s=2)
linestyle = 'solid'
color = 'b'
plt.plot(spline_path[0], spline_path[1], c=color, marker='o', linewidth=1, markersize=1, linestyle=linestyle, alpha=1.0)
for path in path_ref:
x_ref=[p[0] for p in path]
y_ref=[p[1] for p in path]
plt.scatter(x_ref, y_ref, marker='*', c='blue', s=70)
# start
plt.plot(path_ref[0][0][0], path_ref[0][0][1], c='g', marker='o', markersize=7)
# end
plt.plot(path_ref[-1][-1][0], path_ref[-1][-1][1], c='y', marker='s', markersize=7)
for path_xy in path_mpc:
path_x = path_xy[0]
path_y = path_xy[1]
plt.plot(path_x, path_y, c='m', marker='o', linewidth=4, markersize=4, linestyle=linestyle, alpha=0.7)
ax = fig.gca()
ax.set_xticks(np.arange(xmin,xmax,50))
ax.set_yticks(np.arange(ymin,ymax,50))
ax.set_aspect('equal')
ax.imshow(terrain_img, cmap='gray')
plt.xlim(xmin, xmax)
plt.ylim(ymax, ymin)
plt.grid()
plt.title(title_str)
if True:
plt.savefig(ROVER_RESULTS+"/"+fname, dpi=100, bbox_inches='tight')
plt.close()
else:
plt.show()
def vizMarsTrajRS(obs, terrain_img, spline_path, mpc_path, ref_path, traj, fname="TrajRS", xlim=(0,224), ylim=(0,224), title_str="Point Clouds"):
'''
Modified from: https://bitbucket.org/nakanoya/tasknet-icra2021/src/master/Motion_Planning/utils/visualize_utils.py
'''
xmin = xlim[0]
xmax = xlim[1]
ymin = ylim[0]
ymax = ylim[1]
fig = plt.figure()
linestyle = 'solid'
color = 'b'
th1=0
th2=1
plt.plot(spline_path[0], spline_path[1], c='black', marker='o', linewidth=1, markersize=1, linestyle='-', alpha=1.0)
for rsList in traj:
for rs in rsList:
(X,Y)=VisualizeMarsRover.getPlotsLineFine(rs,th1,th2)
fgSafe=StarOp.checkIntersectionPoints(rs,obs)
if fgSafe==True:
plt.plot(X,Y,'r.',alpha=0.05)
else:
plt.plot(X,Y,'b.',alpha=0.05)
for path_xy in mpc_path:
path_x = path_xy[0]
path_y = path_xy[1]
plt.plot(path_x, path_y, c='m', marker='o', linewidth=0.5, markersize=1, linestyle=linestyle, alpha=1.0)
for path in ref_path:
x_ref=[p[0] for p in path]
y_ref=[p[1] for p in path]
plt.scatter(x_ref, y_ref, marker='*', c='m', s=70, alpha=1)
# start
plt.plot(ref_path[0][0][0], ref_path[0][0][1], c='g', marker='o', markersize=7)
# end
plt.plot(ref_path[-1][-1][0], ref_path[-1][-1][1], c='y', marker='s', markersize=7)
# point cloud obstacle
plt.scatter(obs[:,0], obs[:,1], c='red', s=2)
ax = fig.gca()
ax.set_xticks(np.arange(xmin,xmax,50))
ax.set_yticks(np.arange(ymin,ymax,50))
ax.set_aspect('equal')
ax.imshow(terrain_img, cmap='gray')
plt.xlim(xmin, xmax)
plt.ylim(ymax, ymin)
plt.grid()
plt.title(title_str)
if True:
plt.savefig(ROVER_RESULTS+fname, dpi=100, bbox_inches='tight')
plt.close()
else:
plt.show()
def vizMarsTrajRSComp(obs, terrain_img, spline_path, mpc_path, ref_path, trajTop, trajBot, fname="TrajRSComp", xlim=(0,224), ylim=(0,224), title_str="Point Clouds"):
'''
Modified from: https://bitbucket.org/nakanoya/tasknet-icra2021/src/master/Motion_Planning/utils/visualize_utils.py
'''
xmin = xlim[0]
xmax = xlim[1]
ymin = ylim[0]
ymax = ylim[1]
fig = plt.figure()
linestyle = 'solid'
color = 'b'
th1=0
th2=1
plt.plot(spline_path[0], spline_path[1], c='black', marker='o', linewidth=1, markersize=1, linestyle='-', alpha=1.0)
for rsList in trajTop:
for rs in rsList:
(X,Y)=VisualizeMarsRover.getPlotsLineFine(rs,th1,th2)
fgSafe=StarOp.checkIntersectionPoints(rs,obs)
if fgSafe==True:
plt.plot(X,Y,'r.',alpha=0.05)
else:
plt.plot(X,Y,'b.',alpha=0.05)
for rsList in trajBot:
for rs in rsList:
(X,Y)=VisualizeMarsRover.getPlotsLineFine(rs,th1,th2)
fgSafe=StarOp.checkIntersectionPoints(rs,obs)
if fgSafe==True:
plt.plot(X,Y,'m.',alpha=0.1)
else:
plt.plot(X,Y,'g.',alpha=0.1)
# point cloud obstacle
plt.scatter(obs[:,0], obs[:,1], c='red', s=2)
for path_xy in mpc_path:
path_x = path_xy[0]
path_y = path_xy[1]
plt.plot(path_x, path_y, c='m', marker='o', linewidth=0.5, markersize=1, linestyle=linestyle, alpha=1.0)
for path in ref_path:
x_ref=[p[0] for p in path]
y_ref=[p[1] for p in path]
plt.scatter(x_ref, y_ref, marker='*', c='m', s=70, alpha=1)
# start
plt.plot(ref_path[0][0][0], ref_path[0][0][1], c='g', marker='o', markersize=7)
# end
plt.plot(ref_path[-1][-1][0], ref_path[-1][-1][1], c='y', marker='s', markersize=7)
ax = fig.gca()
ax.set_xticks(np.arange(xmin,xmax,50))
ax.set_yticks(np.arange(ymin,ymax,50))
ax.set_aspect('equal')
ax.imshow(terrain_img, cmap='gray')
plt.xlim(xmin, xmax)
plt.ylim(ymax, ymin)
plt.grid()
plt.title(title_str)
if True:
plt.savefig(ROVER_RESULTS+fname, dpi=100, bbox_inches='tight')
plt.close()
else:
plt.show()
# Source: src/header_filter/rules.py (sanjioh/django-header-filter, MIT license)
"""Common useful rule implementations."""
from django.http import HttpResponseBadRequest


class Enforce:
    """Rule that enforces headers as defined by a matcher."""

    def __init__(self, matcher, reject_response=None):
        """
        Initialize the instance.

        `matcher`: a matcher.
        `reject_response`: a Django response.
        """
        self._matcher = matcher
        self._reject_response = reject_response or HttpResponseBadRequest()

    def process(self, request):
        """
        Apply the rule to a request.

        Requests that don't satisfy the criteria of the matcher will be
        rejected.

        `request`: a Django request.
        returns: a Django response or None.
        """
        if self._matcher.match(request):
            return None
        return self._reject_response


class Forbid:
    """Rule that forbids headers as defined by a matcher."""

    def __init__(self, matcher, reject_response=None):
        """
        Initialize the instance.

        `matcher`: a matcher.
        `reject_response`: a Django response.
        """
        self._matcher = matcher
        self._reject_response = reject_response or HttpResponseBadRequest()

    def process(self, request):
        """
        Apply the rule to a request.

        Requests that satisfy the criteria of the matcher will be
        rejected.

        `request`: a Django request.
        returns: a Django response or None.
        """
        if not self._matcher.match(request):
            return None
        return self._reject_response
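The rule pattern above can be exercised without Django by swapping in minimal stand-ins. In this sketch, `HeaderEquals`, `FakeRequest`, and the plain-string reject response are hypothetical substitutes for a real matcher object, a Django `HttpRequest`, and `HttpResponseBadRequest`; the `process` logic mirrors the `Enforce` class above.

```python
# Framework-free sketch of the Enforce rule pattern. HeaderEquals, FakeRequest,
# and the string reject response are hypothetical stand-ins, not the library's API.

class HeaderEquals:
    """Matcher: the request must carry header `name` with exact value `value`."""
    def __init__(self, name, value):
        self._name = name
        self._value = value

    def match(self, request):
        return request.headers.get(self._name) == self._value


class Enforce:
    """Reject any request the matcher does not accept (mirrors the class above)."""
    def __init__(self, matcher, reject_response=None):
        self._matcher = matcher
        self._reject_response = reject_response or "400 Bad Request"

    def process(self, request):
        if self._matcher.match(request):
            return None          # None means "let the request through"
        return self._reject_response


class FakeRequest:
    """Stand-in for a Django request: only exposes a `headers` mapping."""
    def __init__(self, headers):
        self.headers = headers


rule = Enforce(HeaderEquals("X-Api-Key", "s3cret"))
print(rule.process(FakeRequest({"X-Api-Key": "s3cret"})))  # None: allowed
print(rule.process(FakeRequest({})))                       # 400 Bad Request
```

`Forbid` is the same shape with the match test negated: a matching request is the one that gets rejected.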
# Source: release/stubs.min/Grasshopper/GUI/Ribbon.py (htlcnn/ironpython-stubs, MIT license)
# encoding: utf-8
# module Grasshopper.GUI.Ribbon calls itself Ribbon
# from Grasshopper, Version=1.0.0.20, Culture=neutral, PublicKeyToken=dda4f5ec2cd80803
# by generator 1.145
"""NamespaceTracker represents a CLS namespace."""
# no imports

# no functions

# classes

class GH_Layout(object, GH_ISerializable):
    """
    GH_Layout()
    GH_Layout(other: GH_Layout)
    """

    def AddItem(self, category, subcategory, *__args):
        """
        AddItem(self: GH_Layout, category: str, subcategory: str, item: GH_LayoutItem)
        AddItem(self: GH_Layout, category: str, subcategory: str, id: Guid, exposure: GH_Exposure)
        AddItem(self: GH_Layout, category: str, subcategory: str, id: Guid)
        """
        pass

    def AddTab(self, name=None):
        """
        AddTab(self: GH_Layout, name: str) -> GH_LayoutTab
        AddTab(self: GH_Layout) -> GH_LayoutTab
        """
        pass

    def Deserialize(self, filepath):
        """ Deserialize(self: GH_Layout, filepath: str) -> bool """
        pass

    def FindTab(self, name):
        """ FindTab(self: GH_Layout, name: str) -> GH_LayoutTab """
        pass

    def Read(self, reader):
        """ Read(self: GH_Layout, reader: GH_IReader) -> bool """
        pass

    def Serialize(self, filepath=None):
        """
        Serialize(self: GH_Layout, filepath: str) -> bool
        Serialize(self: GH_Layout) -> str
        """
        pass

    def Write(self, writer):
        """ Write(self: GH_Layout, writer: GH_IWriter) -> bool """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    @staticmethod
    def __new__(self, other=None):
        """
        __new__(cls: type)
        __new__(cls: type, other: GH_Layout)
        """
        pass

    def __repr__(self, *args):
        """ __repr__(self: object) -> str """
        pass

    FilePath = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: FilePath(self: GH_Layout) -> str
    Set: FilePath(self: GH_Layout) = value
    """

    ItemCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ItemCount(self: GH_Layout) -> int"""

    Items = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Items(self: GH_Layout) -> IEnumerable[GH_LayoutItem]"""

    Name = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Name(self: GH_Layout) -> str"""

    PanelCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: PanelCount(self: GH_Layout) -> int"""

    TabCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: TabCount(self: GH_Layout) -> int"""

    Tabs = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Tabs(self: GH_Layout) -> List[GH_LayoutTab]"""
class GH_LayoutItem(object, GH_ISerializable):
    """
    GH_LayoutItem()
    GH_LayoutItem(objectId: Guid, exposure: GH_Exposure)
    GH_LayoutItem(other: GH_LayoutItem)
    """

    def Read(self, reader):
        """ Read(self: GH_LayoutItem, reader: GH_IReader) -> bool """
        pass

    def Write(self, writer):
        """ Write(self: GH_LayoutItem, writer: GH_IWriter) -> bool """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    @staticmethod
    def __new__(self, *__args):
        """
        __new__(cls: type)
        __new__(cls: type, objectId: Guid, exposure: GH_Exposure)
        __new__(cls: type, other: GH_LayoutItem)
        """
        pass

    def __repr__(self, *args):
        """ __repr__(self: object) -> str """
        pass

    Exposure = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Exposure(self: GH_LayoutItem) -> GH_Exposure
    Set: Exposure(self: GH_LayoutItem) = value
    """

    Id = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Id(self: GH_LayoutItem) -> Guid"""

    Selected = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Selected(self: GH_LayoutItem) -> bool
    Set: Selected(self: GH_LayoutItem) = value
    """
class GH_LayoutMenuItem(GH_DoubleBufferedPanel,IComponent,IDisposable,IOleControl,IOleObject,IOleInPlaceObject,IOleInPlaceActiveObject,IOleWindow,IViewObject,IViewObject2,IPersist,IPersistStreamInit,IPersistPropertyBag,IPersistStorage,IQuickActivate,ISupportOleDropSource,IDropTarget,ISynchronizeInvoke,IWin32Window,IArrangedElement,IBindableComponent):
""" GH_LayoutMenuItem() """
def AccessibilityNotifyClients(self,*args):
"""
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,objectID: int,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control .
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
objectID: The identifier of the System.Windows.Forms.AccessibleObject.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control.
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
"""
pass
def AdjustFormScrollbars(self,*args):
"""
AdjustFormScrollbars(self: ScrollableControl,displayScrollbars: bool)
Adjusts the scroll bars on the container based on the current control positions and the control
currently selected.
displayScrollbars: true to show the scroll bars; otherwise,false.
"""
pass
def CreateAccessibilityInstance(self,*args):
"""
CreateAccessibilityInstance(self: Control) -> AccessibleObject
Creates a new accessibility object for the control.
Returns: A new System.Windows.Forms.AccessibleObject for the control.
"""
pass
def CreateControlsInstance(self,*args):
"""
CreateControlsInstance(self: Control) -> ControlCollection
Creates a new instance of the control collection for the control.
Returns: A new instance of System.Windows.Forms.Control.ControlCollection assigned to the control.
"""
pass
def CreateHandle(self,*args):
"""
CreateHandle(self: Control)
Creates a handle for the control.
"""
pass
def DefWndProc(self,*args):
"""
DefWndProc(self: Control,m: Message) -> Message
Sends the specified message to the default window procedure.
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def DestroyHandle(self,*args):
"""
DestroyHandle(self: Control)
Destroys the handle associated with the control.
"""
pass
def Dispose(self):
"""
Dispose(self: Control,disposing: bool)
Releases the unmanaged resources used by the System.Windows.Forms.Control and its child controls
and optionally releases the managed resources.
disposing: true to release both managed and unmanaged resources; false to release only unmanaged resources.
"""
pass
def GetAccessibilityObjectById(self,*args):
"""
GetAccessibilityObjectById(self: Control,objectId: int) -> AccessibleObject
Retrieves the specified System.Windows.Forms.AccessibleObject.
objectId: An Int32 that identifies the System.Windows.Forms.AccessibleObject to retrieve.
Returns: An System.Windows.Forms.AccessibleObject.
"""
pass
def GetAutoSizeMode(self,*args):
"""
GetAutoSizeMode(self: Control) -> AutoSizeMode
Retrieves a value indicating how a control will behave when its
System.Windows.Forms.Control.AutoSize property is enabled.
Returns: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def GetScaledBounds(self,*args):
"""
GetScaledBounds(self: Control,bounds: Rectangle,factor: SizeF,specified: BoundsSpecified) -> Rectangle
Retrieves the bounds within which the control is scaled.
bounds: A System.Drawing.Rectangle that specifies the area for which to retrieve the display bounds.
factor: The height and width of the control's bounds.
specified: One of the values of System.Windows.Forms.BoundsSpecified that specifies the bounds of the
control to use when defining its size and position.
Returns: A System.Drawing.Rectangle representing the bounds within which the control is scaled.
"""
pass
def GetScrollState(self,*args):
"""
GetScrollState(self: ScrollableControl,bit: int) -> bool
Determines whether the specified flag has been set.
bit: The flag to check.
Returns: true if the specified flag has been set; otherwise,false.
"""
pass
def GetService(self,*args):
"""
GetService(self: Component,service: Type) -> object
Returns an object that represents a service provided by the System.ComponentModel.Component or
by its System.ComponentModel.Container.
service: A service provided by the System.ComponentModel.Component.
Returns: An System.Object that represents a service provided by the System.ComponentModel.Component,or
null if the System.ComponentModel.Component does not provide the specified service.
"""
pass
def GetStyle(self,*args):
"""
GetStyle(self: Control,flag: ControlStyles) -> bool
Retrieves the value of the specified control style bit for the control.
flag: The System.Windows.Forms.ControlStyles bit to return the value from.
Returns: true if the specified control style bit is set to true; otherwise,false.
"""
pass
def GetTopLevel(self,*args):
"""
GetTopLevel(self: Control) -> bool
Determines if the control is a top-level control.
Returns: true if the control is a top-level control; otherwise,false.
"""
pass
def InitLayout(self,*args):
"""
InitLayout(self: Control)
Called after the control has been added to another container.
"""
pass
def InvokeGotFocus(self,*args):
"""
InvokeGotFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeLostFocus(self,*args):
"""
InvokeLostFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeOnClick(self,*args):
"""
InvokeOnClick(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Click event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokePaint(self,*args):
"""
InvokePaint(self: Control,c: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def InvokePaintBackground(self,*args):
"""
InvokePaintBackground(self: Control,c: Control,e: PaintEventArgs)
Raises the PaintBackground event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def IsInputChar(self,*args):
"""
IsInputChar(self: Control,charCode: Char) -> bool
Determines if a character is an input character that the control recognizes.
charCode: The character to test.
Returns: true if the character should be sent directly to the control and not preprocessed; otherwise,
false.
"""
pass
def IsInputKey(self,*args):
"""
IsInputKey(self: Control,keyData: Keys) -> bool
Determines whether the specified key is a regular input key or a special key that requires
preprocessing.
keyData: One of the System.Windows.Forms.Keys values.
Returns: true if the specified key is a regular input key; otherwise,false.
"""
pass
def MemberwiseClone(self,*args):
"""
MemberwiseClone(self: MarshalByRefObject,cloneIdentity: bool) -> MarshalByRefObject
Creates a shallow copy of the current System.MarshalByRefObject object.
cloneIdentity: false to delete the current System.MarshalByRefObject object's identity,which will cause the
object to be assigned a new identity when it is marshaled across a remoting boundary. A value of
false is usually appropriate. true to copy the current System.MarshalByRefObject object's
identity to its clone,which will cause remoting client calls to be routed to the remote server
object.
Returns: A shallow copy of the current System.MarshalByRefObject object.
MemberwiseClone(self: object) -> object
Creates a shallow copy of the current System.Object.
Returns: A shallow copy of the current System.Object.
"""
pass
def NotifyInvalidate(self,*args):
"""
NotifyInvalidate(self: Control,invalidatedArea: Rectangle)
Raises the System.Windows.Forms.Control.Invalidated event with a specified region of the control
to invalidate.
invalidatedArea: A System.Drawing.Rectangle representing the area to invalidate.
"""
pass
def OnAutoSizeChanged(self,*args):
"""
OnAutoSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.AutoSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackColorChanged(self,*args):
"""
OnBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackgroundImageChanged(self,*args):
"""
OnBackgroundImageChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackgroundImageLayoutChanged(self,*args):
"""
OnBackgroundImageLayoutChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageLayoutChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBindingContextChanged(self,*args):
"""
OnBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnCausesValidationChanged(self,*args):
"""
OnCausesValidationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CausesValidationChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnChangeUICues(self,*args):
"""
OnChangeUICues(self: Control,e: UICuesEventArgs)
Raises the System.Windows.Forms.Control.ChangeUICues event.
e: A System.Windows.Forms.UICuesEventArgs that contains the event data.
"""
pass
def OnClick(self,*args):
"""
OnClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnClientSizeChanged(self,*args):
"""
OnClientSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ClientSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnContextMenuChanged(self,*args):
"""
OnContextMenuChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnContextMenuStripChanged(self,*args):
"""
OnContextMenuStripChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuStripChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnControlAdded(self,*args):
"""
OnControlAdded(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlAdded event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnControlRemoved(self,*args):
"""
OnControlRemoved(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlRemoved event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnCreateControl(self,*args):
"""
OnCreateControl(self: Control)
Raises the System.Windows.Forms.Control.CreateControl method.
"""
pass
def OnCursorChanged(self,*args):
"""
OnCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDockChanged(self,*args):
"""
OnDockChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DockChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDoubleClick(self,*args):
"""
OnDoubleClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DoubleClick event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDpiChangedAfterParent(self,*args):
""" OnDpiChangedAfterParent(self: Control,e: EventArgs) """
pass
def OnDpiChangedBeforeParent(self,*args):
""" OnDpiChangedBeforeParent(self: Control,e: EventArgs) """
pass
def OnDragDrop(self,*args):
"""
OnDragDrop(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragDrop event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragEnter(self,*args):
"""
OnDragEnter(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragEnter event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragLeave(self,*args):
"""
OnDragLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DragLeave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDragOver(self,*args):
"""
OnDragOver(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragOver event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnEnabledChanged(self,*args):
"""
OnEnabledChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.EnabledChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnEnter(self,*args):
"""
OnEnter(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Enter event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnFontChanged(self,*args):
"""
OnFontChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.FontChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnForeColorChanged(self,*args):
"""
OnForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnGiveFeedback(self,*args):
"""
OnGiveFeedback(self: Control,gfbevent: GiveFeedbackEventArgs)
Raises the System.Windows.Forms.Control.GiveFeedback event.
gfbevent: A System.Windows.Forms.GiveFeedbackEventArgs that contains the event data.
"""
pass
def OnGotFocus(self,*args):
"""
OnGotFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnHandleCreated(self,*args):
"""
OnHandleCreated(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.HandleCreated event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnHandleDestroyed(self,*args):
"""
OnHandleDestroyed(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.HandleDestroyed event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnHelpRequested(self,*args):
"""
OnHelpRequested(self: Control,hevent: HelpEventArgs)
Raises the System.Windows.Forms.Control.HelpRequested event.
hevent: A System.Windows.Forms.HelpEventArgs that contains the event data.
"""
pass
def OnImeModeChanged(self,*args):
"""
OnImeModeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ImeModeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnInvalidated(self,*args):
"""
OnInvalidated(self: Control,e: InvalidateEventArgs)
Raises the System.Windows.Forms.Control.Invalidated event.
e: An System.Windows.Forms.InvalidateEventArgs that contains the event data.
"""
pass
def OnKeyDown(self,*args):
"""
OnKeyDown(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyDown event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnKeyPress(self,*args):
"""
OnKeyPress(self: Control,e: KeyPressEventArgs)
Raises the System.Windows.Forms.Control.KeyPress event.
e: A System.Windows.Forms.KeyPressEventArgs that contains the event data.
"""
pass
def OnKeyUp(self,*args):
"""
OnKeyUp(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyUp event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnLayout(self,*args):
"""
OnLayout(self: ScrollableControl,levent: LayoutEventArgs)
levent: A System.Windows.Forms.LayoutEventArgs that contains the event data.
"""
pass
def OnLeave(self,*args):
"""
OnLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Leave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnLocationChanged(self,*args):
"""
OnLocationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LocationChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnLostFocus(self,*args):
"""
OnLostFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMarginChanged(self,*args):
"""
OnMarginChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MarginChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseCaptureChanged(self,*args):
"""
OnMouseCaptureChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseCaptureChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseClick(self,*args):
"""
OnMouseClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseClick event.
e: An System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDoubleClick(self,*args):
"""
OnMouseDoubleClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseDoubleClick event.
e: An System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDown(self,*args):
"""
OnMouseDown(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseDown event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseEnter(self,*args):
"""
OnMouseEnter(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseEnter event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseHover(self,*args):
"""
OnMouseHover(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseHover event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseLeave(self,*args):
"""
OnMouseLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseLeave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseMove(self,*args):
"""
OnMouseMove(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseMove event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseUp(self,*args):
"""
OnMouseUp(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseUp event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseWheel(self,*args):
"""
OnMouseWheel(self: ScrollableControl,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseWheel event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMove(self,*args):
"""
OnMove(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Move event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnNotifyMessage(self,*args):
"""
OnNotifyMessage(self: Control,m: Message)
Notifies the control of Windows messages.
m: A System.Windows.Forms.Message that represents the Windows message.
"""
pass
def OnPaddingChanged(self,*args):
"""
OnPaddingChanged(self: ScrollableControl,e: EventArgs)
Raises the System.Windows.Forms.Control.PaddingChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnPaint(self,*args):
""" OnPaint(self: GH_LayoutMenuItem,e: PaintEventArgs) """
pass
def OnPaintBackground(self,*args):
""" OnPaintBackground(self: GH_LayoutMenuItem,e: PaintEventArgs) """
pass
def OnParentBackColorChanged(self,*args):
"""
OnParentBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event when the
System.Windows.Forms.Control.BackColor property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentBackgroundImageChanged(self,*args):
"""
OnParentBackgroundImageChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event when the
System.Windows.Forms.Control.BackgroundImage property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentBindingContextChanged(self,*args):
"""
OnParentBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event when the
System.Windows.Forms.Control.BindingContext property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentChanged(self,*args):
"""
OnParentChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ParentChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentCursorChanged(self,*args):
"""
OnParentCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentEnabledChanged(self,*args):
"""
OnParentEnabledChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.EnabledChanged event when the
System.Windows.Forms.Control.Enabled property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentFontChanged(self,*args):
"""
OnParentFontChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.FontChanged event when the
System.Windows.Forms.Control.Font property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentForeColorChanged(self,*args):
"""
OnParentForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event when the
System.Windows.Forms.Control.ForeColor property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentRightToLeftChanged(self,*args):
"""
OnParentRightToLeftChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RightToLeftChanged event when the
System.Windows.Forms.Control.RightToLeft property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentVisibleChanged(self,*args):
"""
OnParentVisibleChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.VisibleChanged event when the
System.Windows.Forms.Control.Visible property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnPreviewKeyDown(self,*args):
"""
OnPreviewKeyDown(self: Control,e: PreviewKeyDownEventArgs)
Raises the System.Windows.Forms.Control.PreviewKeyDown event.
e: A System.Windows.Forms.PreviewKeyDownEventArgs that contains the event data.
"""
pass
def OnPrint(self,*args):
"""
OnPrint(self: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def OnQueryContinueDrag(self,*args):
"""
OnQueryContinueDrag(self: Control,qcdevent: QueryContinueDragEventArgs)
Raises the System.Windows.Forms.Control.QueryContinueDrag event.
qcdevent: A System.Windows.Forms.QueryContinueDragEventArgs that contains the event data.
"""
pass
def OnRegionChanged(self,*args):
"""
OnRegionChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RegionChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnResize(self,*args):
"""
OnResize(self: Panel,eventargs: EventArgs)
Fires the event indicating that the panel has been resized. Inheriting controls should use this
in favor of actually listening to the event,but should still call base.onResize to ensure that
the event is fired for external listeners.
eventargs: An System.EventArgs that contains the event data.
"""
pass
def OnRightToLeftChanged(self,*args):
"""
OnRightToLeftChanged(self: ScrollableControl,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnScroll(self,*args):
"""
OnScroll(self: ScrollableControl,se: ScrollEventArgs)
Raises the System.Windows.Forms.ScrollableControl.Scroll event.
se: A System.Windows.Forms.ScrollEventArgs that contains the event data.
"""
pass
def OnSizeChanged(self,*args):
"""
OnSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnStyleChanged(self,*args):
"""
OnStyleChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.StyleChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnSystemColorsChanged(self,*args):
"""
OnSystemColorsChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SystemColorsChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTabIndexChanged(self,*args):
"""
OnTabIndexChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabIndexChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTabStopChanged(self,*args):
"""
OnTabStopChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabStopChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTextChanged(self,*args):
"""
OnTextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TextChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnValidated(self,*args):
"""
OnValidated(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Validated event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnValidating(self,*args):
"""
OnValidating(self: Control,e: CancelEventArgs)
Raises the System.Windows.Forms.Control.Validating event.
e: A System.ComponentModel.CancelEventArgs that contains the event data.
"""
pass
def OnVisibleChanged(self,*args):
"""
OnVisibleChanged(self: ScrollableControl,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def ProcessCmdKey(self,*args):
"""
ProcessCmdKey(self: Control,msg: Message,keyData: Keys) -> (bool,Message)
Processes a command key.
msg: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the character was processed by the control; otherwise,false.
"""
pass
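The `(bool,Message)` return shape above reflects how IronPython surfaces .NET `ref`/`out` parameters: the by-reference argument comes back as an extra element of the result tuple. A minimal pure-Python sketch of that calling convention (the names below are illustrative stand-ins, not the real WinForms types):

```python
# Sketch of IronPython's ref/out convention: a C# method like
#   bool ProcessCmdKey(ref Message msg, Keys keyData)
# is called from Python as  handled, msg = ctrl.ProcessCmdKey(msg, keyData)
# The stand-in below mimics that tuple shape with plain Python values.
def process_cmd_key(msg, key_data):
    handled = (key_data == "Enter")  # pretend Enter is a handled command key
    return handled, msg              # the "ref" parameter rides back in the tuple

handled, msg_out = process_cmd_key({"hwnd": 0x10}, "Enter")
```

The same pattern applies to `ProcessKeyEventArgs`, `ProcessKeyMessage`, and `ProcessKeyPreview` above, which also advertise `-> (bool,Message)`.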
def ProcessDialogChar(self,*args):
"""
ProcessDialogChar(self: Control,charCode: Char) -> bool
Processes a dialog character.
charCode: The character to process.
Returns: true if the character was processed by the control; otherwise,false.
"""
pass
def ProcessDialogKey(self,*args):
"""
ProcessDialogKey(self: Control,keyData: Keys) -> bool
Processes a dialog key.
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the key was processed by the control; otherwise,false.
"""
pass
def ProcessKeyEventArgs(self,*args):
"""
ProcessKeyEventArgs(self: Control,m: Message) -> (bool,Message)
Processes a key message and generates the appropriate control events.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyMessage(self,*args):
"""
ProcessKeyMessage(self: Control,m: Message) -> (bool,Message)
Processes a keyboard message.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyPreview(self,*args):
"""
ProcessKeyPreview(self: Control,m: Message) -> (bool,Message)
Previews a keyboard message.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessMnemonic(self,*args):
"""
ProcessMnemonic(self: Control,charCode: Char) -> bool
Processes a mnemonic character.
charCode: The character to process.
Returns: true if the character was processed as a mnemonic by the control; otherwise,false.
"""
pass
def RaiseDragEvent(self,*args):
"""
RaiseDragEvent(self: Control,key: object,e: DragEventArgs)
Raises the appropriate drag event.
key: The event to raise.
e: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def RaiseKeyEvent(self,*args):
"""
RaiseKeyEvent(self: Control,key: object,e: KeyEventArgs)
Raises the appropriate key event.
key: The event to raise.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def RaiseMouseEvent(self,*args):
"""
RaiseMouseEvent(self: Control,key: object,e: MouseEventArgs)
Raises the appropriate mouse event.
key: The event to raise.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def RaisePaintEvent(self,*args):
"""
RaisePaintEvent(self: Control,key: object,e: PaintEventArgs)
Raises the appropriate paint event.
key: The event to raise.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def RecreateHandle(self,*args):
"""
RecreateHandle(self: Control)
Forces the re-creation of the handle for the control.
"""
pass
def RescaleConstantsForDpi(self,*args):
""" RescaleConstantsForDpi(self: Control,deviceDpiOld: int,deviceDpiNew: int) """
pass
def ResetMouseEventArgs(self,*args):
"""
ResetMouseEventArgs(self: Control)
Resets the control to handle the System.Windows.Forms.Control.MouseLeave event.
"""
pass
def RtlTranslateAlignment(self,*args):
"""
RtlTranslateAlignment(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
RtlTranslateAlignment(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
RtlTranslateAlignment(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
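The `RtlTranslate*` helpers all perform the same mirroring: when the control renders right-to-left, left- and right-anchored values swap while centered values pass through unchanged. A hedged pure-Python sketch of that rule, with plain strings standing in for the `HorizontalAlignment` enum:

```python
# Mirror an alignment for right-to-left rendering: Left and Right swap,
# anything else (e.g. Center) passes through -- the rule that
# RtlTranslateHorizontal applies when the control is in RTL mode.
def rtl_translate_horizontal(align, right_to_left=True):
    if not right_to_left:
        return align          # LTR controls keep the alignment as given
    return {"Left": "Right", "Right": "Left"}.get(align, align)

rtl_translate_horizontal("Left")  # swapped under RTL
```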
def RtlTranslateContent(self,*args):
"""
RtlTranslateContent(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
"""
pass
def RtlTranslateHorizontal(self,*args):
"""
RtlTranslateHorizontal(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
def RtlTranslateLeftRight(self,*args):
"""
RtlTranslateLeftRight(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
"""
pass
def ScaleControl(self,*args):
"""
ScaleControl(self: ScrollableControl,factor: SizeF,specified: BoundsSpecified)
factor: The factor by which the height and width of the control will be scaled.
specified: A System.Windows.Forms.BoundsSpecified value that specifies the bounds of the control to use
when defining its size and position.
"""
pass
def ScaleCore(self,*args):
"""
ScaleCore(self: ScrollableControl,dx: Single,dy: Single)
dx: The horizontal scaling factor.
dy: The vertical scaling factor.
"""
pass
def ScrollToControl(self,*args):
"""
ScrollToControl(self: ScrollableControl,activeControl: Control) -> Point
Calculates the scroll offset to the specified child control.
activeControl: The child control to scroll into view.
Returns: The upper-left hand System.Drawing.Point of the display area relative to the client area
required to scroll the control into view.
"""
pass
def Select(self):
"""
Select(self: Control,directed: bool,forward: bool)
Activates a child control. Optionally specifies the direction in the tab order to select the
control from.
directed: true to specify the direction of the control to select; otherwise,false.
forward: true to move forward in the tab order; false to move backward in the tab order.
"""
pass
def SetAutoSizeMode(self,*args):
"""
SetAutoSizeMode(self: Control,mode: AutoSizeMode)
Sets a value indicating how a control will behave when its System.Windows.Forms.Control.AutoSize
property is enabled.
mode: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def SetBoundsCore(self,*args):
"""
SetBoundsCore(self: Control,x: int,y: int,width: int,height: int,specified: BoundsSpecified)
Performs the work of setting the specified bounds of this control.
x: The new System.Windows.Forms.Control.Left property value of the control.
y: The new System.Windows.Forms.Control.Top property value of the control.
width: The new System.Windows.Forms.Control.Width property value of the control.
height: The new System.Windows.Forms.Control.Height property value of the control.
specified: A bitwise combination of the System.Windows.Forms.BoundsSpecified values.
"""
pass
def SetClientSizeCore(self,*args):
"""
SetClientSizeCore(self: Control,x: int,y: int)
Sets the size of the client area of the control.
x: The client area width,in pixels.
y: The client area height,in pixels.
"""
pass
def SetDisplayRectLocation(self,*args):
"""
SetDisplayRectLocation(self: ScrollableControl,x: int,y: int)
Positions the display window to the specified value.
x: The horizontal offset at which to position the System.Windows.Forms.ScrollableControl.
y: The vertical offset at which to position the System.Windows.Forms.ScrollableControl.
"""
pass
def SetScrollState(self,*args):
"""
SetScrollState(self: ScrollableControl,bit: int,value: bool)
Sets the specified scroll state flag.
bit: The scroll state flag to set.
value: The value to set the flag.
"""
pass
def SetStyle(self,*args):
"""
SetStyle(self: Control,flag: ControlStyles,value: bool)
Sets a specified System.Windows.Forms.ControlStyles flag to either true or false.
flag: The System.Windows.Forms.ControlStyles bit to set.
value: true to apply the specified style to the control; otherwise,false.
"""
pass
def SetTopLevel(self,*args):
"""
SetTopLevel(self: Control,value: bool)
Sets the control as the top-level control.
value: true to set the control as the top-level control; otherwise,false.
"""
pass
def SetVisibleCore(self,*args):
"""
SetVisibleCore(self: Control,value: bool)
Sets the control to the specified visible state.
value: true to make the control visible; otherwise,false.
"""
pass
def SizeFromClientSize(self,*args):
"""
SizeFromClientSize(self: Control,clientSize: Size) -> Size
Determines the size of the entire control from the height and width of its client area.
clientSize: A System.Drawing.Size value representing the height and width of the control's client area.
Returns: A System.Drawing.Size value representing the height and width of the entire control.
"""
pass
def UpdateBounds(self,*args):
"""
UpdateBounds(self: Control,x: int,y: int,width: int,height: int,clientWidth: int,clientHeight: int)
Updates the bounds of the control with the specified size,location,and client size.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
clientWidth: The client System.Drawing.Size.Width of the control.
clientHeight: The client System.Drawing.Size.Height of the control.
UpdateBounds(self: Control,x: int,y: int,width: int,height: int)
Updates the bounds of the control with the specified size and location.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
UpdateBounds(self: Control)
Updates the bounds of the control with the current size and location.
"""
pass
def UpdateStyles(self,*args):
"""
UpdateStyles(self: Control)
Forces the assigned styles to be reapplied to the control.
"""
pass
def UpdateZOrder(self,*args):
"""
UpdateZOrder(self: Control)
Updates the control in its parent's z-order.
"""
pass
def WndProc(self,*args):
"""
WndProc(self: ScrollableControl,m: Message) -> Message
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def __enter__(self,*args):
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self,*args):
"""
__exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
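`__enter__`/`__exit__` are the bridge IronPython adds for any `IDisposable`: entering a `with` block hands back the object, and leaving the block calls `Dispose`, even on an exception. A pure-Python stand-in showing the same protocol (the class below is illustrative, not part of this API):

```python
# Stand-in for the IDisposable -> context-manager mapping IronPython provides:
# __enter__ yields the object, __exit__ disposes it when the block ends.
class DisposableSketch:
    def __init__(self):
        self.disposed = False
    def Dispose(self):
        self.disposed = True
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_value, exc_back):
        self.Dispose()          # runs even if the with-block raised

with DisposableSketch() as ctrl:
    pass                        # use the control here
```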
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __str__(self,*args):
pass
CanEnableIme=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the System.Windows.Forms.Control.ImeMode property can be set to an active value,to enable IME support.
"""
CanRaiseEvents=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Determines if events can be raised on the control.
"""
CloseRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: CloseRegion(self: GH_LayoutMenuItem) -> Rectangle
"""
CreateParams=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the required creation parameters when the control handle is created.
"""
DefaultCursor=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the default cursor for the control.
"""
DefaultImeMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the default Input Method Editor (IME) mode supported by the control.
"""
DefaultMargin=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the space,in pixels,that is specified by default between controls.
"""
DefaultMaximumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default maximum size of a control.
"""
DefaultMinimumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default minimum size of a control.
"""
DefaultPadding=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the internal spacing,in pixels,of the contents of a control.
"""
DefaultSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
DesignMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that indicates whether the System.ComponentModel.Component is currently in design mode.
"""
DoubleBuffered=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether this control should redraw its surface using a secondary buffer to reduce or prevent flicker.
"""
EditRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: EditRegion(self: GH_LayoutMenuItem) -> Rectangle
"""
Events=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the list of event handlers that are attached to this System.ComponentModel.Component.
"""
FontHeight=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the height of the font of the control.
"""
HScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the horizontal scroll bar is visible.
"""
ImeModeBase=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the IME mode of a control.
"""
LabelRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: LabelRegion(self: GH_LayoutMenuItem) -> Rectangle
"""
LayoutFile=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: LayoutFile(self: GH_LayoutMenuItem) -> str
Set: LayoutFile(self: GH_LayoutMenuItem)=value
"""
RenderRightToLeft=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""This property is now obsolete.
"""
ResizeRedraw=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the control redraws itself when resized.
"""
Ribbon=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Ribbon(self: GH_LayoutMenuItem) -> GH_Ribbon
Set: Ribbon(self: GH_LayoutMenuItem)=value
"""
ScaleChildren=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that determines the scaling of child controls.
"""
ShowFocusCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the control should display focus rectangles.
"""
ShowKeyboardCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the user interface is in the appropriate state to show or hide keyboard accelerators.
"""
VisibleRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: VisibleRegion(self: GH_LayoutMenuItem) -> Rectangle
"""
VScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the vertical scroll bar is visible.
"""
class GH_LayoutPanel(object,GH_ISerializable,IComparable[GH_LayoutPanel]):
"""
GH_LayoutPanel()
GH_LayoutPanel(name: str)
GH_LayoutPanel(other: GH_LayoutPanel)
"""
def CompareTo(self,other):
""" CompareTo(self: GH_LayoutPanel,other: GH_LayoutPanel) -> int """
pass
def Read(self,reader):
""" Read(self: GH_LayoutPanel,reader: GH_IReader) -> bool """
pass
def Sort(self):
""" Sort(self: GH_LayoutPanel) """
pass
def Write(self,writer):
""" Write(self: GH_LayoutPanel,writer: GH_IWriter) -> bool """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self,*__args):
"""
__new__(cls: type)
__new__(cls: type,name: str)
__new__(cls: type,other: GH_LayoutPanel)
"""
pass
def __repr__(self,*args):
""" __repr__(self: object) -> str """
pass
Items=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Items(self: GH_LayoutPanel) -> List[GH_LayoutItem]
"""
Name=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Name(self: GH_LayoutPanel) -> str
Set: Name(self: GH_LayoutPanel)=value
"""
class GH_LayoutTab(object,GH_ISerializable,IComparable[GH_LayoutTab]):
"""
GH_LayoutTab()
GH_LayoutTab(name: str)
GH_LayoutTab(other: GH_LayoutTab)
"""
def AddPanel(self,name=None):
"""
AddPanel(self: GH_LayoutTab,name: str) -> GH_LayoutPanel
AddPanel(self: GH_LayoutTab) -> GH_LayoutPanel
"""
pass
def CompareTo(self,other):
""" CompareTo(self: GH_LayoutTab,other: GH_LayoutTab) -> int """
pass
def Deselect(self):
""" Deselect(self: GH_LayoutTab) """
pass
def Read(self,reader):
""" Read(self: GH_LayoutTab,reader: GH_IReader) -> bool """
pass
def SortPanels(self,order):
""" SortPanels(self: GH_LayoutTab,*order: Array[str]) """
pass
def Write(self,writer):
""" Write(self: GH_LayoutTab,writer: GH_IWriter) -> bool """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self,*__args):
"""
__new__(cls: type)
__new__(cls: type,name: str)
__new__(cls: type,other: GH_LayoutTab)
"""
pass
def __repr__(self,*args):
""" __repr__(self: object) -> str """
pass
Name=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Name(self: GH_LayoutTab) -> str
Set: Name(self: GH_LayoutTab)=value
"""
Panels=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Panels(self: GH_LayoutTab) -> List[GH_LayoutPanel]
"""
Selected=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Selected(self: GH_LayoutTab) -> bool
Set: Selected(self: GH_LayoutTab)=value
"""
class GH_Ribbon(UserControl,IComponent,IDisposable,IOleControl,IOleObject,IOleInPlaceObject,IOleInPlaceActiveObject,IOleWindow,IViewObject,IViewObject2,IPersist,IPersistStreamInit,IPersistPropertyBag,IPersistStorage,IQuickActivate,ISupportOleDropSource,IDropTarget,ISynchronizeInvoke,IWin32Window,IArrangedElement,IBindableComponent,IContainerControl,IGH_FixedSizeControl):
""" GH_Ribbon() """
def AccessibilityNotifyClients(self,*args):
"""
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,objectID: int,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control.
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
objectID: The identifier of the System.Windows.Forms.AccessibleObject.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control.
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
"""
pass
def AddLayout(self,file,visible=None):
""" AddLayout(self: GH_Ribbon,file: str,visible: bool)AddLayout(self: GH_Ribbon,file: str) """
pass
def AdjustFormScrollbars(self,*args):
"""
AdjustFormScrollbars(self: ContainerControl,displayScrollbars: bool)
displayScrollbars: true to show the scroll bars; otherwise,false.
"""
pass
def AmIFocused(self,item):
""" AmIFocused(self: GH_Ribbon,item: GH_RibbonItem) -> bool """
pass
def CreateAccessibilityInstance(self,*args):
"""
CreateAccessibilityInstance(self: Control) -> AccessibleObject
Creates a new accessibility object for the control.
Returns: A new System.Windows.Forms.AccessibleObject for the control.
"""
pass
def CreateControlsInstance(self,*args):
"""
CreateControlsInstance(self: Control) -> ControlCollection
Creates a new instance of the control collection for the control.
Returns: A new instance of System.Windows.Forms.Control.ControlCollection assigned to the control.
"""
pass
def CreateHandle(self,*args):
"""
CreateHandle(self: Control)
Creates a handle for the control.
"""
pass
def CreateRibbonLayoutMenu(self):
""" CreateRibbonLayoutMenu(self: GH_Ribbon) -> ToolStripDropDown """
pass
def DefWndProc(self,*args):
"""
DefWndProc(self: Control,m: Message) -> Message
Sends the specified message to the default window procedure.
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def DestroyHandle(self,*args):
"""
DestroyHandle(self: Control)
Destroys the handle associated with the control.
"""
pass
def DetermineFocusItem(self,*args):
""" DetermineFocusItem(self: GH_Ribbon,e: MouseEventArgs) -> GH_RibbonControlFocusResult """
pass
def Dispose(self):
""" Dispose(self: GH_Ribbon,disposing: bool) """
pass
def FindAndDisplayProxy(self,id,tabBox,iconbox,dropdown):
""" FindAndDisplayProxy(self: GH_Ribbon,id: Guid) -> (bool,Rectangle,Rectangle,GH_RibbonDropdown) """
pass
def GetAccessibilityObjectById(self,*args):
"""
GetAccessibilityObjectById(self: Control,objectId: int) -> AccessibleObject
Retrieves the specified System.Windows.Forms.AccessibleObject.
objectId: An Int32 that identifies the System.Windows.Forms.AccessibleObject to retrieve.
Returns: An System.Windows.Forms.AccessibleObject.
"""
pass
def GetAutoSizeMode(self,*args):
"""
GetAutoSizeMode(self: Control) -> AutoSizeMode
Retrieves a value indicating how a control will behave when its
System.Windows.Forms.Control.AutoSize property is enabled.
Returns: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def GetScaledBounds(self,*args):
"""
GetScaledBounds(self: Control,bounds: Rectangle,factor: SizeF,specified: BoundsSpecified) -> Rectangle
Retrieves the bounds within which the control is scaled.
bounds: A System.Drawing.Rectangle that specifies the area for which to retrieve the display bounds.
factor: The height and width of the control's bounds.
specified: One of the values of System.Windows.Forms.BoundsSpecified that specifies the bounds of the
control to use when defining its size and position.
Returns: A System.Drawing.Rectangle representing the bounds within which the control is scaled.
"""
pass
def GetScrollState(self,*args):
"""
GetScrollState(self: ScrollableControl,bit: int) -> bool
Determines whether the specified flag has been set.
bit: The flag to check.
Returns: true if the specified flag has been set; otherwise,false.
"""
pass
def GetService(self,*args):
"""
GetService(self: Component,service: Type) -> object
Returns an object that represents a service provided by the System.ComponentModel.Component or
by its System.ComponentModel.Container.
service: A service provided by the System.ComponentModel.Component.
Returns: An System.Object that represents a service provided by the System.ComponentModel.Component,or
null if the System.ComponentModel.Component does not provide the specified service.
"""
pass
def GetStyle(self,*args):
"""
GetStyle(self: Control,flag: ControlStyles) -> bool
Retrieves the value of the specified control style bit for the control.
flag: The System.Windows.Forms.ControlStyles bit to return the value from.
Returns: true if the specified control style bit is set to true; otherwise,false.
"""
pass
def GetTopLevel(self,*args):
"""
GetTopLevel(self: Control) -> bool
Determines if the control is a top-level control.
Returns: true if the control is a top-level control; otherwise,false.
"""
pass
def HideLayout(self,path):
""" HideLayout(self: GH_Ribbon,path: str) """
pass
def InitLayout(self,*args):
"""
InitLayout(self: Control)
Called after the control has been added to another container.
"""
pass
def InvokeGotFocus(self,*args):
"""
InvokeGotFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeLostFocus(self,*args):
"""
InvokeLostFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeOnClick(self,*args):
"""
InvokeOnClick(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Click event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokePaint(self,*args):
"""
InvokePaint(self: Control,c: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def InvokePaintBackground(self,*args):
"""
InvokePaintBackground(self: Control,c: Control,e: PaintEventArgs)
Raises the PaintBackground event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def IsInputChar(self,*args):
"""
IsInputChar(self: Control,charCode: Char) -> bool
Determines if a character is an input character that the control recognizes.
charCode: The character to test.
Returns: true if the character should be sent directly to the control and not preprocessed; otherwise,
false.
"""
pass
def IsInputKey(self,*args):
"""
IsInputKey(self: Control,keyData: Keys) -> bool
Determines whether the specified key is a regular input key or a special key that requires
preprocessing.
keyData: One of the System.Windows.Forms.Keys values.
Returns: true if the specified key is a regular input key; otherwise,false.
"""
pass
def IsLayoutLoaded(self,file):
""" IsLayoutLoaded(self: GH_Ribbon,file: str) -> bool """
pass
def IsLayoutVisible(self,file):
""" IsLayoutVisible(self: GH_Ribbon,file: str) -> bool """
pass
def LayoutRibbon(self):
""" LayoutRibbon(self: GH_Ribbon) """
pass
def MemberwiseClone(self,*args):
"""
MemberwiseClone(self: MarshalByRefObject,cloneIdentity: bool) -> MarshalByRefObject
Creates a shallow copy of the current System.MarshalByRefObject object.
cloneIdentity: false to delete the current System.MarshalByRefObject object's identity,which will cause the
object to be assigned a new identity when it is marshaled across a remoting boundary. A value of
false is usually appropriate. true to copy the current System.MarshalByRefObject object's
identity to its clone,which will cause remoting client calls to be routed to the remote server
object.
Returns: A shallow copy of the current System.MarshalByRefObject object.
MemberwiseClone(self: object) -> object
Creates a shallow copy of the current System.Object.
Returns: A shallow copy of the current System.Object.
"""
pass
def NearestHeight(self,h):
""" NearestHeight(self: GH_Ribbon,h: int) -> int """
pass
def NearestWidth(self,w):
""" NearestWidth(self: GH_Ribbon,w: int) -> int """
pass
def NotifyInvalidate(self,*args):
"""
NotifyInvalidate(self: Control,invalidatedArea: Rectangle)
Raises the System.Windows.Forms.Control.Invalidated event with a specified region of the control
to invalidate.
invalidatedArea: A System.Drawing.Rectangle representing the area to invalidate.
"""
pass
def OnAutoSizeChanged(self,*args):
"""
OnAutoSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.AutoSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnAutoValidateChanged(self,*args):
"""
OnAutoValidateChanged(self: ContainerControl,e: EventArgs)
Raises the System.Windows.Forms.ContainerControl.AutoValidateChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackColorChanged(self,*args):
"""
OnBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackgroundImageChanged(self,*args):
"""
OnBackgroundImageChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackgroundImageLayoutChanged(self,*args):
"""
OnBackgroundImageLayoutChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageLayoutChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBindingContextChanged(self,*args):
"""
OnBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnCausesValidationChanged(self,*args):
"""
OnCausesValidationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CausesValidationChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnChangeUICues(self,*args):
"""
OnChangeUICues(self: Control,e: UICuesEventArgs)
Raises the System.Windows.Forms.Control.ChangeUICues event.
e: A System.Windows.Forms.UICuesEventArgs that contains the event data.
"""
pass
def OnClick(self,*args):
"""
OnClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnClientSizeChanged(self,*args):
"""
OnClientSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ClientSizeChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnContextMenuChanged(self,*args):
"""
OnContextMenuChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnContextMenuStripChanged(self,*args):
"""
OnContextMenuStripChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuStripChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnControlAdded(self,*args):
"""
OnControlAdded(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlAdded event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnControlRemoved(self,*args):
"""
OnControlRemoved(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlRemoved event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnCreateControl(self,*args):
"""
OnCreateControl(self: UserControl)
Raises the CreateControl event.
"""
pass
def OnCursorChanged(self,*args):
"""
OnCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnDockChanged(self,*args):
"""
OnDockChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DockChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnDoubleClick(self,*args):
"""
OnDoubleClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DoubleClick event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnDpiChangedAfterParent(self,*args):
""" OnDpiChangedAfterParent(self: Control,e: EventArgs) """
pass
def OnDpiChangedBeforeParent(self,*args):
""" OnDpiChangedBeforeParent(self: Control,e: EventArgs) """
pass
def OnDragDrop(self,*args):
"""
OnDragDrop(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragDrop event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragEnter(self,*args):
"""
OnDragEnter(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragEnter event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragLeave(self,*args):
"""
OnDragLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DragLeave event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnDragOver(self,*args):
"""
OnDragOver(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragOver event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnEnabledChanged(self,*args):
"""
OnEnabledChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.EnabledChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnEnter(self,*args):
"""
OnEnter(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Enter event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnFontChanged(self,*args):
"""
OnFontChanged(self: ContainerControl,e: EventArgs)
Raises the System.Windows.Forms.Control.FontChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnForeColorChanged(self,*args):
"""
OnForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnGiveFeedback(self,*args):
"""
OnGiveFeedback(self: Control,gfbevent: GiveFeedbackEventArgs)
Raises the System.Windows.Forms.Control.GiveFeedback event.
gfbevent: A System.Windows.Forms.GiveFeedbackEventArgs that contains the event data.
"""
pass
def OnGotFocus(self,*args):
"""
OnGotFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnHandleCreated(self,*args):
"""
OnHandleCreated(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.HandleCreated event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnHandleDestroyed(self,*args):
"""
OnHandleDestroyed(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.HandleDestroyed event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnHelpRequested(self,*args):
"""
OnHelpRequested(self: Control,hevent: HelpEventArgs)
Raises the System.Windows.Forms.Control.HelpRequested event.
hevent: A System.Windows.Forms.HelpEventArgs that contains the event data.
"""
pass
def OnImeModeChanged(self,*args):
"""
OnImeModeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ImeModeChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnInvalidated(self,*args):
"""
OnInvalidated(self: Control,e: InvalidateEventArgs)
Raises the System.Windows.Forms.Control.Invalidated event.
e: A System.Windows.Forms.InvalidateEventArgs that contains the event data.
"""
pass
def OnKeyDown(self,*args):
"""
OnKeyDown(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyDown event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnKeyPress(self,*args):
"""
OnKeyPress(self: Control,e: KeyPressEventArgs)
Raises the System.Windows.Forms.Control.KeyPress event.
e: A System.Windows.Forms.KeyPressEventArgs that contains the event data.
"""
pass
def OnKeyUp(self,*args):
"""
OnKeyUp(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyUp event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnLayout(self,*args):
"""
OnLayout(self: ContainerControl,e: LayoutEventArgs)
Raises the System.Windows.Forms.Control.Layout event.
e: A System.Windows.Forms.LayoutEventArgs that contains the event data.
"""
pass
def OnLeave(self,*args):
"""
OnLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Leave event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnLoad(self,*args):
"""
OnLoad(self: UserControl,e: EventArgs)
Raises the System.Windows.Forms.UserControl.Load event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnLocationChanged(self,*args):
"""
OnLocationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LocationChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnLostFocus(self,*args):
"""
OnLostFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMarginChanged(self,*args):
"""
OnMarginChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MarginChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseCaptureChanged(self,*args):
"""
OnMouseCaptureChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseCaptureChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseClick(self,*args):
"""
OnMouseClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseClick event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDoubleClick(self,*args):
"""
OnMouseDoubleClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseDoubleClick event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDown(self,*args):
"""
OnMouseDown(self: UserControl,e: MouseEventArgs)
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseEnter(self,*args):
"""
OnMouseEnter(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseEnter event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseHover(self,*args):
"""
OnMouseHover(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseHover event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseLeave(self,*args):
"""
OnMouseLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseLeave event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMouseMove(self,*args):
"""
OnMouseMove(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseMove event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseUp(self,*args):
"""
OnMouseUp(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseUp event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseWheel(self,*args):
"""
OnMouseWheel(self: ScrollableControl,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseWheel event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMove(self,*args):
"""
OnMove(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Move event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnNotifyMessage(self,*args):
"""
OnNotifyMessage(self: Control,m: Message)
Notifies the control of Windows messages.
m: A System.Windows.Forms.Message that represents the Windows message.
"""
pass
def OnPaddingChanged(self,*args):
"""
OnPaddingChanged(self: ScrollableControl,e: EventArgs)
Raises the System.Windows.Forms.Control.PaddingChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnPaint(self,*args):
""" OnPaint(self: GH_Ribbon,e: PaintEventArgs) """
pass
def OnPaintBackground(self,*args):
""" OnPaintBackground(self: GH_Ribbon,e: PaintEventArgs) """
pass
def OnParentBackColorChanged(self,*args):
"""
OnParentBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event when the
System.Windows.Forms.Control.BackColor property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentBackgroundImageChanged(self,*args):
"""
OnParentBackgroundImageChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event when the
System.Windows.Forms.Control.BackgroundImage property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentBindingContextChanged(self,*args):
"""
OnParentBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event when the
System.Windows.Forms.Control.BindingContext property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentChanged(self,*args):
"""
OnParentChanged(self: ContainerControl,e: EventArgs)
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentCursorChanged(self,*args):
"""
OnParentCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentEnabledChanged(self,*args):
"""
OnParentEnabledChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.EnabledChanged event when the
System.Windows.Forms.Control.Enabled property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentFontChanged(self,*args):
"""
OnParentFontChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.FontChanged event when the
System.Windows.Forms.Control.Font property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentForeColorChanged(self,*args):
"""
OnParentForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event when the
System.Windows.Forms.Control.ForeColor property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentRightToLeftChanged(self,*args):
"""
OnParentRightToLeftChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RightToLeftChanged event when the
System.Windows.Forms.Control.RightToLeft property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnParentVisibleChanged(self,*args):
"""
OnParentVisibleChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.VisibleChanged event when the
System.Windows.Forms.Control.Visible property value of the control's container changes.
e: A System.EventArgs that contains the event data.
"""
pass
def OnPreviewKeyDown(self,*args):
"""
OnPreviewKeyDown(self: Control,e: PreviewKeyDownEventArgs)
Raises the System.Windows.Forms.Control.PreviewKeyDown event.
e: A System.Windows.Forms.PreviewKeyDownEventArgs that contains the event data.
"""
pass
def OnPrint(self,*args):
"""
OnPrint(self: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def OnQueryContinueDrag(self,*args):
"""
OnQueryContinueDrag(self: Control,qcdevent: QueryContinueDragEventArgs)
Raises the System.Windows.Forms.Control.QueryContinueDrag event.
qcdevent: A System.Windows.Forms.QueryContinueDragEventArgs that contains the event data.
"""
pass
def OnRegionChanged(self,*args):
"""
OnRegionChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RegionChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnResize(self,*args):
"""
OnResize(self: UserControl,e: EventArgs)
e: A System.EventArgs that contains the event data.
"""
pass
def OnRightToLeftChanged(self,*args):
"""
OnRightToLeftChanged(self: ScrollableControl,e: EventArgs)
e: A System.EventArgs that contains the event data.
"""
pass
def OnScroll(self,*args):
"""
OnScroll(self: ScrollableControl,se: ScrollEventArgs)
Raises the System.Windows.Forms.ScrollableControl.Scroll event.
se: A System.Windows.Forms.ScrollEventArgs that contains the event data.
"""
pass
def OnSizeChanged(self,*args):
"""
OnSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SizeChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnStyleChanged(self,*args):
"""
OnStyleChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.StyleChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnSystemColorsChanged(self,*args):
"""
OnSystemColorsChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SystemColorsChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnTabIndexChanged(self,*args):
"""
OnTabIndexChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabIndexChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnTabStopChanged(self,*args):
"""
OnTabStopChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabStopChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnTextChanged(self,*args):
"""
OnTextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TextChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnValidated(self,*args):
"""
OnValidated(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Validated event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnValidating(self,*args):
"""
OnValidating(self: Control,e: CancelEventArgs)
Raises the System.Windows.Forms.Control.Validating event.
e: A System.ComponentModel.CancelEventArgs that contains the event data.
"""
pass
def OnVisibleChanged(self,*args):
"""
OnVisibleChanged(self: ScrollableControl,e: EventArgs)
e: A System.EventArgs that contains the event data.
"""
pass
def PopulateRibbon(self):
""" PopulateRibbon(self: GH_Ribbon) """
pass
def ProcessCmdKey(self,*args):
"""
ProcessCmdKey(self: ContainerControl,msg: Message,keyData: Keys) -> (bool,Message)
msg: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the character was processed by the control; otherwise,false.
"""
pass
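The signatures above such as ProcessCmdKey's `-> (bool,Message)` reflect how IronPython surfaces .NET by-reference parameters: instead of mutating the argument in place, the method returns the original result plus the (possibly updated) by-ref value as a tuple. A minimal plain-Python sketch of that calling convention follows; `process_cmd_key` and its string stand-ins are hypothetical illustrations, not part of this API:

```python
# Sketch (assumption: plain-Python stand-in, not IronPython/.NET itself).
# A C# method "bool ProcessCmdKey(ref Message msg, Keys keyData)" is seen
# from IronPython as returning (bool, Message): the bool result first, then
# the by-reference Message argument as an extra return value.
def process_cmd_key(msg, key_data):
    handled = key_data == "Keys.Escape"  # pretend only Escape is handled
    return handled, msg                  # by-ref arg comes back in the tuple

handled, msg = process_cmd_key("WM_KEYDOWN", "Keys.Escape")
```

Callers therefore unpack the tuple rather than expecting the passed-in `Message` to be modified.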
def ProcessDialogChar(self,*args):
"""
ProcessDialogChar(self: ContainerControl,charCode: Char) -> bool
charCode: The character to process.
Returns: true if the character was processed by the control; otherwise,false.
"""
pass
def ProcessDialogKey(self,*args):
"""
ProcessDialogKey(self: ContainerControl,keyData: Keys) -> bool
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the key was processed by the control; otherwise,false.
"""
pass
def ProcessKeyEventArgs(self,*args):
"""
ProcessKeyEventArgs(self: Control,m: Message) -> (bool,Message)
Processes a key message and generates the appropriate control events.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyMessage(self,*args):
"""
ProcessKeyMessage(self: Control,m: Message) -> (bool,Message)
Processes a keyboard message.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyPreview(self,*args):
"""
ProcessKeyPreview(self: Control,m: Message) -> (bool,Message)
Previews a keyboard message.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessMnemonic(self,*args):
"""
ProcessMnemonic(self: ContainerControl,charCode: Char) -> bool
charCode: The character to process.
Returns: true if the character was processed as a mnemonic by the control; otherwise,false.
"""
pass
def ProcessTabKey(self,*args):
"""
ProcessTabKey(self: ContainerControl,forward: bool) -> bool
Selects the next available control and makes it the active control.
forward: true to cycle forward through the controls in the System.Windows.Forms.ContainerControl;
otherwise,false.
Returns: true if a control is selected; otherwise,false.
"""
pass
def RaiseDragEvent(self,*args):
"""
RaiseDragEvent(self: Control,key: object,e: DragEventArgs)
Raises the appropriate drag event.
key: The event to raise.
e: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def RaiseKeyEvent(self,*args):
"""
RaiseKeyEvent(self: Control,key: object,e: KeyEventArgs)
Raises the appropriate key event.
key: The event to raise.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def RaiseMouseEvent(self,*args):
"""
RaiseMouseEvent(self: Control,key: object,e: MouseEventArgs)
Raises the appropriate mouse event.
key: The event to raise.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def RaisePaintEvent(self,*args):
"""
RaisePaintEvent(self: Control,key: object,e: PaintEventArgs)
Raises the appropriate paint event.
key: The event to raise.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def RecreateHandle(self,*args):
"""
RecreateHandle(self: Control)
Forces the re-creation of the handle for the control.
"""
pass
def RemoveAllLayouts(self):
""" RemoveAllLayouts(self: GH_Ribbon) """
pass
def RemoveLayout(self,file):
""" RemoveLayout(self: GH_Ribbon,file: str) -> bool """
pass
def RenderFocusRectangle(self,g):
""" RenderFocusRectangle(self: GH_Ribbon,g: Graphics) """
pass
def RescaleConstantsForDpi(self,*args):
""" RescaleConstantsForDpi(self: Control,deviceDpiOld: int,deviceDpiNew: int) """
pass
def ResetMouseEventArgs(self,*args):
"""
ResetMouseEventArgs(self: Control)
Resets the control to handle the System.Windows.Forms.Control.MouseLeave event.
"""
pass
def RtlTranslateAlignment(self,*args):
"""
RtlTranslateAlignment(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
RtlTranslateAlignment(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
RtlTranslateAlignment(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
def RtlTranslateContent(self,*args):
"""
RtlTranslateContent(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
"""
pass
def RtlTranslateHorizontal(self,*args):
"""
RtlTranslateHorizontal(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
def RtlTranslateLeftRight(self,*args):
"""
RtlTranslateLeftRight(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
"""
pass
def ScaleControl(self,*args):
"""
ScaleControl(self: ScrollableControl,factor: SizeF,specified: BoundsSpecified)
factor: The factor by which the height and width of the control will be scaled.
specified: A System.Windows.Forms.BoundsSpecified value that specifies the bounds of the control to use
when defining its size and position.
"""
pass
def ScaleCore(self,*args):
"""
ScaleCore(self: ScrollableControl,dx: Single,dy: Single)
dx: The horizontal scaling factor.
dy: The vertical scaling factor.
"""
pass
def ScrollToControl(self,*args):
"""
ScrollToControl(self: ScrollableControl,activeControl: Control) -> Point
Calculates the scroll offset to the specified child control.
activeControl: The child control to scroll into view.
Returns: The upper-left hand System.Drawing.Point of the display area relative to the client area
required to scroll the control into view.
"""
pass
 def Select(self,*args):
"""
Select(self: ContainerControl,directed: bool,forward: bool)
directed: true to specify the direction of the control to select; otherwise,false.
forward: true to move forward in the tab order; false to move backward in the tab order.
"""
pass
def SetAutoSizeMode(self,*args):
"""
SetAutoSizeMode(self: Control,mode: AutoSizeMode)
Sets a value indicating how a control will behave when its System.Windows.Forms.Control.AutoSize
property is enabled.
mode: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def SetBoundsCore(self,*args):
"""
SetBoundsCore(self: Control,x: int,y: int,width: int,height: int,specified: BoundsSpecified)
Performs the work of setting the specified bounds of this control.
x: The new System.Windows.Forms.Control.Left property value of the control.
y: The new System.Windows.Forms.Control.Top property value of the control.
width: The new System.Windows.Forms.Control.Width property value of the control.
height: The new System.Windows.Forms.Control.Height property value of the control.
specified: A bitwise combination of the System.Windows.Forms.BoundsSpecified values.
"""
pass
def SetClientSizeCore(self,*args):
"""
SetClientSizeCore(self: Control,x: int,y: int)
Sets the size of the client area of the control.
x: The client area width,in pixels.
y: The client area height,in pixels.
"""
pass
def SetDisplayRectLocation(self,*args):
"""
SetDisplayRectLocation(self: ScrollableControl,x: int,y: int)
Positions the display window to the specified value.
x: The horizontal offset at which to position the System.Windows.Forms.ScrollableControl.
y: The vertical offset at which to position the System.Windows.Forms.ScrollableControl.
"""
pass
def SetScrollState(self,*args):
"""
SetScrollState(self: ScrollableControl,bit: int,value: bool)
Sets the specified scroll state flag.
bit: The scroll state flag to set.
value: The value to set the flag.
"""
pass
def SetStyle(self,*args):
"""
SetStyle(self: Control,flag: ControlStyles,value: bool)
Sets a specified System.Windows.Forms.ControlStyles flag to either true or false.
flag: The System.Windows.Forms.ControlStyles bit to set.
value: true to apply the specified style to the control; otherwise,false.
"""
pass
def SetTopLevel(self,*args):
"""
SetTopLevel(self: Control,value: bool)
Sets the control as the top-level control.
value: true to set the control as the top-level control; otherwise,false.
"""
pass
def SetVisibleCore(self,*args):
"""
SetVisibleCore(self: Control,value: bool)
Sets the control to the specified visible state.
value: true to make the control visible; otherwise,false.
"""
pass
def ShowLayout(self,path):
""" ShowLayout(self: GH_Ribbon,path: str) """
pass
def SizeFromClientSize(self,*args):
"""
SizeFromClientSize(self: Control,clientSize: Size) -> Size
Determines the size of the entire control from the height and width of its client area.
clientSize: A System.Drawing.Size value representing the height and width of the control's client area.
Returns: A System.Drawing.Size value representing the height and width of the entire control.
"""
pass
def UpdateBounds(self,*args):
"""
UpdateBounds(self: Control,x: int,y: int,width: int,height: int,clientWidth: int,clientHeight: int)
Updates the bounds of the control with the specified size,location,and client size.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
clientWidth: The client System.Drawing.Size.Width of the control.
clientHeight: The client System.Drawing.Size.Height of the control.
UpdateBounds(self: Control,x: int,y: int,width: int,height: int)
Updates the bounds of the control with the specified size and location.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
UpdateBounds(self: Control)
Updates the bounds of the control with the current size and location.
"""
pass
def UpdateDefaultButton(self,*args):
"""
UpdateDefaultButton(self: ContainerControl)
When overridden by a derived class,updates which button is the default button.
"""
pass
def UpdateStyles(self,*args):
"""
UpdateStyles(self: Control)
Forces the assigned styles to be reapplied to the control.
"""
pass
def UpdateZOrder(self,*args):
"""
UpdateZOrder(self: Control)
Updates the control in its parent's z-order.
"""
pass
def WndProc(self,*args):
"""
WndProc(self: UserControl,m: Message) -> Message
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def __enter__(self,*args):
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self,*args):
"""
__exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
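The `__enter__`/`__exit__` pair above is the hook that lets any .NET IDisposable be used in a Python `with` block, with Dispose called when the block exits. A plain-Python sketch of that protocol; the `Disposable` class here is a hypothetical stand-in, not a real .NET type:

```python
# Sketch (assumption: pure-Python stand-in for an IDisposable object).
# __enter__ hands the object to the "with" body; __exit__ disposes it,
# mirroring the IDisposable support described in the stubs above.
class Disposable:
    def __init__(self):
        self.disposed = False

    def Dispose(self):
        self.disposed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_back):
        self.Dispose()

with Disposable() as d:
    pass  # object is live inside the block, disposed on exit
```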
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __str__(self,*args):
pass
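The many `On*` methods stubbed above follow the WinForms raiser pattern: the protected `OnXxx` method fires the corresponding event, and an override should call the base implementation so subscribed handlers still run. A plain-Python sketch of that pattern; `Control`, `Ribbon`, and `click_handlers` are hypothetical stand-ins, not the real WinForms API:

```python
# Sketch (assumption: simplified stand-ins, not System.Windows.Forms types).
# The base raiser invokes all subscribed handlers; the subclass override does
# its own work first, then chains to the base so subscribers are not lost.
class Control:
    def __init__(self):
        self.click_handlers = []

    def on_click(self, e):
        for handler in self.click_handlers:
            handler(self, e)

class Ribbon(Control):
    def __init__(self):
        super().__init__()
        self.clicks = 0

    def on_click(self, e):
        self.clicks += 1      # custom behaviour first
        super().on_click(e)   # then let subscribers run

seen = []
r = Ribbon()
r.click_handlers.append(lambda sender, e: seen.append(e))
r.on_click("evt")
```

Forgetting the `super().on_click(e)` call is the classic bug this pattern warns about: the control still works, but external event subscribers silently stop firing.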
ActiveObject=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ActiveObject(self: GH_Ribbon) -> IGH_RibbonInteractiveObject
Set: ActiveObject(self: GH_Ribbon)=value
"""
ActiveTab=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ActiveTab(self: GH_Ribbon) -> GH_RibbonTab
Set: ActiveTab(self: GH_Ribbon)=value
"""
ActiveTabIndex=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ActiveTabIndex(self: GH_Ribbon) -> int
Set: ActiveTabIndex(self: GH_Ribbon)=value
"""
ActiveTabName=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ActiveTabName(self: GH_Ribbon) -> str
Set: ActiveTabName(self: GH_Ribbon)=value
"""
AutoScaleFactor=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the scaling factor between the current and design-time automatic scaling dimensions.
"""
CanEnableIme=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the System.Windows.Forms.Control.ImeMode property can be set to an active value,to enable IME support.
"""
CanRaiseEvents=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Determines if events can be raised on the control.
"""
CreateParams=property(lambda self: object(),lambda self,v: None,lambda self: None)
DefaultCursor=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the default cursor for the control.
"""
DefaultImeMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the default Input Method Editor (IME) mode supported by the control.
"""
DefaultMargin=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the space,in pixels,that is specified by default between controls.
"""
DefaultMaximumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default maximum size of a control.
"""
DefaultMinimumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default minimum size of a control.
"""
DefaultPadding=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the internal spacing,in pixels,of the contents of a control.
"""
DefaultSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
DesignMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that indicates whether the System.ComponentModel.Component is currently in design mode.
"""
DoubleBuffered=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether this control should redraw its surface using a secondary buffer to reduce or prevent flicker.
"""
Events=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the list of event handlers that are attached to this System.ComponentModel.Component.
"""
FontHeight=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the height of the font of the control.
"""
HScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the horizontal scroll bar is visible.
"""
ImeModeBase=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the IME mode of a control.
"""
IsActiveObject=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: IsActiveObject(self: GH_Ribbon) -> bool
"""
Layouts=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Layouts(self: GH_Ribbon) -> List[str]
"""
RenderRightToLeft=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""This property is now obsolete.
"""
ResizeRedraw=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the control redraws itself when resized.
"""
ScaleChildren=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that determines the scaling of child controls.
"""
ShowFocusCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the control should display focus rectangles.
"""
ShowKeyboardCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the user interface is in the appropriate state to show or hide keyboard accelerators.
"""
Tabs=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Tabs(self: GH_Ribbon) -> List[GH_RibbonTab]
"""
TooltipDelay=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: TooltipDelay(self: GH_Ribbon) -> int
Set: TooltipDelay(self: GH_Ribbon)=value
"""
VScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the vertical scroll bar is visible.
"""
GH_RibbonControlFocusResult=None
m_focus_item=None
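# A minimal usage sketch for the GH_Ribbon members stubbed above. These stubs
# exist for autocomplete only; the sketch assumes a live Grasshopper session
# and a `ribbon` reference obtained from the running editor (hypothetical):
#
#   ribbon = ...                      # obtain the GH_Ribbon control at runtime
#   for tab in ribbon.Tabs:           # Tabs -> List[GH_RibbonTab]
#       print(tab)
#   ribbon.ActiveTabName = 'Params'   # switch the visible ribbon tab by name
#   ribbon.TooltipDelay = 500         # delay, in milliseconds, before tooltips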
class GH_RibbonContentBase(object):
# no doc
def Contains(self,pt):
""" Contains(self: GH_RibbonContentBase,pt: Point) -> bool """
pass
def PerformLayout(self):
""" PerformLayout(self: GH_RibbonContentBase) """
pass
ClientRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ClientRegion(self: GH_RibbonContentBase) -> Rectangle
Set: ClientRegion(self: GH_RibbonContentBase)=value
"""
ContentRegion=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ContentRegion(self: GH_RibbonContentBase) -> Rectangle
Set: ContentRegion(self: GH_RibbonContentBase)=value
"""
Margin=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Margin(self: GH_RibbonContentBase) -> Padding
Set: Margin(self: GH_RibbonContentBase)=value
"""
Margin_Bottom=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Margin_Bottom(self: GH_RibbonContentBase) -> int
Set: Margin_Bottom(self: GH_RibbonContentBase)=value
"""
Margin_Left=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Margin_Left(self: GH_RibbonContentBase) -> int
Set: Margin_Left(self: GH_RibbonContentBase)=value
"""
Margin_Right=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Margin_Right(self: GH_RibbonContentBase) -> int
Set: Margin_Right(self: GH_RibbonContentBase)=value
"""
Margin_Top=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Margin_Top(self: GH_RibbonContentBase) -> int
Set: Margin_Top(self: GH_RibbonContentBase)=value
"""
Padding=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Padding(self: GH_RibbonContentBase) -> Padding
Set: Padding(self: GH_RibbonContentBase)=value
"""
Padding_Bottom=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Padding_Bottom(self: GH_RibbonContentBase) -> int
Set: Padding_Bottom(self: GH_RibbonContentBase)=value
"""
Padding_Left=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Padding_Left(self: GH_RibbonContentBase) -> int
Set: Padding_Left(self: GH_RibbonContentBase)=value
"""
Padding_Right=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Padding_Right(self: GH_RibbonContentBase) -> int
Set: Padding_Right(self: GH_RibbonContentBase)=value
"""
Padding_Top=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Padding_Top(self: GH_RibbonContentBase) -> int
Set: Padding_Top(self: GH_RibbonContentBase)=value
"""
Visible=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Visible(self: GH_RibbonContentBase) -> bool
Set: Visible(self: GH_RibbonContentBase)=value
"""
Region=None
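# A hedged sketch of how the layout members above interact, for a hypothetical
# `item` of a GH_RibbonContentBase subclass (not runnable outside Grasshopper):
#
#   item.Margin_Left = 4        # per-side accessors mirror the Margin Padding value
#   item.Padding = Padding(2)   # System.Windows.Forms.Padding, all four sides
#   item.PerformLayout()        # recompute the ClientRegion/ContentRegion rectangles
#   if item.Contains(pt):       # hit-test a System.Drawing.Point against the content
#       pass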
class GH_RibbonDropdown(Form,IComponent,IDisposable,IOleControl,IOleObject,IOleInPlaceObject,IOleInPlaceActiveObject,IOleWindow,IViewObject,IViewObject2,IPersist,IPersistStreamInit,IPersistPropertyBag,IPersistStorage,IQuickActivate,ISupportOleDropSource,IDropTarget,ISynchronizeInvoke,IWin32Window,IArrangedElement,IBindableComponent,IContainerControl):
""" GH_RibbonDropdown() """
def AccessibilityNotifyClients(self,*args):
"""
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,objectID: int,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control.
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
objectID: The identifier of the System.Windows.Forms.AccessibleObject.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
AccessibilityNotifyClients(self: Control,accEvent: AccessibleEvents,childID: int)
Notifies the accessibility client applications of the specified
System.Windows.Forms.AccessibleEvents for the specified child control.
accEvent: The System.Windows.Forms.AccessibleEvents to notify the accessibility client applications of.
childID: The child System.Windows.Forms.Control to notify of the accessible event.
"""
pass
def ActivateMdiChild(self,*args):
"""
ActivateMdiChild(self: Form,form: Form)
Activates the MDI child of a form.
form: The child form to activate.
"""
pass
def AdjustFormScrollbars(self,*args):
"""
AdjustFormScrollbars(self: Form,displayScrollbars: bool)
Adjusts the scroll bars on the container based on the current control positions and the control
currently selected.
displayScrollbars: true to show the scroll bars; otherwise, false.
"""
pass
def ApplyAutoScaling(self,*args):
"""
ApplyAutoScaling(self: Form)
Resizes the form according to the current value of the
System.Windows.Forms.Form.AutoScaleBaseSize property and the size of the current font.
"""
pass
def CenterToParent(self,*args):
"""
CenterToParent(self: Form)
Centers the position of the form within the bounds of the parent form.
"""
pass
def CenterToScreen(self,*args):
"""
CenterToScreen(self: Form)
Centers the form on the current screen.
"""
pass
def CreateAccessibilityInstance(self,*args):
"""
CreateAccessibilityInstance(self: Control) -> AccessibleObject
Creates a new accessibility object for the control.
Returns: A new System.Windows.Forms.AccessibleObject for the control.
"""
pass
def CreateControlsInstance(self,*args):
"""
CreateControlsInstance(self: Form) -> ControlCollection
Returns: A new instance of System.Windows.Forms.Control.ControlCollection assigned to the control.
"""
pass
def CreateHandle(self,*args):
"""
CreateHandle(self: Form)
Creates the handle for the form. If a derived class overrides this function, it must call the
base implementation.
"""
pass
def DefWndProc(self,*args):
"""
DefWndProc(self: Form,m: Message) -> Message
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def DestroyHandle(self,*args):
"""
DestroyHandle(self: Control)
Destroys the handle associated with the control.
"""
pass
def Dispose(self):
""" Dispose(self: GH_RibbonDropdown,disposing: bool) """
pass
def FadeOut(self,*args):
""" FadeOut(self: GH_RibbonDropdown) """
pass
def FindIconRectangle(self,id,rectangle):
""" FindIconRectangle(self: GH_RibbonDropdown,id: Guid) -> (bool,Rectangle) """
pass
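# Note on the out-parameter convention used by these stubs: .NET methods with
# `out`/`ref` parameters are surfaced in IronPython as extra return values, which
# is why the docstring above reads `FindIconRectangle(... id: Guid) -> (bool,Rectangle)`.
# A sketch with a hypothetical `dropdown` instance:
#
#   found, rect = dropdown.FindIconRectangle(some_guid)
#   if found:
#       print(rect)   # the System.Drawing.Rectangle for that icon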
def GetAccessibilityObjectById(self,*args):
"""
GetAccessibilityObjectById(self: Control,objectId: int) -> AccessibleObject
Retrieves the specified System.Windows.Forms.AccessibleObject.
objectId: An Int32 that identifies the System.Windows.Forms.AccessibleObject to retrieve.
Returns: An System.Windows.Forms.AccessibleObject.
"""
pass
def GetAutoSizeMode(self,*args):
"""
GetAutoSizeMode(self: Control) -> AutoSizeMode
Retrieves a value indicating how a control will behave when its
System.Windows.Forms.Control.AutoSize property is enabled.
Returns: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def GetScaledBounds(self,*args):
"""
GetScaledBounds(self: Form,bounds: Rectangle,factor: SizeF,specified: BoundsSpecified) -> Rectangle
bounds: A System.Drawing.Rectangle that specifies the area for which to retrieve the display bounds.
factor: The height and width of the control's bounds.
specified: One of the values of System.Windows.Forms.BoundsSpecified that specifies the bounds of the
control to use when defining its size and position.
Returns: A System.Drawing.Rectangle representing the bounds within which the control is scaled.
"""
pass
def GetScrollState(self,*args):
"""
GetScrollState(self: ScrollableControl,bit: int) -> bool
Determines whether the specified flag has been set.
bit: The flag to check.
Returns: true if the specified flag has been set; otherwise, false.
"""
pass
def GetService(self,*args):
"""
GetService(self: Component,service: Type) -> object
Returns an object that represents a service provided by the System.ComponentModel.Component or
by its System.ComponentModel.Container.
service: A service provided by the System.ComponentModel.Component.
Returns: An System.Object that represents a service provided by the System.ComponentModel.Component, or
null if the System.ComponentModel.Component does not provide the specified service.
"""
pass
def GetStyle(self,*args):
"""
GetStyle(self: Control,flag: ControlStyles) -> bool
Retrieves the value of the specified control style bit for the control.
flag: The System.Windows.Forms.ControlStyles bit to return the value from.
Returns: true if the specified control style bit is set to true; otherwise, false.
"""
pass
def GetTopLevel(self,*args):
"""
GetTopLevel(self: Control) -> bool
Determines if the control is a top-level control.
Returns: true if the control is a top-level control; otherwise, false.
"""
pass
def InitLayout(self,*args):
"""
InitLayout(self: Control)
Called after the control has been added to another container.
"""
pass
def InvokeGotFocus(self,*args):
"""
InvokeGotFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeLostFocus(self,*args):
"""
InvokeLostFocus(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokeOnClick(self,*args):
"""
InvokeOnClick(self: Control,toInvoke: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event for the specified control.
toInvoke: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Click event to.
e: An System.EventArgs that contains the event data.
"""
pass
def InvokePaint(self,*args):
"""
InvokePaint(self: Control,c: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def InvokePaintBackground(self,*args):
"""
InvokePaintBackground(self: Control,c: Control,e: PaintEventArgs)
Raises the PaintBackground event for the specified control.
c: The System.Windows.Forms.Control to assign the System.Windows.Forms.Control.Paint event to.
e: An System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def IsInputChar(self,*args):
"""
IsInputChar(self: Control,charCode: Char) -> bool
Determines if a character is an input character that the control recognizes.
charCode: The character to test.
Returns: true if the character should be sent directly to the control and not preprocessed; otherwise, false.
"""
pass
def IsInputKey(self,*args):
"""
IsInputKey(self: Control,keyData: Keys) -> bool
Determines whether the specified key is a regular input key or a special key that requires
preprocessing.
keyData: One of the System.Windows.Forms.Keys values.
Returns: true if the specified key is a regular input key; otherwise, false.
"""
pass
def MemberwiseClone(self,*args):
"""
MemberwiseClone(self: MarshalByRefObject,cloneIdentity: bool) -> MarshalByRefObject
Creates a shallow copy of the current System.MarshalByRefObject object.
cloneIdentity: false to delete the current System.MarshalByRefObject object's identity, which will cause the
object to be assigned a new identity when it is marshaled across a remoting boundary. A value of
false is usually appropriate. true to copy the current System.MarshalByRefObject object's
identity to its clone, which will cause remoting client calls to be routed to the remote server
object.
Returns: A shallow copy of the current System.MarshalByRefObject object.
MemberwiseClone(self: object) -> object
Creates a shallow copy of the current System.Object.
Returns: A shallow copy of the current System.Object.
"""
pass
def NotifyInvalidate(self,*args):
"""
NotifyInvalidate(self: Control,invalidatedArea: Rectangle)
Raises the System.Windows.Forms.Control.Invalidated event with a specified region of the control
to invalidate.
invalidatedArea: A System.Drawing.Rectangle representing the area to invalidate.
"""
pass
def OnActivated(self,*args):
"""
OnActivated(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.Activated event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnAutoSizeChanged(self,*args):
"""
OnAutoSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.AutoSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnAutoValidateChanged(self,*args):
"""
OnAutoValidateChanged(self: ContainerControl,e: EventArgs)
Raises the System.Windows.Forms.ContainerControl.AutoValidateChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackColorChanged(self,*args):
"""
OnBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBackgroundImageChanged(self,*args):
"""
OnBackgroundImageChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event.
e: An System.EventArgs that contains the data.
"""
pass
def OnBackgroundImageLayoutChanged(self,*args):
"""
OnBackgroundImageLayoutChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageLayoutChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnBindingContextChanged(self,*args):
"""
OnBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnCausesValidationChanged(self,*args):
"""
OnCausesValidationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CausesValidationChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnChangeUICues(self,*args):
"""
OnChangeUICues(self: Control,e: UICuesEventArgs)
Raises the System.Windows.Forms.Control.ChangeUICues event.
e: A System.Windows.Forms.UICuesEventArgs that contains the event data.
"""
pass
def OnClick(self,*args):
"""
OnClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Click event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnClientSizeChanged(self,*args):
"""
OnClientSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ClientSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnClosed(self,*args):
"""
OnClosed(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.Closed event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnClosing(self,*args):
"""
OnClosing(self: Form,e: CancelEventArgs)
Raises the System.Windows.Forms.Form.Closing event.
e: A System.ComponentModel.CancelEventArgs that contains the event data.
"""
pass
def OnContextMenuChanged(self,*args):
"""
OnContextMenuChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnContextMenuStripChanged(self,*args):
"""
OnContextMenuStripChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ContextMenuStripChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnControlAdded(self,*args):
"""
OnControlAdded(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlAdded event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnControlRemoved(self,*args):
"""
OnControlRemoved(self: Control,e: ControlEventArgs)
Raises the System.Windows.Forms.Control.ControlRemoved event.
e: A System.Windows.Forms.ControlEventArgs that contains the event data.
"""
pass
def OnCreateControl(self,*args):
"""
OnCreateControl(self: Form)
Raises the CreateControl event.
"""
pass
def OnCursorChanged(self,*args):
"""
OnCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDeactivate(self,*args):
"""
OnDeactivate(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.Deactivate event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnDockChanged(self,*args):
"""
OnDockChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DockChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDoubleClick(self,*args):
"""
OnDoubleClick(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DoubleClick event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDpiChanged(self,*args):
""" OnDpiChanged(self: Form,e: DpiChangedEventArgs) """
pass
def OnDpiChangedAfterParent(self,*args):
""" OnDpiChangedAfterParent(self: Control,e: EventArgs) """
pass
def OnDpiChangedBeforeParent(self,*args):
""" OnDpiChangedBeforeParent(self: Control,e: EventArgs) """
pass
def OnDragDrop(self,*args):
"""
OnDragDrop(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragDrop event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragEnter(self,*args):
"""
OnDragEnter(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragEnter event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnDragLeave(self,*args):
"""
OnDragLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.DragLeave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnDragOver(self,*args):
"""
OnDragOver(self: Control,drgevent: DragEventArgs)
Raises the System.Windows.Forms.Control.DragOver event.
drgevent: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def OnEnabledChanged(self,*args):
"""
OnEnabledChanged(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnEnter(self,*args):
"""
OnEnter(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Control.Enter event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnFontChanged(self,*args):
"""
OnFontChanged(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnForeColorChanged(self,*args):
"""
OnForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnFormClosed(self,*args):
"""
OnFormClosed(self: Form,e: FormClosedEventArgs)
Raises the System.Windows.Forms.Form.FormClosed event.
e: A System.Windows.Forms.FormClosedEventArgs that contains the event data.
"""
pass
def OnFormClosing(self,*args):
"""
OnFormClosing(self: Form,e: FormClosingEventArgs)
Raises the System.Windows.Forms.Form.FormClosing event.
e: A System.Windows.Forms.FormClosingEventArgs that contains the event data.
"""
pass
def OnGetDpiScaledSize(self,*args):
""" OnGetDpiScaledSize(self: Form,deviceDpiOld: int,deviceDpiNew: int,desiredSize: Size) -> (bool,Size) """
pass
def OnGiveFeedback(self,*args):
"""
OnGiveFeedback(self: Control,gfbevent: GiveFeedbackEventArgs)
Raises the System.Windows.Forms.Control.GiveFeedback event.
gfbevent: A System.Windows.Forms.GiveFeedbackEventArgs that contains the event data.
"""
pass
def OnGotFocus(self,*args):
"""
OnGotFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.GotFocus event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnHandleCreated(self,*args):
"""
OnHandleCreated(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnHandleDestroyed(self,*args):
"""
OnHandleDestroyed(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnHelpButtonClicked(self,*args):
"""
OnHelpButtonClicked(self: Form,e: CancelEventArgs)
Raises the System.Windows.Forms.Form.HelpButtonClicked event.
e: A System.ComponentModel.CancelEventArgs that contains the event data.
"""
pass
def OnHelpRequested(self,*args):
"""
OnHelpRequested(self: Control,hevent: HelpEventArgs)
Raises the System.Windows.Forms.Control.HelpRequested event.
hevent: A System.Windows.Forms.HelpEventArgs that contains the event data.
"""
pass
def OnImeModeChanged(self,*args):
"""
OnImeModeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ImeModeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnInputLanguageChanged(self,*args):
"""
OnInputLanguageChanged(self: Form,e: InputLanguageChangedEventArgs)
Raises the System.Windows.Forms.Form.InputLanguageChanged event.
e: The System.Windows.Forms.InputLanguageChangedEventArgs that contains the event data.
"""
pass
def OnInputLanguageChanging(self,*args):
"""
OnInputLanguageChanging(self: Form,e: InputLanguageChangingEventArgs)
Raises the System.Windows.Forms.Form.InputLanguageChanging event.
e: The System.Windows.Forms.InputLanguageChangingEventArgs that contains the event data.
"""
pass
def OnInvalidated(self,*args):
"""
OnInvalidated(self: Control,e: InvalidateEventArgs)
Raises the System.Windows.Forms.Control.Invalidated event.
e: An System.Windows.Forms.InvalidateEventArgs that contains the event data.
"""
pass
def OnKeyDown(self,*args):
"""
OnKeyDown(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyDown event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnKeyPress(self,*args):
"""
OnKeyPress(self: Control,e: KeyPressEventArgs)
Raises the System.Windows.Forms.Control.KeyPress event.
e: A System.Windows.Forms.KeyPressEventArgs that contains the event data.
"""
pass
def OnKeyUp(self,*args):
"""
OnKeyUp(self: Control,e: KeyEventArgs)
Raises the System.Windows.Forms.Control.KeyUp event.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def OnLayout(self,*args):
"""
OnLayout(self: Form,levent: LayoutEventArgs)
Raises the System.Windows.Forms.Control.Layout event.
levent: The event data.
"""
pass
def OnLeave(self,*args):
"""
OnLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Leave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnLoad(self,*args):
"""
OnLoad(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.Load event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnLocationChanged(self,*args):
"""
OnLocationChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LocationChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnLostFocus(self,*args):
"""
OnLostFocus(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.LostFocus event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMarginChanged(self,*args):
"""
OnMarginChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MarginChanged event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnMaximizedBoundsChanged(self,*args):
"""
OnMaximizedBoundsChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MaximizedBoundsChanged event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnMaximumSizeChanged(self,*args):
"""
OnMaximumSizeChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MaximumSizeChanged event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnMdiChildActivate(self,*args):
"""
OnMdiChildActivate(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MdiChildActivate event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnMenuComplete(self,*args):
"""
OnMenuComplete(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MenuComplete event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnMenuStart(self,*args):
"""
OnMenuStart(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MenuStart event.
e: The System.EventArgs that contains the event data.
"""
pass
def OnMinimumSizeChanged(self,*args):
"""
OnMinimumSizeChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.MinimumSizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseCaptureChanged(self,*args):
"""
OnMouseCaptureChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseCaptureChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseClick(self,*args):
"""
OnMouseClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseClick event.
e: An System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDoubleClick(self,*args):
"""
OnMouseDoubleClick(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseDoubleClick event.
e: An System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseDown(self,*args):
"""
OnMouseDown(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseDown event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseEnter(self,*args):
"""
OnMouseEnter(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseEnter event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseHover(self,*args):
"""
OnMouseHover(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseHover event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseLeave(self,*args):
"""
OnMouseLeave(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.MouseLeave event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnMouseMove(self,*args):
"""
OnMouseMove(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseMove event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseUp(self,*args):
"""
OnMouseUp(self: Control,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseUp event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMouseWheel(self,*args):
"""
OnMouseWheel(self: ScrollableControl,e: MouseEventArgs)
Raises the System.Windows.Forms.Control.MouseWheel event.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def OnMove(self,*args):
"""
OnMove(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Move event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnNotifyMessage(self,*args):
"""
OnNotifyMessage(self: Control,m: Message)
Notifies the control of Windows messages.
m: A System.Windows.Forms.Message that represents the Windows message.
"""
pass
def OnPaddingChanged(self,*args):
"""
OnPaddingChanged(self: ScrollableControl,e: EventArgs)
Raises the System.Windows.Forms.Control.PaddingChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnPaint(self,*args):
"""
OnPaint(self: Form,e: PaintEventArgs)
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def OnPaintBackground(self,*args):
"""
OnPaintBackground(self: ScrollableControl,e: PaintEventArgs)
Paints the background of the control.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def OnParentBackColorChanged(self,*args):
"""
OnParentBackColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackColorChanged event when the
System.Windows.Forms.Control.BackColor property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentBackgroundImageChanged(self,*args):
"""
OnParentBackgroundImageChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BackgroundImageChanged event when the
System.Windows.Forms.Control.BackgroundImage property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentBindingContextChanged(self,*args):
"""
OnParentBindingContextChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.BindingContextChanged event when the
System.Windows.Forms.Control.BindingContext property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentChanged(self,*args):
"""
OnParentChanged(self: ContainerControl,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentCursorChanged(self,*args):
"""
OnParentCursorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.CursorChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentEnabledChanged(self,*args):
"""
OnParentEnabledChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.EnabledChanged event when the
System.Windows.Forms.Control.Enabled property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentFontChanged(self,*args):
"""
OnParentFontChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.FontChanged event when the
System.Windows.Forms.Control.Font property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentForeColorChanged(self,*args):
"""
OnParentForeColorChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.ForeColorChanged event when the
System.Windows.Forms.Control.ForeColor property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentRightToLeftChanged(self,*args):
"""
OnParentRightToLeftChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RightToLeftChanged event when the
System.Windows.Forms.Control.RightToLeft property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnParentVisibleChanged(self,*args):
"""
OnParentVisibleChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.VisibleChanged event when the
System.Windows.Forms.Control.Visible property value of the control's container changes.
e: An System.EventArgs that contains the event data.
"""
pass
def OnPreviewKeyDown(self,*args):
"""
OnPreviewKeyDown(self: Control,e: PreviewKeyDownEventArgs)
Raises the System.Windows.Forms.Control.PreviewKeyDown event.
e: A System.Windows.Forms.PreviewKeyDownEventArgs that contains the event data.
"""
pass
def OnPrint(self,*args):
"""
OnPrint(self: Control,e: PaintEventArgs)
Raises the System.Windows.Forms.Control.Paint event.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def OnQueryContinueDrag(self,*args):
"""
OnQueryContinueDrag(self: Control,qcdevent: QueryContinueDragEventArgs)
Raises the System.Windows.Forms.Control.QueryContinueDrag event.
qcdevent: A System.Windows.Forms.QueryContinueDragEventArgs that contains the event data.
"""
pass
def OnRegionChanged(self,*args):
"""
OnRegionChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.RegionChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnResize(self,*args):
"""
OnResize(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnResizeBegin(self,*args):
"""
OnResizeBegin(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.ResizeBegin event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnResizeEnd(self,*args):
"""
OnResizeEnd(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.ResizeEnd event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnRightToLeftChanged(self,*args):
"""
OnRightToLeftChanged(self: ScrollableControl,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnRightToLeftLayoutChanged(self,*args):
"""
OnRightToLeftLayoutChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.RightToLeftLayoutChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnScroll(self,*args):
"""
OnScroll(self: ScrollableControl,se: ScrollEventArgs)
Raises the System.Windows.Forms.ScrollableControl.Scroll event.
se: A System.Windows.Forms.ScrollEventArgs that contains the event data.
"""
pass
def OnShown(self,*args):
"""
OnShown(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Form.Shown event.
e: A System.EventArgs that contains the event data.
"""
pass
def OnSizeChanged(self,*args):
"""
OnSizeChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SizeChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnStyleChanged(self,*args):
"""
OnStyleChanged(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnSystemColorsChanged(self,*args):
"""
OnSystemColorsChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.SystemColorsChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTabIndexChanged(self,*args):
"""
OnTabIndexChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabIndexChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTabStopChanged(self,*args):
"""
OnTabStopChanged(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.TabStopChanged event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnTextChanged(self,*args):
"""
OnTextChanged(self: Form,e: EventArgs)
e: An System.EventArgs that contains the event data.
"""
pass
def OnValidated(self,*args):
"""
OnValidated(self: Control,e: EventArgs)
Raises the System.Windows.Forms.Control.Validated event.
e: An System.EventArgs that contains the event data.
"""
pass
def OnValidating(self,*args):
"""
OnValidating(self: Control,e: CancelEventArgs)
Raises the System.Windows.Forms.Control.Validating event.
e: A System.ComponentModel.CancelEventArgs that contains the event data.
"""
pass
def OnVisibleChanged(self,*args):
"""
OnVisibleChanged(self: Form,e: EventArgs)
Raises the System.Windows.Forms.Control.VisibleChanged event.
e: The System.EventArgs that contains the event data.
"""
pass
def PopulateContent(self,proxies):
""" PopulateContent(self: GH_RibbonDropdown,proxies: List[IGH_ObjectProxy]) """
pass
def ProcessCmdKey(self,*args):
"""
ProcessCmdKey(self: Form,msg: Message,keyData: Keys) -> (bool,Message)
Processes a command key.
msg: A System.Windows.Forms.Message,passed by reference,that represents the Win32 message to
process.
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the keystroke was processed and consumed by the control; otherwise,false to allow
further processing.
"""
pass
def ProcessDialogChar(self,*args):
"""
ProcessDialogChar(self: Form,charCode: Char) -> bool
Processes a dialog character.
charCode: The character to process.
Returns: true if the character was processed by the control; otherwise,false.
"""
pass
def ProcessDialogKey(self,*args):
"""
ProcessDialogKey(self: Form,keyData: Keys) -> bool
Processes a dialog box key.
keyData: One of the System.Windows.Forms.Keys values that represents the key to process.
Returns: true if the keystroke was processed and consumed by the control; otherwise,false to allow
further processing.
"""
pass
def ProcessKeyEventArgs(self,*args):
"""
ProcessKeyEventArgs(self: Control,m: Message) -> (bool,Message)
Processes a key message and generates the appropriate control events.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyMessage(self,*args):
"""
ProcessKeyMessage(self: Control,m: Message) -> (bool,Message)
Processes a keyboard message.
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessKeyPreview(self,*args):
"""
ProcessKeyPreview(self: Form,m: Message) -> (bool,Message)
m: A System.Windows.Forms.Message,passed by reference,that represents the window message to
process.
Returns: true if the message was processed by the control; otherwise,false.
"""
pass
def ProcessMnemonic(self,*args):
"""
ProcessMnemonic(self: Form,charCode: Char) -> bool
Processes a mnemonic character.
charCode: The character to process.
Returns: true if the character was processed as a mnemonic by the control; otherwise,false.
"""
pass
def ProcessTabKey(self,*args):
"""
ProcessTabKey(self: Form,forward: bool) -> bool
forward: true to cycle forward through the controls in the System.Windows.Forms.ContainerControl;
otherwise,false.
Returns: true if a control is selected; otherwise,false.
"""
pass
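# The Process* methods above (ProcessCmdKey,ProcessDialogChar,ProcessDialogKey,
# ProcessKeyPreview,ProcessMnemonic,ProcessTabKey) share one contract: return
# True to consume the keystroke and stop routing,False to let processing
# continue. A minimal pure-Python sketch of that chain-of-responsibility
# contract follows; the handler names are hypothetical,not part of this API.

```python
def process_key(handlers, key):
    """Offer `key` to each handler in order; stop at the first consumer."""
    for handler in handlers:
        if handler(key):   # True: processed and consumed
            return True
    return False           # no handler consumed it; default processing applies

# Hypothetical handlers: each consumes exactly one key.
consume_escape = lambda key: key == "Escape"
consume_tab = lambda key: key == "Tab"
```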
def RaiseDragEvent(self,*args):
"""
RaiseDragEvent(self: Control,key: object,e: DragEventArgs)
Raises the appropriate drag event.
key: The event to raise.
e: A System.Windows.Forms.DragEventArgs that contains the event data.
"""
pass
def RaiseKeyEvent(self,*args):
"""
RaiseKeyEvent(self: Control,key: object,e: KeyEventArgs)
Raises the appropriate key event.
key: The event to raise.
e: A System.Windows.Forms.KeyEventArgs that contains the event data.
"""
pass
def RaiseMouseEvent(self,*args):
"""
RaiseMouseEvent(self: Control,key: object,e: MouseEventArgs)
Raises the appropriate mouse event.
key: The event to raise.
e: A System.Windows.Forms.MouseEventArgs that contains the event data.
"""
pass
def RaisePaintEvent(self,*args):
"""
RaisePaintEvent(self: Control,key: object,e: PaintEventArgs)
Raises the appropriate paint event.
key: The event to raise.
e: A System.Windows.Forms.PaintEventArgs that contains the event data.
"""
pass
def RecreateHandle(self,*args):
"""
RecreateHandle(self: Control)
Forces the re-creation of the handle for the control.
"""
pass
def RescaleConstantsForDpi(self,*args):
""" RescaleConstantsForDpi(self: Control,deviceDpiOld: int,deviceDpiNew: int) """
pass
def ResetMouseEventArgs(self,*args):
"""
ResetMouseEventArgs(self: Control)
Resets the control to handle the System.Windows.Forms.Control.MouseLeave event.
"""
pass
def RtlTranslateAlignment(self,*args):
"""
RtlTranslateAlignment(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
RtlTranslateAlignment(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
RtlTranslateAlignment(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
def RtlTranslateContent(self,*args):
"""
RtlTranslateContent(self: Control,align: ContentAlignment) -> ContentAlignment
Converts the specified System.Drawing.ContentAlignment to the appropriate
System.Drawing.ContentAlignment to support right-to-left text.
align: One of the System.Drawing.ContentAlignment values.
Returns: One of the System.Drawing.ContentAlignment values.
"""
pass
def RtlTranslateHorizontal(self,*args):
"""
RtlTranslateHorizontal(self: Control,align: HorizontalAlignment) -> HorizontalAlignment
Converts the specified System.Windows.Forms.HorizontalAlignment to the appropriate
System.Windows.Forms.HorizontalAlignment to support right-to-left text.
align: One of the System.Windows.Forms.HorizontalAlignment values.
Returns: One of the System.Windows.Forms.HorizontalAlignment values.
"""
pass
def RtlTranslateLeftRight(self,*args):
"""
RtlTranslateLeftRight(self: Control,align: LeftRightAlignment) -> LeftRightAlignment
Converts the specified System.Windows.Forms.LeftRightAlignment to the appropriate
System.Windows.Forms.LeftRightAlignment to support right-to-left text.
align: One of the System.Windows.Forms.LeftRightAlignment values.
Returns: One of the System.Windows.Forms.LeftRightAlignment values.
"""
pass
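# The four RtlTranslate* helpers above all apply the same rule: when a control
# renders right-to-left,horizontal alignments are mirrored and centered values
# are left alone. A pure-Python sketch of that mirroring,using strings as
# stand-ins for the HorizontalAlignment values (illustrative,not the real API):

```python
def rtl_translate_horizontal(align, right_to_left):
    """Mirror a horizontal alignment for right-to-left rendering."""
    if not right_to_left:
        return align
    # "Center" (and anything unrecognised) passes through unchanged.
    return {"Left": "Right", "Right": "Left"}.get(align, align)
```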
def ScaleControl(self,*args):
"""
ScaleControl(self: Form,factor: SizeF,specified: BoundsSpecified)
Scales the location,size,padding,and margin of a control.
factor: The factor by which the height and width of the control are scaled.
specified: A System.Windows.Forms.BoundsSpecified value that specifies the bounds of the control to use
when defining its size and position.
"""
pass
def ScaleCore(self,*args):
"""
ScaleCore(self: Form,x: Single,y: Single)
Performs scaling of the form.
x: Percentage to scale the form horizontally
y: Percentage to scale the form vertically
"""
pass
def ScrollToControl(self,*args):
"""
ScrollToControl(self: ScrollableControl,activeControl: Control) -> Point
Calculates the scroll offset to the specified child control.
activeControl: The child control to scroll into view.
Returns: The upper-left hand System.Drawing.Point of the display area relative to the client area
required to scroll the control into view.
"""
pass
def Select(self,*args):
"""
Select(self: Form,directed: bool,forward: bool)
Selects this form,and optionally selects the next or previous control.
directed: If set to true,the active control is changed.
forward: If directed is true,then this controls the direction in which focus is moved. If this is true,
then the next control is selected; otherwise,the previous control is selected.
"""
pass
def SetAutoSizeMode(self,*args):
"""
SetAutoSizeMode(self: Control,mode: AutoSizeMode)
Sets a value indicating how a control will behave when its System.Windows.Forms.Control.AutoSize
property is enabled.
mode: One of the System.Windows.Forms.AutoSizeMode values.
"""
pass
def SetBoundsCore(self,*args):
"""
SetBoundsCore(self: Form,x: int,y: int,width: int,height: int,specified: BoundsSpecified)
x: The x-coordinate.
y: The y-coordinate.
width: The bounds width.
height: The bounds height.
specified: A value from the BoundsSpecified enumeration.
"""
pass
def SetClientSizeCore(self,*args):
"""
SetClientSizeCore(self: Form,x: int,y: int)
Sets the client size of the form. This will adjust the bounds of the form to make the client
size the requested size.
x: Requested width of the client region.
y: Requested height of the client region.
"""
pass
def SetDisplayRectLocation(self,*args):
"""
SetDisplayRectLocation(self: ScrollableControl,x: int,y: int)
Positions the display window to the specified value.
x: The horizontal offset at which to position the System.Windows.Forms.ScrollableControl.
y: The vertical offset at which to position the System.Windows.Forms.ScrollableControl.
"""
pass
def SetOwner(self,owner):
""" SetOwner(self: GH_RibbonDropdown,owner: GH_RibbonPanel) """
pass
def SetScrollState(self,*args):
"""
SetScrollState(self: ScrollableControl,bit: int,value: bool)
Sets the specified scroll state flag.
bit: The scroll state flag to set.
value: The value to set the flag.
"""
pass
def SetStyle(self,*args):
"""
SetStyle(self: Control,flag: ControlStyles,value: bool)
Sets a specified System.Windows.Forms.ControlStyles flag to either true or false.
flag: The System.Windows.Forms.ControlStyles bit to set.
value: true to apply the specified style to the control; otherwise,false.
"""
pass
def SetTopLevel(self,*args):
"""
SetTopLevel(self: Control,value: bool)
Sets the control as the top-level control.
value: true to set the control as the top-level control; otherwise,false.
"""
pass
def SetVisibleCore(self,*args):
"""
SetVisibleCore(self: Form,value: bool)
value: true to make the control visible; otherwise,false.
"""
pass
def SizeFromClientSize(self,*args):
"""
SizeFromClientSize(self: Control,clientSize: Size) -> Size
Determines the size of the entire control from the height and width of its client area.
clientSize: A System.Drawing.Size value representing the height and width of the control's client area.
Returns: A System.Drawing.Size value representing the height and width of the entire control.
"""
pass
def UpdateBounds(self,*args):
"""
UpdateBounds(self: Control,x: int,y: int,width: int,height: int,clientWidth: int,clientHeight: int)
Updates the bounds of the control with the specified size,location,and client size.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
clientWidth: The client System.Drawing.Size.Width of the control.
clientHeight: The client System.Drawing.Size.Height of the control.
UpdateBounds(self: Control,x: int,y: int,width: int,height: int)
Updates the bounds of the control with the specified size and location.
x: The System.Drawing.Point.X coordinate of the control.
y: The System.Drawing.Point.Y coordinate of the control.
width: The System.Drawing.Size.Width of the control.
height: The System.Drawing.Size.Height of the control.
UpdateBounds(self: Control)
Updates the bounds of the control with the current size and location.
"""
pass
def UpdateDefaultButton(self,*args):
"""
UpdateDefaultButton(self: Form)
Updates which button is the default button.
"""
pass
def UpdateStyles(self,*args):
"""
UpdateStyles(self: Control)
Forces the assigned styles to be reapplied to the control.
"""
pass
def UpdateZOrder(self,*args):
"""
UpdateZOrder(self: Control)
Updates the control in its parent's z-order.
"""
pass
def WndProc(self,*args):
"""
WndProc(self: Form,m: Message) -> Message
m: The Windows System.Windows.Forms.Message to process.
"""
pass
def __enter__(self,*args):
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self,*args):
"""
__exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
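# The __enter__/__exit__ pair above is how IronPython lets any .NET IDisposable
# participate in a `with` statement: __enter__ returns the object,__exit__
# calls Dispose(). A pure-Python sketch of that protocol,using a hypothetical
# DisposableSketch class in place of a real .NET object:

```python
class DisposableSketch:
    """Hypothetical stand-in for a .NET object implementing IDisposable."""
    def __init__(self):
        self.disposed = False

    def Dispose(self):
        self.disposed = True

    def __enter__(self):
        # The object itself is bound by `with ... as name`.
        return self

    def __exit__(self, exc_type, exc_value, exc_back):
        # Dispose runs on the way out,whether or not an exception occurred.
        self.Dispose()

with DisposableSketch() as d:
    pass
```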
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __str__(self,*args):
pass
AutoClose=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: AutoClose(self: GH_RibbonDropdown) -> bool
Set: AutoClose(self: GH_RibbonDropdown)=value
"""
AutoScaleFactor=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the scaling factor between the current and design-time automatic scaling dimensions.
"""
CanEnableIme=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the System.Windows.Forms.Control.ImeMode property can be set to an active value,to enable IME support.
"""
CanRaiseEvents=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Determines if events can be raised on the control.
"""
CreateParams=property(lambda self: object(),lambda self,v: None,lambda self: None)
DefaultCursor=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the default cursor for the control.
"""
DefaultImeMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the default Input Method Editor (IME) mode supported by the control.
"""
DefaultMargin=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the space,in pixels,that is specified by default between controls.
"""
DefaultMaximumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default maximum size of a control.
"""
DefaultMinimumSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the length and height,in pixels,that is specified as the default minimum size of a control.
"""
DefaultPadding=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the internal spacing,in pixels,of the contents of a control.
"""
DefaultSize=property(lambda self: object(),lambda self,v: None,lambda self: None)
DesignMode=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that indicates whether the System.ComponentModel.Component is currently in design mode.
"""
DoubleBuffered=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether this control should redraw its surface using a secondary buffer to reduce or prevent flicker.
"""
Events=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets the list of event handlers that are attached to this System.ComponentModel.Component.
"""
FontHeight=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the height of the font of the control.
"""
HScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the horizontal scroll bar is visible.
"""
ImeModeBase=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets the IME mode of a control.
"""
MaximizedBounds=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets and sets the size of the form when it is maximized.
"""
RenderRightToLeft=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""This property is now obsolete.
"""
ResizeRedraw=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the control redraws itself when resized.
"""
ScaleChildren=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value that determines the scaling of child controls.
"""
ShowFocusCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the control should display focus rectangles.
"""
ShowKeyboardCues=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets a value indicating whether the user interface is in the appropriate state to show or hide keyboard accelerators.
"""
ShowWithoutActivation=property(lambda self: object(),lambda self,v: None,lambda self: None)
VScroll=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Gets or sets a value indicating whether the vertical scroll bar is visible.
"""
class GH_RibbonItem(GH_RibbonContentBase,IGH_RibbonInteractiveObject,IComparable[GH_RibbonItem]):
""" GH_RibbonItem(item_proxy: IGH_ObjectProxy) """
def CompareTo(self,other):
""" CompareTo(self: GH_RibbonItem,other: GH_RibbonItem) -> int """
pass
def DisplayTooltip(self,e):
""" DisplayTooltip(self: GH_RibbonItem,e: GH_TooltipDisplayEventArgs) """
pass
def MouseClick(self,sender,e):
""" MouseClick(self: GH_RibbonItem,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDoubleClick(self,sender,e):
""" MouseDoubleClick(self: GH_RibbonItem,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDown(self,sender,e):
""" MouseDown(self: GH_RibbonItem,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseMove(self,sender,e):
""" MouseMove(self: GH_RibbonItem,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseUp(self,sender,e):
""" MouseUp(self: GH_RibbonItem,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def PerformLayout(self):
""" PerformLayout(self: GH_RibbonItem) """
pass
def RenderItem(self,g):
""" RenderItem(self: GH_RibbonItem,g: Graphics) """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self,item_proxy):
""" __new__(cls: type,item_proxy: IGH_ObjectProxy) """
pass
Owner=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Owner(self: GH_RibbonItem) -> GH_RibbonPanel
Set: Owner(self: GH_RibbonItem)=value
"""
Proxy=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Proxy(self: GH_RibbonItem) -> IGH_ObjectProxy
Set: Proxy(self: GH_RibbonItem)=value
"""
class GH_RibbonMouseEvent(Enum,IComparable,IFormattable,IConvertible):
""" enum GH_RibbonMouseEvent,values: Handled (2),Handled_Redraw (3),Ignored (1),Unset (0) """
def __eq__(self,*args):
""" x.__eq__(y) <==> x==yx.__eq__(y) <==> x==yx.__eq__(y) <==> x==y """
pass
def __format__(self,*args):
""" __format__(formattable: IFormattable,format: str) -> str """
pass
def __ge__(self,*args):
pass
def __gt__(self,*args):
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __le__(self,*args):
pass
def __lt__(self,*args):
pass
def __ne__(self,*args):
pass
def __reduce_ex__(self,*args):
pass
def __str__(self,*args):
pass
Handled=None
Handled_Redraw=None
Ignored=None
Unset=None
value__=None
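# The Mouse* handlers throughout this module return one of these enum values so
# the ribbon knows whether to stop routing the event and whether to repaint.
# A pure-Python sketch of that dispatch contract: the values are copied from
# the docstring above,but the routing function itself is inferred,not the
# real implementation.

```python
from enum import IntEnum

class RibbonMouseEvent(IntEnum):
    Unset = 0
    Ignored = 1
    Handled = 2
    Handled_Redraw = 3

def route_mouse_event(handlers, e):
    """Offer `e` to each handler until one reports it handled.

    Returns (consumed, needs_redraw).
    """
    for handler in handlers:
        result = handler(e)
        if result in (RibbonMouseEvent.Handled, RibbonMouseEvent.Handled_Redraw):
            return True, result is RibbonMouseEvent.Handled_Redraw
    return False, False

# Hypothetical handlers for demonstration.
ignore = lambda e: RibbonMouseEvent.Ignored
handle_and_redraw = lambda e: RibbonMouseEvent.Handled_Redraw
```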
class GH_RibbonMouseEventArgs(MouseEventArgs):
"""
GH_RibbonMouseEventArgs(e: MouseEventArgs)
GH_RibbonMouseEventArgs(e: MouseEventArgs,nActiveObject: IGH_RibbonInteractiveObject)
"""
def NewActiveObject(self,nObject):
""" NewActiveObject(self: GH_RibbonMouseEventArgs,nObject: IGH_RibbonInteractiveObject) """
pass
def ReleaseActiveObject(self,owner_filter=None):
""" ReleaseActiveObject(self: GH_RibbonMouseEventArgs,owner_filter: IGH_RibbonInteractiveObject)ReleaseActiveObject(self: GH_RibbonMouseEventArgs) """
pass
def Reset(self):
""" Reset(self: GH_RibbonMouseEventArgs) """
pass
@staticmethod
def __new__(self,e,nActiveObject=None):
"""
__new__(cls: type,e: MouseEventArgs)
__new__(cls: type,e: MouseEventArgs,nActiveObject: IGH_RibbonInteractiveObject)
"""
pass
ActiveObject=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: ActiveObject(self: GH_RibbonMouseEventArgs) -> IGH_RibbonInteractiveObject
"""
IsActiveObject=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: IsActiveObject(self: GH_RibbonMouseEventArgs) -> bool
"""
Release=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Release(self: GH_RibbonMouseEventArgs) -> bool
"""
m_activeobject=None
m_release=None
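# GH_RibbonMouseEventArgs carries an "active object" so a single ribbon element
# can capture subsequent mouse events (NewActiveObject) and later release the
# capture (ReleaseActiveObject,optionally filtered by owner). A pure-Python
# sketch of that capture/release pattern; the method names follow the stub
# above,but the logic is inferred,not the real implementation.

```python
class MouseCaptureSketch:
    """Inferred capture/release behaviour of the event-args class."""
    def __init__(self):
        self.active_object = None
        self.release = False

    def NewActiveObject(self, obj):
        self.active_object = obj

    def ReleaseActiveObject(self, owner_filter=None):
        # Without a filter,always release; with one,release only if that
        # object currently owns the capture.
        if owner_filter is None or owner_filter is self.active_object:
            self.active_object = None
            self.release = True
```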
class GH_RibbonPainter(object):
# no doc
@staticmethod
def DropDownBar(rec):
""" DropDownBar(rec: Rectangle) -> GraphicsPath """
pass
@staticmethod
def PanelBottomBar(rec):
""" PanelBottomBar(rec: Rectangle) -> GraphicsPath """
pass
@staticmethod
def PanelInnerBorder(rec):
""" PanelInnerBorder(rec: Rectangle) -> GraphicsPath """
pass
@staticmethod
def PanelOuterBorder(rec):
""" PanelOuterBorder(rec: Rectangle) -> GraphicsPath """
pass
@staticmethod
def TabEdgeBrush(rec):
""" TabEdgeBrush(rec: Rectangle) -> Brush """
pass
@staticmethod
def TabOuterBorder(GripRec,ContentRec):
""" TabOuterBorder(GripRec: Rectangle,ContentRec: Rectangle) -> GraphicsPath """
pass
@staticmethod
def TabPaneBrush(rec,bg):
""" TabPaneBrush(rec: Rectangle,bg: Color) -> Brush """
pass
PanelBarHeight=14
PanelCornerRadius=4
TabGripHeight=22
class GH_RibbonPanel(GH_RibbonContentBase,IGH_RibbonInteractiveObject,IComparable[GH_RibbonPanel]):
"""
GH_RibbonPanel()
GH_RibbonPanel(iName: str)
"""
def AddItem(self,item):
""" AddItem(self: GH_RibbonPanel,item: GH_RibbonItem) -> bool """
pass
def CollapseLastColumn(self):
""" CollapseLastColumn(self: GH_RibbonPanel) -> bool """
pass
def CollapseLeastSignificantColumn(self):
""" CollapseLeastSignificantColumn(self: GH_RibbonPanel) -> bool """
pass
def CompareTo(self,other):
""" CompareTo(self: GH_RibbonPanel,other: GH_RibbonPanel) -> int """
pass
def Contains(self,*__args):
""" Contains(self: GH_RibbonPanel,id: Guid,exposure: GH_Exposure) -> bool """
pass
def DesiredHeight(self,given_height):
""" DesiredHeight(self: GH_RibbonPanel,given_height: int) -> int """
pass
def DisplayDropdown(self,autoCloseDropdown=None):
"""
DisplayDropdown(self: GH_RibbonPanel,autoCloseDropdown: bool) -> GH_RibbonDropdown
DisplayDropdown(self: GH_RibbonPanel)
"""
pass
def DisplayTooltip(self,e):
""" DisplayTooltip(self: GH_RibbonPanel,e: GH_TooltipDisplayEventArgs) """
pass
def IndexAt(self,pt):
""" IndexAt(self: GH_RibbonPanel,pt: Point) -> int """
pass
def ItemAt(self,*__args):
"""
ItemAt(self: GH_RibbonPanel,pt: Point) -> GH_RibbonItem
ItemAt(self: GH_RibbonPanel,index: int) -> GH_RibbonItem
"""
pass
def MinimumWidth(self):
""" MinimumWidth(self: GH_RibbonPanel) -> int """
pass
def MouseClick(self,sender,e):
""" MouseClick(self: GH_RibbonPanel,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDoubleClick(self,sender,e):
""" MouseDoubleClick(self: GH_RibbonPanel,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDown(self,sender,e):
""" MouseDown(self: GH_RibbonPanel,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseMove(self,sender,e):
""" MouseMove(self: GH_RibbonPanel,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseUp(self,sender,e):
""" MouseUp(self: GH_RibbonPanel,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MoveTo(self,x,y):
""" MoveTo(self: GH_RibbonPanel,x: int,y: int) """
pass
def PanelBarFont(self):
""" PanelBarFont(self: GH_RibbonPanel) -> Font """
pass
def PanelBarRegion(self):
""" PanelBarRegion(self: GH_RibbonPanel) -> Rectangle """
pass
def PerformLayout(self):
""" PerformLayout(self: GH_RibbonPanel) """
pass
def RemoveItem(self,item):
""" RemoveItem(self: GH_RibbonPanel,item: GH_RibbonItem) -> bool """
pass
def RenderPanel(self,g):
""" RenderPanel(self: GH_RibbonPanel,g: Graphics) """
pass
def Sort(self):
""" Sort(self: GH_RibbonPanel) """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self,iName=None):
"""
__new__(cls: type)
__new__(cls: type,iName: str)
"""
pass
AllItems=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: AllItems(self: GH_RibbonPanel) -> List[GH_RibbonItem]
"""
Name=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Name(self: GH_RibbonPanel) -> str
Set: Name(self: GH_RibbonPanel)=value
"""
Owner=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Owner(self: GH_RibbonPanel) -> GH_RibbonTab
Set: Owner(self: GH_RibbonPanel)=value
"""
VisibleItems=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: VisibleItems(self: GH_RibbonPanel) -> List[GH_RibbonItem]
"""
class GH_RibbonTab(GH_RibbonContentBase,IGH_RibbonInteractiveObject):
""" GH_RibbonTab(owner: GH_Ribbon,name: str) """
def DisplayTooltip(self,e):
""" DisplayTooltip(self: GH_RibbonTab,e: GH_TooltipDisplayEventArgs) """
pass
def EnsurePanel(self,name):
""" EnsurePanel(self: GH_RibbonTab,name: str) -> GH_RibbonPanel """
pass
def MouseClick(self,sender,e):
""" MouseClick(self: GH_RibbonTab,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDoubleClick(self,sender,e):
""" MouseDoubleClick(self: GH_RibbonTab,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDown(self,sender,e):
""" MouseDown(self: GH_RibbonTab,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseMove(self,sender,e):
""" MouseMove(self: GH_RibbonTab,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseUp(self,sender,e):
""" MouseUp(self: GH_RibbonTab,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def PerformLayout(self):
""" PerformLayout(self: GH_RibbonTab) """
pass
def RenderTab(self,g):
""" RenderTab(self: GH_RibbonTab,g: Graphics) """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod
def __new__(self,owner,name):
""" __new__(cls: type,owner: GH_Ribbon,name: str) """
pass
DisplayStyle=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: DisplayStyle(self: GH_RibbonTab) -> GH_TabDisplay
Set: DisplayStyle(self: GH_RibbonTab)=value
"""
Grip=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Grip(self: GH_RibbonTab) -> Rectangle
Set: Grip(self: GH_RibbonTab)=value
"""
HasIcon=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: HasIcon(self: GH_RibbonTab) -> bool
"""
Icon=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Icon(self: GH_RibbonTab) -> Bitmap
"""
NameFull=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: NameFull(self: GH_RibbonTab) -> str
"""
NameShort=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: NameShort(self: GH_RibbonTab) -> str
"""
NameSymbol=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: NameSymbol(self: GH_RibbonTab) -> str
"""
Owner=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Owner(self: GH_RibbonTab) -> GH_Ribbon
"""
Panels=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Get: Panels(self: GH_RibbonTab) -> List[GH_RibbonPanel]
"""
class GH_TabDisplay(Enum,IComparable,IFormattable,IConvertible):
""" enum GH_TabDisplay,values: FullName (1),Icon (4),None (0),ShortName (2),Symbol (3) """
def __eq__(self,*args):
""" x.__eq__(y) <==> x==yx.__eq__(y) <==> x==yx.__eq__(y) <==> x==y """
pass
def __format__(self,*args):
""" __format__(formattable: IFormattable,format: str) -> str """
pass
def __ge__(self,*args):
pass
def __gt__(self,*args):
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __le__(self,*args):
pass
def __lt__(self,*args):
pass
def __ne__(self,*args):
pass
def __reduce_ex__(self,*args):
pass
def __str__(self,*args):
pass
FullName=None
Icon=None
    # 'None' is a reserved word and cannot be assigned in Python 3; the enum
    # member GH_TabDisplay.None (value 0) is only reachable at runtime via
    # getattr(GH_TabDisplay, 'None').
    #None=None
ShortName=None
Symbol=None
value__=None
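A rough pure-Python analogue of the .NET enum above (an illustrative sketch only — `TabDisplay` is a hypothetical name, and the member `None` is renamed `NONE` because `None` is a Python keyword):

```python
from enum import IntEnum

class TabDisplay(IntEnum):
    # same name -> value mapping documented for GH_TabDisplay
    NONE = 0
    FULL_NAME = 1
    SHORT_NAME = 2
    SYMBOL = 3
    ICON = 4

print(TabDisplay(2).name)    # SHORT_NAME
print(int(TabDisplay.ICON))  # 4
```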
class IGH_RibbonInteractiveObject:
# no doc
def DisplayTooltip(self,e):
""" DisplayTooltip(self: IGH_RibbonInteractiveObject,e: GH_TooltipDisplayEventArgs) """
pass
def MouseClick(self,sender,e):
""" MouseClick(self: IGH_RibbonInteractiveObject,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDoubleClick(self,sender,e):
""" MouseDoubleClick(self: IGH_RibbonInteractiveObject,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseDown(self,sender,e):
""" MouseDown(self: IGH_RibbonInteractiveObject,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseMove(self,sender,e):
""" MouseMove(self: IGH_RibbonInteractiveObject,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def MouseUp(self,sender,e):
""" MouseUp(self: IGH_RibbonInteractiveObject,sender: GH_Ribbon,e: GH_RibbonMouseEventArgs) -> GH_RibbonMouseEvent """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
| 24.657568 | 374 | 0.697403 | 21,432 | 180,666 | 5.819289 | 0.044933 | 0.032665 | 0.080677 | 0.063479 | 0.911577 | 0.897906 | 0.886256 | 0.876843 | 0.872978 | 0.869996 | 0 | 0.00034 | 0.202617 | 180,666 | 7,326 | 375 | 24.660934 | 0.865403 | 0.001212 | 0 | 0.883986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.435587 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
8e7c9825054decc9d5b13e6a9137daa6df64f9a4 | 140 | py | Python | utils/time_tools.py | lizhuxian020/ZXPythonPratice | 7f313a06cc1756fe2ddb1888ae7ec19ef04f2ef5 | [
"MIT"
] | null | null | null | utils/time_tools.py | lizhuxian020/ZXPythonPratice | 7f313a06cc1756fe2ddb1888ae7ec19ef04f2ef5 | [
"MIT"
] | null | null | null | utils/time_tools.py | lizhuxian020/ZXPythonPratice | 7f313a06cc1756fe2ddb1888ae7ec19ef04f2ef5 | [
"MIT"
] | null | null | null | # coding: utf-8
# Author: lee_zix
import time


def get_time():
return time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time()))
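A quick illustrative check (not part of the module) of the `'%Y-%m-%d %H:%M:%S'` format string used by `get_time()` — the rendered timestamp is always 19 characters with fixed separator positions, regardless of timezone:

```python
import time

stamp = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(0))
print(len(stamp))  # 19, e.g. '1970-01-01 00:00:00' when the local zone is UTC
```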
| 15.555556 | 74 | 0.635714 | 24 | 140 | 3.625 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008403 | 0.15 | 140 | 8 | 75 | 17.5 | 0.722689 | 0.207143 | 0 | 0 | 0 | 0 | 0.157407 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
79a4746a93c72789f9d989d63f6706a33292cf98 | 25,607 | py | Python | proxies-generator.py | Elijah2101021/Zero-attacker | ad335f468e2f5cec93852bf63d1a28c950700034 | [
"MIT"
] | 2 | 2021-11-08T15:14:01.000Z | 2022-02-28T00:58:25.000Z | proxies-generator.py | Elijah2101021/Zero-attacker | ad335f468e2f5cec93852bf63d1a28c950700034 | [
"MIT"
] | 1 | 2021-10-09T06:09:21.000Z | 2021-10-11T08:22:06.000Z | proxies-generator.py | Zero-Tool/Zero-attacker | 62d6c8c11c686afe338e67c05d7753f396cdb360 | [
"MIT"
] | 1 | 2021-11-08T15:45:05.000Z | 2021-11-08T15:45:05.000Z | from pytransform import pyarmor_runtime
pyarmor_runtime()
__pyarmor__(__name__, __file__, b'\x50\x59\x41\x52\x4d\x4f\x52\x00\x00\x03\x09\x00\x61\x0d\x0d\x0a\x08\x2d\xa0\x01\x00\x00\x00\x00\x01\x00\x00\x00\x40\x00\x00\x00\xa9\x18\x00\x00\x00\x00\x00\x18\xc0\x0c\x44\x11\x61\xe5\xaa\xb3\x90\xc1\xc0\x53\x4c\x17\x8c\x8e\x00\x00\x00\x00\x00\x00\x00\x00\x8c\x6a\x10\xa0\x5c\x50\x50\xc0\x23\x7a\x34\x7b\xa1\x77\xa1\x91\x70\x1d\x53\x5a\x95\x4c\xa1\xa3\x75\xaa\x53\xac\x66\x32\xef\x5d\xaf\x3f\x7f\xbc\x81\x55\xaa\x1c\x52\x76\xc1\x03\xa0\xc0\x88\x3c\xdb\xef\xe8\x4b\x17\xff\xcc\x2a\x9a\x53\x0b\x4e\x98\x1e\x55\x95\x26\xd4\x5d\xb0\x8a\x14\xc5\x14\x87\x1b\x30\xf3\x1b\x82\x53\x8b\xc7\xb8\x62\x80\x7c\xa9\x37\xc8\x28\x54\xec\x19\x68\x65\x9e\x25\x81\xb2\x4c\x8c\xd0\x51\xc6\x48\xab\x38\xaf\xd5\x58\xa4\xbd\xac\x7a\x32\xe2\x1d\x19\xdf\xb0\x6a\x68\x7c\x97\x10\x26\x5b\xc6\x53\x6d\x94\x02\xf1\xb7\xf3\xb1\x2f\x7c\x9d\xac\x4c\x4a\x2b\x82\xf0\x5c\xcb\x8c\x2c\xdb\xa8\x6e\x8f\xd6\x89\xda\x4f\x1f\xe3\x2d\xef\x66\xdf\xdf\x2d\x3d\x1f\xcd\x96\xab\x73\x99\xdb\xc6\x4f\x28\x94\xd2\x0e\xe6\x4c\xcd\x7f\x2c\x2b\xe5\x6c\x46\x33\xba\xac\x03\x5c\x12\xd1\x6a\x61\x9b\x9e\x31\x6a\x84\x5a\x45\x77\xcb\x0e\xab\x4d\x2d\x0d\x0a\x4e\x17\x03\x2c\x90\x19\x88\x6a\x0f\x96\x17\x1e\xa6\x90\xd4\x14\x6d\xd4\x83\x98\x43\x61\x49\xe6\x41\x0a\x3c\xfd\xd8\xa7\x4b\x44\x02\xb1\xe3\xd3\x39\x92\x74\x41\x9d\x08\xe3\x8e\x98\x52\x9a\xce\xe0\x9e\x91\xe9\x59\x03\x08\xc7\x0f\xd2\x59\xe6\xa5\xea\x3f\x21\x10\xd9\x1d\xab\xf3\xb1\xa1\x59\x08\xf6\x9e\x01\xd3\xb7\x55\x7c\x68\xf7\x50\x79\xf1\x03\x9c\xfc\x70\xa8\x80\xfd\xe1\x44\x52\x62\x36\x9f\x8f\x78\x46\x12\x92\x91\x75\xc6\x52\x9d\x1f\x95\x5e\x55\x79\x39\x3d\x90\x5d\x74\xed\x2c\xa1\xdf\x70\xae\x09\x99\x29\x1d\x0d\x47\x6d\x7e\x9c\x15\x4a\x93\x20\x93\xe0\xfe\x82\x54\x0d\xb1\x14\xed\x07\x50\x80\x71\x07\x8a\x60\x97\x7e\xa7\x54\xc1\x35\xf4\x42\xae\xdb\xeb\x08\x46\xfd\x38\xf2\x53\xb2\xc5\x02\x6e\x52\xc3\xed\x32\x18\x2f\xfb\xc2\x5b\x75\xbe\xc9\xe3\x75\x5c\xb6\x3a\x8b\xa2\x84\xb0\xf6\x05\xd0\x5a\x37\x4e\xac\x8b\x98\x1d\x03\x3c\x2f\x4d\xae\xd6\xc4\xe5\x07\xa8\xed\x
37\xd7\x4f\x69\x87\x95\x7a\x5b\xd2\x26\x06\xb8\x05\x83\x9d\x99\x5b\x55\x3a\x91\x4f\x91\x5c\xe8\x41\x05\xc5\xdf\x1b\x49\x7e\x62\x75\x7f\x4e\x06\x6b\xc1\x07\x76\x3f\x62\xaa\x82\x9a\xa1\xd1\x19\xb7\x07\x38\x29\xa6\x37\x0c\x4e\xf2\x4b\x30\xbb\x83\x38\x22\x95\x5f\x7f\xd7\x3e\x66\x1c\x5f\x04\xaa\xa3\x40\x95\x99\x36\xc2\xa1\xa7\xe2\x59\x20\xf9\x3c\x78\x57\xcb\x7e\x6a\x51\x3c\x8b\xa9\xb1\xa8\xe8\xa4\x53\x36\x38\xff\xd7\x2c\xfb\x12\xee\x79\x53\x08\xec\xee\x5d\x33\x4c\xba\xad\xf4\x26\x78\x14\x9c\xb9\x06\xb0\x22\x5e\x04\x1c\x7d\x14\x88\xc3\x09\x9a\x88\x00\x9d\xfe\x8a\xa3\xa3\xa9\xce\x1a\x7a\xe7\x07\xbf\xe2\x7b\x2e\xb6\xc2\xc8\xbb\x93\x70\x82\x7a\x59\x55\x1c\x61\xc4\x14\x23\x44\x99\x2f\x94\x81\x3b\x07\xf2\x87\xff\xe4\x8e\xed\x4e\x20\xc5\x69\x14\xe5\xc6\x5a\x9f\xca\x5d\x14\x27\x70\x78\xc0\xf2\xfb\x7d\x11\xd3\x97\x2f\xf9\xbd\x64\x5c\x8f\xbb\x62\x7f\xd7\xa1\xfc\xab\xba\x8e\xa3\x6d\x30\x44\x4e\xd1\x9a\xea\xdb\xcc\x12\xfd\xcd\xc1\x07\xb4\x93\x16\x73\x70\xcf\x6d\xe4\x8f\xe7\x89\x33\xee\xf5\x93\x75\xe3\x95\xae\xb6\xb1\xe3\x5a\xce\x15\x66\xb1\x06\x84\x92\x6e\xb0\x93\xf8\xcb\x0a\x75\x26\x94\xf8\x1e\x9d\x01\x0a\xe4\x92\x8a\xd4\x51\x40\xce\x36\xca\x01\x6e\x2f\x8b\x2e\xfc\xf5\x35\x35\x5b\xb5\x83\x48\x69\x9a\x6e\x7d\x8b\x80\x88\x54\x46\x3e\xb6\x2f\xb9\x8f\xaf\x2a\x15\xc4\x40\xc7\x1a\x51\x3f\xb9\x3e\x77\xa9\xf3\x25\x1a\xc4\xb5\xeb\x62\x03\xb9\xd7\x07\x7d\xfe\x6d\x02\x4d\x82\xe5\x3a\x44\xc8\xa2\x83\x9a\xb4\x29\x66\x16\x89\x57\x09\xcc\xaf\xca\xbd\xf2\xa5\x89\xb1\x88\x33\xe9\xe9\x89\x2a\x93\xdf\x0c\x9e\x0a\x76\x44\x95\x43\x7a\x27\x7b\x48\x83\x22\x74\xae\x20\xda\x64\xc9\xe3\x65\x02\x27\x2f\xd5\x17\x14\xfe\x91\x75\x2f\xdf\xdc\x39\xb2\x5c\xf1\x0d\x29\x41\x70\x65\x12\xe5\xfa\x86\x79\x69\x8a\xe7\xf4\x56\xc3\xc5\xc4\xf0\x99\xc1\xc8\x84\x28\xd6\xb7\xef\x29\x2b\xc7\xc2\xd0\x51\x53\xe8\x66\xdd\x9f\xd1\x91\x45\x6c\x8b\xe1\x41\x49\x8e\x01\xa0\x3f\x50\x30\x7c\x31\xe6\x70\xfb\x00\x61\xda\xee\x42\x07\x5b\x8f\x39\xb1\x9b\x21\x7e\xac\x9b\xdd\x3c\x28\x73\x34\xba\x2e\xba\xb5\x49\x14\xac\xad\x2d\xcd\x64\x96\xec\x
26\xe4\xdc\x3d\x24\xeb\x73\x67\xb1\x7c\x11\xec\x46\xb8\xc1\xac\x73\xd3\x5d\x94\x50\xcd\x8d\x62\x6c\x91\x35\xe6\x80\x86\x00\x6b\x04\xaf\xba\x1a\xbf\x37\x00\x32\xb6\x40\x61\xa9\x7b\xd9\x81\xd3\x4b\x34\x15\x2c\xee\x6c\x0d\xe8\x6b\x5d\x79\x89\xd0\x43\xbd\x59\xf0\xf8\x40\x9d\xc2\xdb\x1e\x8d\x49\xd8\xfa\x1f\x03\x2f\x61\xbf\xb7\x87\xa2\x83\xd0\x4f\x3f\xf1\x86\xed\xc4\x7f\xd5\xc7\x3e\x1e\x74\x49\x2b\xc4\x88\x2a\xfc\x4c\x0d\x49\x92\xa3\x7c\xe4\x77\x99\xb9\xce\x89\x78\xe5\x8b\x34\xf8\xde\xaf\x30\x3a\x15\x1b\xc5\x2b\xcc\x68\xec\x11\x71\x41\xc4\x01\x63\x91\x35\x47\x03\x70\x15\xce\x21\xf5\x50\x82\xba\xe6\x59\xe2\xee\x28\x40\x54\x9e\x0f\xe3\xb3\x18\x3e\x5d\x95\x61\x85\x2b\x92\x57\xc1\x23\x44\x2e\x0a\xf8\xcb\x00\x11\x26\x70\xc8\xe2\x13\x13\x20\x1a\xb4\xf5\xe9\xe2\xc6\x01\x78\x4a\x7a\xb5\xcb\x8d\x74\xa1\x33\x6c\x6f\x61\x09\x73\x20\xd0\x85\x3a\x9f\xad\xdb\x63\x5c\x7b\xb2\xb7\xae\x77\xae\xe6\xa5\x68\x23\x41\x78\xdc\x12\x3f\xd3\xbd\x77\xa5\x30\xb1\x80\xe1\xdd\x7e\xaf\x89\xd5\x0c\xbe\x30\x95\x2a\x34\x44\xaf\x2b\x77\x85\x82\x58\xca\x4a\x00\xd8\x14\xd6\xe1\x0c\x2c\x7f\xdd\x9b\x3d\xd3\x47\xc3\xb1\x5d\xe0\x60\x08\xb0\x87\x1d\x03\xa2\xba\xb4\xfa\xf7\x4b\xe3\x05\x71\xe1\xf6\x55\xd8\xf7\xcc\xd2\x20\xe6\x9b\x21\x8e\x4f\x0a\xc7\x36\xec\x5c\xce\x47\x96\x0b\x10\x2e\xfb\x94\xa2\x91\x52\x20\xc9\x18\x15\x51\x6a\x55\x30\xdc\xed\x5c\x03\xcc\xa0\x3c\x54\x21\xe4\x27\x4a\xa0\x30\x8c\x7d\xc9\xac\x59\x46\xf1\xb0\xcc\xe0\xf3\xbf\x92\xe0\x48\xee\x67\x40\xc8\x2a\x7c\x1b\xe2\xf6\x5c\x1a\xc6\x2d\x6b\xab\x50\x63\x93\xdd\x9e\x49\x74\x43\x4f\xfe\x0c\x9a\x65\xe4\x3c\xd2\xed\x50\x07\x66\xa1\x12\x6e\xb5\x6e\x60\x25\x5c\xf0\x47\x17\xfa\x30\x0e\x80\xe9\xa7\xf2\xeb\x47\x16\xbe\xf7\x93\xf3\x10\x91\xcc\x25\x63\x3c\x2b\x35\xaa\xc3\xd6\x3f\x77\xac\x83\x0d\x2e\xec\xb4\xc4\x37\x50\x9b\x97\xce\xb2\xaa\x22\x11\xcd\x2d\xe0\xcb\xa8\x1e\xad\xd7\x7e\xa3\x9d\x4f\xdf\x9d\x60\x2d\x8b\xb3\xe9\xb0\x41\x36\x4a\xeb\xf3\x0a\x43\x33\xad\x25\xd7\x90\xb8\xda\x83\x95\xda\x02\xed\xb8\x1d\x81\xeb\xbb\x2d\xd1\x35\xeb\x8c\x73\x06\x90\xfc\xb9\xd2\x
8c\x30\x72\x89\xd8\x89\x2c\xa1\x89\xdf\x23\xa9\xb5\x92\x1b\x87\xea\x24\xeb\x7f\x5d\x21\xf8\xf4\xd0\xaa\x16\xfa\x58\xb9\xe1\xff\x7b\x2b\xb2\xe0\x57\x7e\x8e\xe3\x89\x4e\xee\x4a\x47\x5e\x86\x15\xe4\x28\xb6\xe2\x27\x66\x2f\xec\xd3\x1b\x74\x14\x5d\x49\xd9\x0a\x20\x06\x2d\x1f\x3d\x2d\x63\xb9\xd8\x27\x97\x3c\x93\xcc\x5e\x5e\x83\x4b\xdf\x4c\xab\x3c\xe4\x3e\x71\x3c\x29\x58\x8c\xc4\x1d\x47\xc4\xe8\x9a\x8d\x73\x3a\x47\x2e\x80\xe3\xfe\xaa\x0e\x12\x7a\xde\x2b\xa6\x5a\x3b\x0b\x64\xa1\x5d\x52\xfd\x82\xba\xf4\xa3\xfe\x89\x81\x91\x44\x62\x9e\xaf\x94\x7d\x29\xd7\xb8\x21\x7e\x43\x82\x0f\x21\xcc\x11\xcf\xc6\x58\xed\x7b\xc0\x63\x3a\x3b\xe4\xf9\xfa\x5c\x33\x5c\xa5\xcc\x4f\xd4\x2e\x7d\x13\x95\x44\x19\x3b\x9e\x42\x00\xd7\x05\xf6\x28\x82\xb2\xc5\x45\x03\xd7\x72\x2e\xe3\xd4\x39\x1c\x50\x9b\xd2\x0c\xe8\x1c\x21\x9e\x4f\x78\x88\xff\x6b\xe0\xf5\xb6\xf3\x48\x98\x85\xd1\xd6\x13\x0e\x18\x6a\x0d\x25\x0d\xb8\x1c\x80\x6b\x78\xe9\x50\x6d\x67\xf6\xbf\x09\x3a\x57\x87\xa9\x24\x09\x44\x37\x18\x78\x4e\x05\xe2\x2d\x1c\x4e\xfd\x4d\xba\x5d\xda\x07\x91\x57\xb8\x51\x9f\x88\x3a\x3a\x9f\x9e\x49\x61\xdf\xca\x17\x93\xe3\xf1\xf5\x5d\x74\x67\xcd\x0f\xac\x35\xc0\x5b\x86\x69\x20\xf6\xe5\x9c\x91\xc5\x1c\x91\x9d\xb5\xc2\xcb\xf5\xce\x93\xfd\x50\xc0\x56\x95\x34\xdb\xa8\xea\x8d\x2f\x2b\xd5\xd6\xbe\x8b\x9f\x97\xb8\xf2\x4c\xc3\xde\x2c\x90\x26\x50\x74\x0c\x1f\x6d\xa3\x01\x94\xa9\x64\x15\xd3\xd9\x93\x24\x90\x77\x32\xb5\xcf\x54\x26\x48\x0a\x34\x7d\xed\xf2\x78\xea\x54\x85\x40\x67\xe4\xac\x72\x0a\x87\x93\xd2\x76\x38\x4a\xa1\x3f\x03\x78\x17\x28\xe6\x3e\xc7\x34\xe7\xc9\xe9\x0d\xd0\x13\x36\x0b\xfe\xe7\xe0\x45\x5c\x10\x65\x8b\x23\xad\x8a\x2f\x67\xce\x5d\xf0\x67\x3c\x8f\x2c\xc7\x32\x45\xe7\xef\x9d\xeb\x94\x91\x7c\x01\x5c\x8f\xea\x13\x40\x83\x0a\xdc\xd3\x5d\x07\xe1\xbb\x1d\x86\x78\x12\xf8\x3b\xd7\xdb\x00\x9a\x10\x42\xbf\x82\x07\xd8\x18\xc4\x8f\xe6\x55\x77\x89\x6a\x16\x03\x55\x61\x59\xe0\x39\x1b\x37\x18\x93\x15\x96\x67\xb2\x28\xbd\xe5\x6d\x9b\x31\x52\xc7\x60\xe0\x9a\xee\x90\x17\x58\xf4\xaa\x70\x2d\x91\xfc\x42\x3f\x87\x27\x94\xf9\xd8\xaf\x
dc\x2b\x21\xe2\xf7\xc8\x32\x23\x37\x5e\x6c\xdc\xc4\x30\xd8\x0b\xb2\xb7\x7e\xee\xb9\x0e\xff\xf5\xec\xdc\xc3\x9d\xf8\xf3\x95\xf6\xc2\xee\xec\x0e\x11\x1d\x94\x96\x75\xd5\x97\x1d\xab\xcd\x68\xc5\xf1\x66\x3a\x9a\xe1\xa1\xab\x54\x6d\x70\x1b\x64\x39\x13\x8e\x2f\x3b\x20\x70\xcb\xc7\xc1\x82\x46\x51\xce\x7d\x49\xa0\xf0\xd0\x46\x93\x60\x4c\xff\x0a\x36\x19\x28\xf6\xb4\xce\xc5\x21\x4b\xa2\x5b\xca\xe1\x4d\x65\xa9\x6a\x3b\x25\xc7\x39\x73\x28\x0a\xb9\x84\xd1\x89\x5a\x2d\x25\x5b\x90\xd9\x52\xe0\x0e\xb1\x88\xd5\x91\x9a\x91\xbc\xeb\x45\x0e\xe0\xce\xdc\xb9\xc8\x1f\x7a\x25\x54\xf5\xf6\x7e\xe9\xc5\xc9\xee\xb6\x5a\x5b\xeb\xf0\xda\x81\x58\x03\xa0\xc2\x1a\x63\xcf\x2b\xb7\x7c\x08\x78\xb1\x5d\xd1\x99\x3e\x5e\x74\x63\x7e\x67\x5b\xf3\x81\x1b\xdb\x74\x50\xe3\x8b\x81\xcf\xff\x19\x6c\xd3\x42\x31\xf8\x04\xc0\x24\xfd\x3d\xc5\xae\x03\x3d\x4b\x2e\x78\xf6\x60\xa0\x81\x59\xd7\xb3\xce\xfd\x31\xc3\x5e\x9f\x45\xd0\x65\x3c\x63\xa0\xaa\x8f\xa4\x5f\xc5\xac\x74\x0f\xe4\x15\x85\x7b\x54\x21\x05\x7a\xb8\x96\x16\xec\xbd\x09\x0f\x20\x35\x9d\x6d\xa4\x94\xc8\x2b\x6a\xca\xa7\x58\xde\x2b\x09\x74\xc1\x56\xc4\xff\x37\x7a\x2a\x81\x93\xc6\x02\x71\x75\x30\xb1\xb0\xc4\xad\xfe\x20\x2b\x6b\xb5\x9e\x57\x12\xa9\xef\xfa\x6f\xd5\xf5\x41\x8e\x4a\x9b\x13\x3f\xa3\x32\xfb\x17\x71\x16\xcc\x25\x6c\x0b\xc7\x02\xb2\xe1\x1b\x0b\x1c\x95\x74\xb3\xc9\xd6\xdc\xb7\x66\xb1\x06\x51\x42\xc2\xdd\xae\xe7\x96\x85\xf6\xab\xf2\x9d\x73\x4b\xfe\xa4\xa5\x11\x8c\x73\x2d\x12\x83\xe0\xc0\x69\x22\x0f\xed\x4b\xa9\x58\x87\x24\xdd\x24\x26\xb8\x39\xaf\x96\x1c\xa0\xb7\xa8\x6a\xb4\xa3\x66\x7d\x3c\x19\xae\x62\x1f\x36\xd1\x32\x8f\x28\x0e\xc7\x78\x72\x9c\x27\xc3\x00\x79\xaf\x1c\xe9\x3e\x64\x32\xf6\xbb\xe0\x62\xba\x83\xee\x62\xd5\xe1\x97\x22\x9f\x45\x18\x53\x53\x43\x60\x82\xab\xaf\xa1\x11\x57\xca\x1f\x78\x93\x33\xfa\x41\x6a\xff\x63\xa2\xcb\x00\x49\xfb\xd1\xfd\x21\xd6\x21\xb7\x2d\x81\x5d\x4f\xd3\xd8\x4b\x4c\x4d\x34\x4d\xae\x44\x7f\x08\xb7\x48\x11\xbd\x16\x33\xd9\x21\x17\x06\xf8\x56\x6b\xa5\xc3\x10\xd9\xb7\x3d\xbc\xf2\xff\xf9\x17\x00\x01\x29\xbb\xce\x39\xd5\x06\xaf\x78\x
3f\x2f\x43\xf9\x37\x4d\xda\x65\xb2\x45\xc8\x86\x9d\x76\x4f\x8b\xd4\xd4\xeb\xc5\x14\x84\x2f\xdb\x71\x7d\xdb\x1f\x37\xf9\x19\x63\x6f\xce\xd8\xda\x24\x99\x16\x54\xab\x58\xfe\xb2\x3f\xd9\xfd\xf8\xbc\x0b\x5c\x6c\x11\x14\x00\x4e\x1a\x9e\x22\x22\x3c\x76\xde\x38\x5a\x6d\xa5\xfd\x73\x8c\xdc\x84\xcc\xf4\xe9\xb9\xad\x5e\xc0\x44\xe4\x13\xd5\x44\xa4\x83\x85\x4b\x19\x27\x4b\xf0\xcb\x9b\xb7\xa2\xfc\x9f\x8a\xd6\x8c\xfa\x13\x50\x3c\x06\x3d\xb1\xce\x7e\x45\xa7\x0b\x2b\x50\x0c\xdd\x1c\x2e\xe6\x51\x29\x12\xe2\xa6\x27\xb8\xbc\x67\xe2\xf3\x22\x4f\x27\xc8\xdd\xb0\xf8\xb8\x79\x3f\xf1\x71\x78\xd5\x30\x52\x87\xfa\xb3\x26\x73\xd5\x82\x2a\xe2\x49\x82\x2f\x36\xbb\xcc\xbd\x98\x47\x88\xd4\xc6\x88\xe8\xdf\x8a\x47\xf8\xde\xfc\x6a\x3c\xd2\x3f\xb0\xb0\x0c\xf2\x4a\xa5\x66\x89\xfb\x45\x70\x32\x49\x8d\xce\xf3\x74\xbe\xc9\x20\xa2\x21\xca\x24\x42\xfa\x32\x1b\xbd\x74\x93\x39\x66\x22\x09\x6e\x19\xf9\xb2\x51\xa5\x3a\x07\x3d\x75\xf6\xdc\xd9\x4d\x09\x84\xee\x72\xe4\x44\xbe\xc2\xfe\x34\x8d\x8c\x73\xcd\xed\xa3\xdb\x53\x55\x81\x96\x0b\xf0\x6d\x56\xe6\x9b\xf4\x49\xda\x21\xf1\x39\xfc\x20\x25\xad\x40\x8b\x01\x34\x9f\x7c\x78\xdd\x1e\x76\xc9\x80\xc7\xe4\x03\xe7\x7c\xe4\x19\x50\xcc\xd5\x55\x5b\x5c\xe2\xad\x80\x3f\xdd\x73\x6e\xcf\xaa\x9d\x7d\x28\x3d\x84\xb8\x33\xe4\xc6\x08\x18\x63\x7a\x2d\x24\x58\x05\x7b\x1e\xe0\xc8\x9e\xfe\x9d\x63\x79\x7f\xbd\xb6\xd8\x96\x9a\x4d\x5d\xd3\xf5\x47\xba\x3b\x58\xc0\xac\xb0\x9e\xd4\x37\x90\x82\x6a\xc0\xcd\x99\x31\x18\x53\x4f\xaa\x0d\x64\x86\x42\x6b\xca\xa5\x71\x43\x5a\x9e\x1f\x11\xa2\x7a\xdf\xdd\x78\xf3\xb0\x58\xe6\xb8\xae\x4a\x78\x5f\xc0\x93\x23\xef\xd8\x33\x03\x20\xe2\xff\x89\xdb\x04\x32\xe7\x5c\x7d\x4b\x97\x62\x09\x3d\x11\xc9\x79\x24\x20\xb5\x3d\x18\xe2\xbf\x9f\x82\x4c\x3f\xb1\xa0\x26\xe5\x9d\x2e\x49\x15\xa8\x0d\xb2\xf0\x51\x60\x4b\xb8\xee\x79\x14\x6c\xcb\x63\x1e\x57\x9d\xea\xdc\xdd\xb1\x95\x20\x90\xbb\xc5\x54\x50\x39\x86\xcd\x29\x5b\x2e\x32\x35\xfa\xa1\x6b\xaa\x8d\x7e\xb2\x0f\x2c\x58\x2a\x59\x1c\xa4\x50\x6f\x29\x17\x64\xb6\x39\xfe\x6b\xa6\xfa\x1e\x28\x8a\x31\x18\x5f\x35\x77\x53\x58\xb5\x
bf\x1a\x66\x47\xef\x38\x89\x11\x06\x36\x30\x25\x67\xc8\xd5\x86\xee\xc0\xa3\x98\xe7\xf2\xcc\x2d\xa7\xdd\x9f\x02\xe8\x9d\x61\x8d\x27\x36\xeb\x34\x03\x7d\x29\xb1\x1c\xfe\x7b\xeb\x6d\x14\xa2\xfa\xeb\x63\x63\xb6\xed\xd5\x3b\x59\x7e\x75\xad\xf9\xc7\x74\x36\x69\x3c\x00\xd4\x3b\xd6\x11\x4e\x4b\x50\x20\xb6\xc9\x9e\xaa\x88\x9c\xc9\x82\x65\x4f\xd2\x9f\xb8\x8f\xc2\x23\xbf\x56\xe9\xc5\x1d\x24\x3c\x67\x36\x3e\x87\xe2\xe0\x2d\x15\x5e\x40\xbf\x68\x9e\x26\xdc\x50\x33\xc7\x50\x9b\xf1\x76\x2f\x82\x54\xb8\x81\xa0\xbf\x9e\xf7\x6b\x6f\x78\xe1\x75\x28\xc8\x97\xc0\xc4\x3b\xe0\x2c\x18\xd8\x33\xa6\xb3\x42\xbb\x4b\x8d\xdf\xef\x55\xec\x08\x3d\x1a\x15\xf9\xc4\x81\x17\x99\x2e\xbc\x0e\x46\x11\xec\xc0\x2c\xba\x3f\xca\xcd\xbf\x54\xce\x8d\x07\x69\xcf\x80\xad\x9b\x87\x1e\x15\xc4\xcb\x59\x0e\x6c\x5a\x19\xc5\xab\x80\xe3\x58\xd5\xeb\x69\xd0\x62\xc5\x73\xe2\x4e\x76\x5e\x60\x48\x84\x03\xcf\x60\x3d\xe8\x81\x8c\x92\x97\xb6\x4e\xb1\x9d\xbc\x6c\x9a\x7c\xd1\x06\x15\xfe\x62\x15\x57\x1f\x0d\xb4\xf7\xfd\x2f\xc0\x8e\x5d\xf2\x31\x6a\x9d\xc4\x40\x85\xa0\x92\x2c\x48\xbb\xb9\xad\x6c\xeb\xa6\x2e\x90\x1c\x29\x40\xfb\x31\x37\x0e\x09\x08\x64\x22\x0f\x56\xc4\xda\xcd\xa1\x6f\xbc\xb6\xd6\x50\x97\xe1\xcf\x24\x75\x4c\x2b\x42\xbc\xaf\xd7\x61\xe2\x6a\xfe\xc4\x9b\x15\x64\x2c\xea\x8f\x02\x86\x1a\x46\xa4\xbd\x57\x2e\xda\x44\xff\x8b\xe8\x7e\x2c\xc6\xbb\x13\xda\xca\xe9\x35\x91\x78\x88\x10\xe6\x3b\x5c\x99\x6a\x6b\x3e\xfe\x1a\x83\xfc\x48\x92\xad\xec\x81\xb2\xf8\x30\x22\xf4\x30\xa0\xc5\x96\x53\xdd\x96\x8b\x6e\xcb\x2b\xe0\x3a\x47\x00\x40\x72\x26\x22\xb2\xeb\x1b\xbd\x9e\xbc\xad\xbd\xe3\x3e\xb7\xf3\x78\xbe\x84\x1d\x8a\x88\x52\x27\x0b\xd8\x92\xda\x91\x2c\x90\x83\x5c\x65\x90\x63\x2d\xda\x7a\x7f\x62\x2f\x64\xc3\xbe\x71\x48\xed\xbb\xcc\xa8\x9c\xff\x09\xd6\x21\xcb\x0f\xc7\x7c\x7f\x71\x03\xb8\x50\x03\xba\xbb\x2f\xda\x98\x09\x92\xd5\x7c\xfa\xbe\x4b\xd1\x54\x67\xd3\x61\xe4\x0c\x3e\x51\xce\x43\x58\xc1\xcf\x3d\x66\xef\xbc\xd0\x85\x12\x68\xea\x44\x0c\x71\x10\xc7\x15\x21\xc7\x67\x72\x8d\xe8\xa0\x4c\x99\x1a\x00\xdb\x56\x96\x35\xab\xa4\x79\x3e\x2f\x5f\x
dd\xb9\x00\x57\xbf\x3a\x2f\x70\x3f\x6f\x16\x6f\x49\xa2\xec\x45\xf0\x7f\xab\x04\xcb\x56\x03\x03\xd0\x90\x24\xaa\x66\x5f\x0d\x9c\xb7\x6e\xfc\x82\x4b\x06\x0f\xdb\xef\x93\xfa\x31\x12\x2c\xf6\xb7\x14\x5e\x37\x23\xc2\xac\xfb\x3e\xa5\x15\xb1\xb2\x0c\xc0\x50\x08\xff\x95\xbe\xac\xee\x09\xbf\xcd\x8c\x76\x77\x92\xcc\x7f\x47\x94\xd4\xb4\x69\xfc\x5c\xc8\x40\xca\x5f\x76\xc4\x2a\x0f\x64\x90\x9e\x56\xb6\x01\xf1\x07\x59\xd3\xa3\x58\x25\x61\x1d\x14\x6e\x71\x0c\xe3\x26\xb0\xee\x25\x9c\x3f\x22\xe2\x98\xcf\xc3\x5b\xfd\x18\x63\xb5\xd7\x04\xba\xc2\xf8\xc8\x24\xc7\x97\xbe\x92\x1f\x12\x9c\x3a\x6a\xa3\x6f\xfc\x8e\x8c\x8f\x5e\xe7\xc6\x65\x66\x6e\x22\x6f\xf5\xb3\xa1\x7f\x25\xef\x28\xec\x65\xe3\x4b\x24\xad\x61\x6e\x0c\xf1\xd6\x72\x2a\x5d\x04\xfe\xc4\x46\xb4\x00\x9e\x7c\x5c\x2a\xb8\x56\xb0\x67\xaa\xdc\x3f\xf8\xd4\xec\xb3\xc1\x3a\xb5\x57\xd7\xb6\x8b\xe7\x82\xaf\x0d\x71\x0a\x99\xad\xa4\x38\x94\xff\xda\xe9\x65\x01\x74\x0a\x2c\x57\x92\xef\xe3\xa2\x0c\xf8\xbb\x07\x70\x5d\x9c\xe0\xa2\x1b\x29\x74\x0f\x3b\xf7\x65\xd8\xf6\x38\xa3\x3b\xca\xba\x50\x47\x3f\x89\x34\x12\x1b\x85\x28\xff\xb3\x47\xe2\x22\xe8\xe4\xf7\x8f\xa4\x41\x71\x7e\x66\x55\x82\x4d\x7e\xec\x80\x6a\xaa\xbc\x8c\xaa\xa5\xd4\xc0\xdb\x2b\xe6\x8a\x4b\x4f\x49\xeb\x0a\x88\x36\xbe\xde\x6c\x06\xa4\x22\x27\x6c\x34\xbf\xd9\xe9\xd5\x56\x44\x99\xfc\xf2\x50\xc0\xc6\xfd\xad\xb8\x26\x74\x5a\x74\xb1\x47\xd8\xe0\xaf\x0c\xcf\x36\x4a\xe2\xb9\x6c\xac\xe9\x30\xd4\x0d\xf5\x5b\xcf\x6a\x8b\xb0\x06\x22\x65\x1f\x0c\x18\x68\x2f\x94\x24\x99\x88\x13\x18\x32\x7a\xdb\x6f\x09\x71\x08\xd8\x80\x41\xe0\x90\x96\xb1\x0f\xde\xe9\x96\x1b\x4b\x59\xac\x35\x5a\x91\xb5\x8b\x14\x41\xec\xa0\x91\x10\xb1\xa5\x09\x18\xde\xde\x1b\xc5\x2e\x8e\x9d\xda\xb0\xbe\x60\xf9\x5f\x6a\x7e\xbd\x41\x19\x8d\x81\xef\x59\x23\x38\x91\xb2\xfe\x2d\xa1\x07\x87\x07\xe9\x44\xf4\xc5\xc2\x35\xcf\xe7\x93\xe2\x30\x2d\xfe\xcd\x74\x7c\xfc\x5f\x80\xbc\x61\x75\x3e\xcd\x61\xf0\x09\x02\x7f\x9d\x47\xbd\x5e\x6e\xaf\x8d\x8c\xb1\x5f\xbb\x86\x00\x1a\x56\x68\x59\x78\x81\x40\xa6\xfe\xe8\xa6\x65\xc0\xa9\x36\x25\x29\x2d\x56\x6d\x4e\x
e9\x71\xeb\x49\x79\x9e\x2e\x20\x02\x4e\x2a\xd6\x34\x11\xe7\x8f\xa5\x41\x2b\xa1\x32\x04\xfb\x2c\xd8\xe5\xa2\x9e\x73\x6d\x60\xee\xfa\x1a\x89\x73\xc7\xc1\xfc\x93\xe9\xec\x39\xb6\xa8\x00\x00\xd1\x29\xfc\x49\xa2\x4a\x7f\x8d\xde\xaf\xa1\xc5\x49\x5c\xc9\xa2\x45\x35\x78\x39\xca\x48\x10\x8e\xcf\x4f\xe8\xfe\xa3\x83\x73\x19\xc1\x92\x16\x87\xfb\x06\xc5\xba\xbc\x0f\x4f\x29\xda\x09\x74\x4b\x0e\xb3\x7d\x0e\x71\x5f\x55\xe2\xe6\xf9\xf9\x90\xb5\xb0\x2e\x11\x16\x02\x72\x76\xf8\x48\xae\xe2\xb6\x03\xfa\x6f\x3c\x0d\x62\xc6\x6d\x0b\x3e\xbe\xf1\xaa\x8b\xda\x62\x63\xd8\x3e\x33\x85\x86\xbf\xc6\x24\x36\x00\x69\xe7\x26\x5d\x24\x14\x4e\x4a\x6d\x08\x34\x60\x1d\x8c\x61\x7a\xb4\xbc\x75\xbe\x23\xaa\x37\xa7\x8b\xf4\xb8\x71\xaf\x6a\xed\x7c\xde\xf3\x27\xd7\x3f\xe8\x91\x74\xc7\xd2\xea\xa6\xf3\x3d\x2e\x2e\x90\x20\x3a\x79\x79\xab\xfe\xef\x2f\xc0\x6e\x4a\xf4\x16\x6b\xab\x39\x01\x46\xa8\xc0\x3e\x2c\xe9\x43\x92\x46\x22\x66\xcb\x48\x41\x76\x97\xe5\x80\x56\xb2\x67\x87\x51\x77\x5f\x23\xfe\x89\xc9\xb8\xcb\x46\xcc\x8c\xf7\x65\x0f\xe1\x3b\x35\xa0\xdc\x55\x8c\xc7\x9b\x5c\x87\xce\xae\xb5\xfa\x62\xb4\x97\x0d\xb5\x4b\xfb\xa2\x46\x24\x97\xe3\x94\x89\x58\x47\xeb\xff\x56\xc6\xea\x05\x29\xb0\xaa\xb8\x02\xcd\xb9\x5d\x87\x97\x99\x5e\x21\x1a\x55\x9d\x16\xa8\x01\xea\xc2\x96\x8a\xfd\x48\x62\x0b\xeb\xb7\xa0\x18\xbc\xff\xca\x02\x51\xc6\xee\xc0\x88\x15\x9b\x55\x34\xf3\x50\x05\x11\xa4\xd5\x39\xcb\x1b\xe6\x77\xbf\x98\x39\xe5\xdd\xb0\x6f\xd9\x3d\xf5\xba\xcf\x7b\x07\x4c\x75\x87\x1d\xef\x7e\x3b\x2c\x8d\x4b\x62\xf6\xf3\x84\x5c\x5a\x29\xb8\x12\x74\xde\x4d\xb6\x74\xf6\x1f\x29\x05\x31\x03\x4f\x5c\x3c\x2f\x94\xcd\xa7\x22\x88\x04\xb5\xe2\x1a\xf3\x7f\x8b\x87\x67\x95\x28\x47\xe9\x58\xdf\x7c\x5a\xdc\xf8\xd0\x6f\x9e\x5b\xf7\x7d\xb8\xd1\x9d\xcb\x85\x15\xa1\xb9\x29\xeb\x74\xd9\x90\xb9\xc6\x53\xdd\x3c\x16\x54\x2c\xab\x99\x19\xb3\x54\xbd\x33\xc9\xa3\x72\x7f\x83\xf6\x5a\x0f\xb0\x16\xc7\xbc\x23\x0c\x5c\x0f\x86\xf4\x4d\x87\xda\x62\xa6\x8a\x49\xcf\xb2\x77\xb2\xb7\x1e\x35\x28\x1c\x98\x91\x06\xec\xe6\x03\x87\x75\x9e\x7e\x11\x7d\x8b\xbd\x2a\x5c\xe5\xbd\x
f4\x30\x11\x11\x7a\xd5\x71\x09\x67\x02\xfd\x50\x6a\x77\xaf\x98\xe4\x1a\xc9\x93\x03\xce\x2f\x5f\x2b\xf1\x7f\x56\x7b\x63\x87\x35\xa5\x7f\x5a\xbb\x8b\xfe\x2d\x67\x75\x37\x1a\x7b\x0b\x9d\x44\xa9\xbd\xc2\xf5\x60\xa5\xbe\x5f\x82\x80\x7f\x0f\x28\x65\xbf\xb5\x6e\x77\xdf\x4d\x90\xb9\xab\x2f\x33\xab\xf3\xcd\xf8\x51\x3c\xbd\xb7\x4f\x93\xa6\x10\x3a\xa2\x9c\x40\x36\x12\x8f\x67\x6a\x1c\xf4\xe0\x93\x43\x3d\xcd\x6e\xe4\x73\x1f\xda\xad\x41\x27\x08\x27\x7f\x6a\xd4\x29\x6e\xb5\xa4\xa9\xec\x6a\x00\xd1\x70\x95\xe7\x4e\x67\xab\xc1\x7c\xc4\x37\xa9\x29\x62\x80\x77\x95\x8a\x9b\xdb\x01\xfd\x88\xd0\xb9\x44\x99\x47\xc8\x59\xd4\xc9\x11\x8c\x9a\x92\x95\x55\xfa\xd9\xb7\xa4\xd9\xeb\xe4\x48\x06\x7e\xf9\xfc\x47\xe2\x02\x2d\x95\xed\x60\xda\x3d\xfe\x9a\x8b\xf9\x15\x55\x03\xda\x9f\xe6\xbc\x8e\x48\x1e\xb3\x37\x9f\x5a\x62\xdb\x76\x77\xf5\xd7\x8c\xc9\x51\x77\xc9\xc6\x67\xea\xe7\xd9\x34\xec\xf8\xac\x75\xe6\xa9\x47\x9d\x24\xdc\xe9\x80\xac\x32\xc6\x7f\xb3\x41\x66\x18\x1c\x5a\x54\x50\x0d\x98\x42\x64\x93\x9c\x5b\xb9\x95\x38\x6b\xb3\xaa\x16\xf8\x05\x82\xaa\xc4\xcd\xe3\xf7\xf1\xff\xb6\x6e\x4f\x5e\x36\x2e\x12\xd4\xf7\x0a\xb1\x33\x09\xcc\xb4\xdf\x4b\xb8\xaa\x90\x94\xdd\xb0\x0e\xb9\x18\xcc\xc4\xd5\x7d\x1e\x09\x36\xba\x50\x54\xa3\x75\xd9\xc9\x5b\x1b\xb4\x85\x31\xe1\x94\x81\xce\xb1\x18\x83\x07\x0c\x38\x45\x2d\x99\xdd\xe8\x4e\x80\xb3\xdd\x2a\xdc\xcf\x8b\x05\x7f\x1b\x82\x86\x6c\xa7\xd4\x8c\x4e\x20\xdc\x38\x79\x03\xd5\xf5\xc2\x18\x5b\x6f\xb6\xbd\xd9\x7f\xfb\xfd\x87\x91\x2c\x15\x19\x69\xa9\x9b\x80\x8c\xc0\x28\xc4\x93\x11\xe7\x2c\x59\x9b\xde\x21\xb3\xcb\x2e\xaa\xcf\x33\x85\xfa\x9c\x03\xfc\x05\x75\x9f\xaf\x7e\x08\x7b\xb8\x53\x3f\x6c\xb5\x4b\x91\xda\xfc\xe2\x3e\xac\xba\x2a\xab\xaf\x4e\xa8\xd4\x2b\x17\xa1\x6d\x21\x98\x83\x8b\x4b\x1c\x96\xd4\xf0\xbb\x36\xa1\xdf\x8c\x4b\xbe\x4c\x3f\xdd\x83\x8d\xb2\x0a\x7d\xc2\x8d\xa9\x3f\x4c\xdb\x72\x56\xfc\xb7\x4c\xba\x18\x0c\x15\x09\x31\x5c\xc3\xbb\x66\xf0\x75\xfc\x74\x4f\xc4\x7d\x6d\x30\x8a\x8c\xc7\x26\x0a\x33\xf0\x01\x32\x02\xae\x2c\x23\x89\x84\x86\x65\xdc\xae\xaf\xa3\x65\xcc\xc5\x12\x85\x
5b\xd3\xd4\x97\x85\x7f\x2e\x9c\xe2\x83\x02\x45\x2e\x8f\xd5\x5a\x4c\xbf\xf1\xbf\xab\xd2\x39\xf3\x11\x05\x4b\x33\xc9\x20\x2b\xe9\x32\x0b\x86\x74\x3f\x16\xab\x1f\x39\xf8\xa2\xb5\x4c\x26\x18\xdc\x18\xd8\x3f\x24\xe7\xca\x25\x4b\xf4\xcb\xfe\x4f\x65\x85\x0a\xe6\x45\xb5\x59\x67\xd9\xc4\xdc\xac\x65\x02\x0a\x28\xba\xf0\x5d\xc5\xb1\x06\x76\x6e\xcf\xdd\x44\x5f\xbc\x25\xcf\xbe\xe6\xab\x7f\x68\x24\x8d\x88\x71\x69\xfa\xd0\x31\x2a\x17\x78\x8b\x3d\x52\x3b\x2f\x16\xdc\xe6\xe5\xba\xc6\x37\x9b\x78\x9b\x94\xac\x2d\xcd\xe8\x36\xdd\x15\xa3\xd2\xb0\x14\xbc\xe0\x16\xbb\x87\x6c\xbd\x98\x22\xad\x7a\x2c\x76\x1f\x4f\x51\x6b\x8c\x7f\x28\x8b\xb4\xdb\x51\xfd\xd0\xb7\xbb\xb2\xc1\x6b\x3f\xcb\xfb\x8c\x83\x19\xea\x9e\x35\x74\x0d\x23\xb3\xd2\x57\x9b\x16\x97\x9e\xd3\xf6\x44\xc7\x95\x5c\xe3\x45\xe5\xd0\x19\x1a\x0e\x6a\x64\x00\x56\x44\x3c\x6d\x44\xf0\xd4\x16\xc0\x25\xec\x84\x1d\x5f\x0d\xfd\x46\x15\x3a\x81\x12\x3a\xdc\xca\x5c\x0c\x18\x3c\xb0\x9d\xcd\xd0\x96\x40\x85\xbb\x1b\x21\xd4\x2e\x00\x34\xbe\x08\x20\xeb\xca\x6a\x22\xdd\x99\x43\xcf\x57\xd9\x6c\x67\x7b\x29\x98\x60\x5a\xec\xf8\xc9\x90\xf9\x7c\x25\xbd\x0f\xee\x26\x61\x95\x8f\x7a\x2f\xe9\x99\x43\x45\x2c\xf3\xd9\x64\x41\x4e\x69\x0b\x11\x45\xce\x1e\x03\x92\xb2\x12\x4a\xd5\xc0\x8d\xb0\x82\x47\x2f\xb2\xb7\x2d\xbf\xa0\x08\x6f\xfa\xcc\xe7\xbf\x68\xc9\x4e\xec\xd0\xb2\xc0\xd6\x21\x5b\x91\x4b\xbd\x0f\x53\x3f\x45\x4e\xf9\x1c\xeb\x49\xf4\xe3\x27\xe0\xb8\xff\x8f\xba\x5b\x7e\xb9\x30\x3f\xbf\xe0\x3b\x8c\x48\x4e\xd6\xa9\x20\x1f\x5a\xab\xdf\x4e\x23\x21\x0f\x06\x24\xe2\xed\xa4\x92\xe6\x5f\x65\x7d\x51\x47\xb8\x83\x3e\x3b\xba\x25\x1e\x5c\x0d\x9b\x4b\xac\x84\x72\xb0\x26\x2d\x22\x0b\x49\xde\x4a\xad\xf2\xa1\x81\x7c\x22\x8c\x85\xfc\xa9\x41\x7a\xe1\x67\x13\x34\x95\x79\xe6\xf5\x7d\x6a\xff\x64\x74\x16\x92\xf6\x81\xaa\x0a\xd9\x75\x3c\x4d\xda\x19\x6d\xb1\x6e\x7d\x63\x80\xf1\x56\x42\xeb\x5d\xb7\x17\xd4\xe6\xf1\x5c\x08\xa1\xc3\xd8\x1b\x77\xed\x75\xa1\x9f\xeb\x7a\x92\xd6\x58\xd4\x44\x40\xee\x74\x83\x7d\x24\x56\x4f\x27\x3f\xf7\xbb\x37\x53\xaf\x56\x4a\x5f\x4b\x6b\x0a\xde\x3e\x96\x7e\x2c\x
ca\xe1\xaa\x01\x47\x80\x06\xee\x61\xfa\xdb\x57\x72\x21\xfa\xa6\xf8\xa2\x33\x71\xca\x93\x1e\xeb\xa0\x4f\x11\x89\x2e\x82\x56\x5f\xc9\x5c\xaa\x0a\x5a\x7b\x96\x98\xed\x5e\x5e\xb7\x90\xc3\x2d\x8c\xd6\x83\x47\x29\x13\x94\x9f\x10\x09\x06\x00\x48\x92\x7c\x2f\xc6\xbf\x50\x6a\xe0\xdb\x57\xe1\x5b\x52\x43\xa5\x58\xaa\xef\x45\x16\x3a\xaa\x6b\x07\x12\xc1\x5d\x11\xda\x69\xc0\xf6\x3b\xdc\x6d\x14\x22\x9b\xdd\x5d\x20\x6d\x2a\x82\xc9\xde\xdc\xf5\x70\x95\x50\x92\x7d\x79\xd0\x9e\xde\xa6\x88\xb1\x8d\x71\xd9\x26\xd3\x0c\xfe\xd0\xa1\x24\xd9\x71\x7e\x41\x8f\xe4\x25\x93\xc3\x16\xd0\x9d\x25\x08\xea\x89\x64\x3f\x20\x6d\xbc\xbb\x1a\xff\x46\x47\x9e\x25\x29\x26\x6d\x91\x7f\xb7\x5f\xf9\x23\x31\x43\x19\xf9\x46\x0d\x8d\x21\x8f\x18\x30\xc9\x40\xc2\x05\xf2\x43\x8f\xce\xcd\xb5\xbb\x4c\xa7\x18\x96\x35\x97\xd4\x40\x08\xe9\x42\xdc\x76\xa3\x69\xb4\xc5\xa5\xaf\xad\xe3\xf1\x42\xe7\x6d\x95\xde\xbb\x20\x9b\x6e\xe9\x4c\x30\x80\xc6\xeb\xa3\x6b\x45\x62\xde\x4c\x68\x22\x65\x8e\xc3\xa6\x53\xfa\x8f\x93\xf0\xd3\x51\x53\x51\x37\x70\x50\x2a\x38\xc2\xb2\x67\x71\xe2\x3f\xb4\x33\xb9\x52\x52\x59\x36\xc8\xa6\x70\x08\xa5\x61\xdc\x41\x7e\x43\xa1\xf6\x9e\x40\x5d\xa4\x2b\xcc\xe3\x3d\xcd\x52\x94\xeb\x5f\x94\xd4\xee\x7f\xc2\xd0\xcf\x09\xaa\x11\x98\x8b\x89\x50\x13\x73\x07\x08\x91\x14\x09\xdb\x68\xbe\x95\x6d\xce\xd8\xa2\x62\x86\xc6\x8a\xfd\xea\xa6\xcd\xc0\xe1\xe4\x97\x72\xef\x2e\x6c\x0a\xeb\xcc\x39\x65\xdb\x72\x1a\x01\x1b\xc1\x44\xae\x9f\xee\xff\xbc\x92\x2d\x61\x43\x3a\xd2\x7b\xbb\xbc\x74\xa3\x60\x66\xa1\xfe\x9f\xfe\x3f\x4b\x38\xe2\x96\x79\x4f\x41\x14\xca\xe0\xb6\x2f\x97\x48\xc2\x6e\x6a\x39\xce\xb6\x59\x93\x83\x9c\x9f\x1b\xe2\xa2\x0a\xd2\x62\xc6\x13\x11\x3c\x75\x54\x66\xf6\x14\x1a\x3a\x35\xcb\x8a\x81\x5a\x47\xb6\x10\xed\x98\xe2\x23\x6b\xeb\x04\xe8\x41\x14\xbc\x7e\xbc\xa8\xc7\x82\x3a\xca\x69\x62\xcd\xe0\x10\xd4\xf6\xac\x05\x3c\x18\xf0\x75\xe2\x2b\x28\x49\xfb\x0d\xcb\x4a\xe4\xb3\xa9\x76\x2a\xda\x7d\xe4\x84\xa8\x61\x06\x34\x6f\xba\x36\x9b\xa0\x35\x25\x89\x94\xa1\x2e\x52\x57\xbc\x6b\x44\xd9\x58\xcd\x42\xd8\x75\xc0\x73\x73\xe0\xc4\xf3\x
ec\xa8\x98\xa1\xf7\x48\x34\xee\x8e\x7b\xac\x30\x7f\x87\xbe\xe4\x21\x1e\xe0\x96\x45\xcb\xac\x1e\x69\x97\xa4\x6d\x36\x21\x5e\x08\x49\x5b\xf7\x5b\xb7\x3e\x2a\xc8\x03\x74\x37\x87\xd6\x11\x47\x88\x3e\xb9\xa6\x0c\x8a\x6e\xdb\x8f\x97\x80\x5c\x78\xc6\x70\x25\x7f\x45\x14\x8d\xf6\xd7\xbd\x29\x88\x3f\x3d\x28\x80\x2c\xd7\xaf\xa4\x4f\x1b\xa6\xec\x08\xfb\xcb\xfe\x44\xd2\xc5\x0a\x1e\x0c\xa6\x79\x33\x8f\x88\x07\xf7\x95\x81\x5d\x0c\x2f\x5d\x4e\x23\x3e\xb6\x76\x59\xf9\xc1\x3f\x6f\xa1\xb5\x91\x35\xb3\xff\x60\xe8\xa2\x81\xcd\x11\x7c\xf9\xc0\xa9\x2e\xc8\x2f\x02\x02\xd7\x69\xc5\xd0\x7e\x4d\xc2\xdd\x89\x2d\x03\x06\xbe\x36\x3f\x19\x5f\x75\x6e\x2e\xd1\x76\xdc\x4d\x35\xdd\x35\x67\xcb\xce\xf0\x2f\xdf\x73\xaa\x19\xaa\xe7\xae\x29\x7c\x20\xf5\x83\x11\x49\x4a\x77\x59\xa1\x5b\xf9\x61\xe5\x97\xe4\x92\x9a\xa5\xc9\xe4\x85\x33\x3f\x7e\xe5\xfe\xa8\x12\xc5\x0a\x9c\x6e\x77\xa6\x99\x3e\x2a\x47\xd1\x71\x3e\x55\xbf\xaa\xd0\x39\x93\xe9\x46\x1e\x3e\x35\x96\x2f\xdd\x9d\xe8\x70\xdc\xe2\x6d\x12\xa2\x23\x17\xc9\xab\xbe\x94\x28\xac\x01\x5e\xc2\x2b\x7c\xc7\xe1\x51\x8f\x30\xb2\x71\x4c\xc0\x75\xbf\xdb\xe6\x96\x9b\xcf\xf8\xa1\x5c\x22\x82\xe5\x1a\xc9\x56\x26\x30\xb6\x8c\x35\x32\x50\x66\x51\x58\x8b\xef\x9c\x37\x08\xef\x8b\xbe\x8b\x83\x03\xb2\x03\x5b\x45\xc5\x0a\x92\x90\x4d\x23\x29\xc3\xf2\xa3\xed\x12\x03\x52\x20\xb3\xaa\x7c\xe8\xa6\x64\x49\x94\x31\xc5\xfb\x1b\x13\x16\xcb\x4b\xd5\x21\x6d\x42\xba\xe3\xdc\x53\x01\x5a\xdf\xf5\x92\xa0\x3b\x8f\x65\xfb\x34\x6b\x20\xc8\xa9\x63\x3e\x7a\x15\xbe\x81\x11\xaa\xfc\xeb\xa7\x64\xb2\xc8\x7e\xe8\x64\x75\xb8\x3e\xf1\xd6\xe8\x3a\x24\x62\x0a', 2) | 8,535.666667 | 25,547 | 0.750225 | 6,389 | 25,607 | 3.004696 | 0.041478 | 0.006251 | 0.006095 | 0.005001 | 0.002032 | 0.00125 | 0.00125 | 0 | 0 | 0 | 0 | 0.30945 | 0.000391 | 25,607 | 3 | 25,547 | 8,535.666667 | 0.44052 | 0 | 0 | 0 | 0 | 0.333333 | 0.996173 | 0.996173 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 
| 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
79d1c48060164036a9c95011aa19515bdacffd36 | 11,307 | py | Python | saq/network_semaphore/test.py | ace-ecosystem/ACE | d17b5ef4bccf923ec6be5115fabe40f0627dab2d | [
"Apache-2.0"
] | 24 | 2019-09-21T21:09:45.000Z | 2022-03-15T19:48:13.000Z | saq/network_semaphore/test.py | ace-ecosystem/ACE | d17b5ef4bccf923ec6be5115fabe40f0627dab2d | [
"Apache-2.0"
] | 54 | 2019-09-16T20:06:30.000Z | 2021-08-18T22:22:08.000Z | saq/network_semaphore/test.py | ace-ecosystem/ACE | d17b5ef4bccf923ec6be5115fabe40f0627dab2d | [
"Apache-2.0"
] | 9 | 2019-09-08T13:35:55.000Z | 2021-01-03T15:23:37.000Z | # vim: sw=4:ts=4:et:cc=120
import threading
import saq
import saq.network_semaphore
from saq.network_semaphore import (
initialize_fallback_semaphores,
add_undefined_fallback_semaphore,
NetworkSemaphoreServer,
NetworkSemaphoreClient,
LoggingSemaphore
)
from saq.service import *
from saq.test import *
class TestCase(ACEBasicTestCase):
def setUp(self, *args, **kwargs):
super().setUp(*args, **kwargs)
# clear any existing network semaphore definitions
        for key in list(saq.CONFIG['service_network_semaphore'].keys()):
if key.startswith('semaphore_'):
del saq.CONFIG['service_network_semaphore'][key]
#
# test network semaphores
#
def test_add_semaphore(self):
server = NetworkSemaphoreServer()
        self.assertEqual(len(server.undefined_semaphores), 0)
semaphore = server.add_undefined_semaphore('test', 1)
self.assertIsNotNone(semaphore)
self.assertTrue(isinstance(semaphore, LoggingSemaphore))
        self.assertEqual(len(server.undefined_semaphores), 1)
self.assertTrue('test' in server.undefined_semaphores)
def test_service_start_stop(self):
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
def test_acquire_release_undefined_network_semaphore(self):
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
        self.assertEqual(len(service.defined_semaphores), 0)
        self.assertEqual(len(service.undefined_semaphores), 0)
client = NetworkSemaphoreClient()
self.assertTrue(client.acquire('test'))
        self.assertEqual(len(service.undefined_semaphores), 1)
client.release()
self.assertFalse(client.semaphore_acquired)
self.wait_for_condition(lambda: len(service.undefined_semaphores) == 0)
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
self.assertTrue(log_count('acquiring fallback') == 0)
def test_acquire_release_defined_network_semaphore(self):
saq.CONFIG['service_network_semaphore']['semaphore_test'] = '3'
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
        self.assertEqual(len(service.defined_semaphores), 1)
        self.assertEqual(len(service.undefined_semaphores), 0)
client = NetworkSemaphoreClient()
self.assertTrue(client.acquire('test'))
client.release()
self.assertFalse(client.semaphore_acquired)
        self.assertEqual(len(service.defined_semaphores), 1)
        self.assertEqual(len(service.undefined_semaphores), 0)
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
self.assertTrue(log_count('acquiring fallback') == 0)
def test_acquire_block_network_semaphore(self):
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
# client_1 grabs the semaphore
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
# client_2 tries to grab the same semaphore with 0 timeout
client_2 = NetworkSemaphoreClient()
self.assertFalse(client_2.acquire('test', 0))
client_1.release()
self.assertTrue(client_2.acquire('test', 0))
client_2.release()
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
def test_acquire_after_wait_network_semaphore(self):
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
event_1 = threading.Event()
event_2 = threading.Event()
def run_client_1():
# client_1 grabs the semaphore
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
# let client_2 know it can start
event_1.set()
# wait for client_2
event_2.wait()
client_1.release()
def run_client_2():
            # client_2 blocks on acquire until client_1 releases the semaphore
client_2 = NetworkSemaphoreClient()
self.assertTrue(client_2.acquire('test'))
client_2.release()
thread_1 = threading.Thread(target=run_client_1)
thread_1.start()
event_1.wait()
thread_2 = threading.Thread(target=run_client_2)
thread_2.start()
wait_for_log_count('waiting for semaphore', 1)
event_2.set()
thread_2.join()
thread_1.join()
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
def test_multiple_locks_network_semaphore(self):
saq.CONFIG['service_network_semaphore']['semaphore_test'] = '3'
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
client_2 = NetworkSemaphoreClient()
self.assertTrue(client_2.acquire('test'))
client_3 = NetworkSemaphoreClient()
self.assertTrue(client_3.acquire('test'))
client_4 = NetworkSemaphoreClient()
self.assertFalse(client_4.acquire('test', 0))
client_3.release()
self.assertTrue(client_4.acquire('test'))
client_4.release()
client_2.release()
client_1.release()
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
def test_cancel_request_callback(self):
service = NetworkSemaphoreServer()
service.start_service(threaded=True)
self.wait_for_condition(lambda: service.service_status == SERVICE_STATUS_RUNNING)
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
client_2 = NetworkSemaphoreClient(cancel_request_callback=lambda: True)
self.assertFalse(client_2.acquire('test'))
client_1.release()
service.stop_service()
service.wait_service()
        self.assertEqual(service.service_status, SERVICE_STATUS_STOPPED)
#
# test fallback semaphores
#
def test_add_undefined_fallback_semaphore(self):
initialize_fallback_semaphores()
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
semaphore = add_undefined_fallback_semaphore('test', 1)
self.assertIsNotNone(semaphore)
self.assertTrue(isinstance(semaphore, LoggingSemaphore))
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 1)
self.assertTrue('test' in saq.network_semaphore.undefined_fallback_semaphores)
def test_acquire_release_undefined_fallback_semaphore(self):
initialize_fallback_semaphores()
        self.assertEqual(len(saq.network_semaphore.defined_fallback_semaphores), 0)
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
client = NetworkSemaphoreClient()
self.assertTrue(client.acquire('test'))
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 1)
client.release()
self.assertFalse(client.semaphore_acquired)
self.wait_for_condition(lambda: len(saq.network_semaphore.undefined_fallback_semaphores) == 0)
def test_acquire_release_defined_fallback_semaphore(self):
saq.CONFIG['service_network_semaphore']['semaphore_test'] = '3'
initialize_fallback_semaphores()
        self.assertEqual(len(saq.network_semaphore.defined_fallback_semaphores), 1)
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
client = NetworkSemaphoreClient()
self.assertTrue(client.acquire('test'))
client.release()
self.assertFalse(client.semaphore_acquired)
        self.assertEqual(len(saq.network_semaphore.defined_fallback_semaphores), 1)
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
def test_use_fallback_semaphores(self):
        # make sure we can use fallback semaphores when the network semaphore service is unavailable
client = NetworkSemaphoreClient()
self.assertIsNone(client.fallback_semaphore)
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
client.acquire('test')
self.assertIsNotNone(client.fallback_semaphore)
self.assertTrue(client.semaphore_acquired)
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 1)
client.release()
        self.assertEqual(len(saq.network_semaphore.undefined_fallback_semaphores), 0)
self.assertFalse(client.semaphore_acquired)
def test_fallback_semaphore_timeout(self):
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
client_2 = NetworkSemaphoreClient()
self.assertFalse(client_2.acquire('test', timeout=0))
client_1.release()
def test_acquire_after_wait_fallback_semaphore(self):
event_1 = threading.Event()
event_2 = threading.Event()
client_1 = NetworkSemaphoreClient()
client_2 = NetworkSemaphoreClient()
def run_client_1():
# client_1 grabs the semaphore
self.assertTrue(client_1.acquire('test'))
# let client_2 know it can start
event_1.set()
# wait for client_2
event_2.wait()
client_1.release()
def run_client_2():
# client_2 tries to grab the same semaphore with 0 timeout
self.assertFalse(client_2.acquire('test', 0))
event_2.set()
self.assertTrue(client_2.acquire('test'))
client_2.release()
thread_1 = threading.Thread(target=run_client_1)
thread_1.start()
event_1.wait()
thread_2 = threading.Thread(target=run_client_2)
thread_2.start()
thread_2.join()
thread_1.join()
def test_cancel_request_callback_fallback_semaphore(self):
client_1 = NetworkSemaphoreClient()
self.assertTrue(client_1.acquire('test'))
client_2 = NetworkSemaphoreClient(cancel_request_callback=lambda: True)
self.assertFalse(client_2.acquire('test'))
client_1.release()
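The tests above exercise counting-semaphore semantics: up to N concurrent holders, blocking acquires, and non-blocking acquires with a zero timeout. A minimal stdlib sketch of that behavior, using `threading.BoundedSemaphore` in place of the ACE-specific `NetworkSemaphoreClient` (the limit of 3 here mirrors the `semaphore_test = '3'` config in the tests; everything else is illustrative, not part of ACE):

```python
import threading

# A counting semaphore with limit 3, like the 'semaphore_test' = '3' config.
sem = threading.BoundedSemaphore(3)

# The first three non-blocking acquires succeed...
for _ in range(3):
    assert sem.acquire(blocking=False)

# ...and a fourth fails immediately, like client_4.acquire('test', 0).
assert not sem.acquire(blocking=False)

sem.release()                        # one holder releases
assert sem.acquire(blocking=False)   # a new acquire can now succeed

# release the remaining holders so the count returns to 3
for _ in range(3):
    sem.release()
```

`BoundedSemaphore` also raises `ValueError` on an extra `release()`, which is a cheap way to catch unbalanced acquire/release pairs like the ones these tests assert on.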
# basicsr/models/single_frame_sr_compress_model.py (from IanYeung/ReCp, Apache-2.0 / MIT)
import math
import random
import torch
from collections import OrderedDict
from os import path as osp
from tqdm import tqdm
from basicsr.archs import build_network
from basicsr.losses import build_loss
from basicsr.metrics import calculate_metric
from basicsr.utils import get_root_logger, imwrite, tensor2img
from basicsr.utils.registry import MODEL_REGISTRY
from .base_model import BaseModel
@MODEL_REGISTRY.register()
class SingleFrameSRCompressModel(BaseModel):
"""SR+Compress model for compression aware image super-resolution."""
def __init__(self, opt):
super(SingleFrameSRCompressModel, self).__init__(opt)
# define network
self.net_sr = build_network(opt['network_sr'])
self.net_cp = build_network(opt['network_cp'])
self.net_sr = self.model_to_device(self.net_sr)
self.net_cp = self.model_to_device(self.net_cp)
self.print_network(self.net_sr)
self.print_network(self.net_cp)
# load pretrained models
load_path = self.opt['path'].get('pretrain_network_sr', None)
if load_path is not None:
self.load_network(self.net_sr, load_path,
self.opt['path'].get('strict_load_sr', True))
load_path = self.opt['path'].get('pretrain_network_cp', None)
if load_path is not None:
self.load_network(self.net_cp, load_path,
self.opt['path'].get('strict_load_cp', True))
if self.is_train:
self.init_training_settings()
def init_training_settings(self):
self.net_sr.train()
self.net_cp.train()
train_opt = self.opt['train']
# define losses
if train_opt.get('pixel_opt_sr'):
self.cri_pix_sr = build_loss(train_opt['pixel_opt_sr']).to(self.device)
else:
self.cri_pix_sr = None
if train_opt.get('perceptual_opt_sr'):
self.cri_perceptual_sr = build_loss(train_opt['perceptual_opt_sr']).to(self.device)
else:
self.cri_perceptual_sr = None
if self.cri_pix_sr is None and self.cri_perceptual_sr is None:
raise ValueError('Both pixel and perceptual losses for super-resolution network are None.')
if train_opt.get('pixel_opt_cp'):
self.cri_pix_cp = build_loss(train_opt['pixel_opt_cp']).to(self.device)
else:
self.cri_pix_cp = None
if train_opt.get('perceptual_opt_cp'):
self.cri_perceptual_cp = build_loss(train_opt['perceptual_opt_cp']).to(self.device)
else:
self.cri_perceptual_cp = None
if self.cri_pix_cp is None and self.cri_perceptual_cp is None:
raise ValueError('Both pixel and perceptual losses for compression network are None.')
# set up optimizers and schedulers
self.setup_optimizers()
self.setup_schedulers()
def setup_optimizers(self):
train_opt = self.opt['train']
optim_params = []
for k, v in self.net_sr.named_parameters():
if v.requires_grad:
optim_params.append(v)
else:
logger = get_root_logger()
logger.warning(f'Params {k} will not be optimized.')
        # the compression network is frozen here; its parameters are not optimized
        for k, v in self.net_cp.named_parameters():
            v.requires_grad = False
optim_type = train_opt['optim_g'].pop('type')
if optim_type == 'Adam':
self.optimizer_g = torch.optim.Adam(optim_params,
**train_opt['optim_g'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_g)
def feed_data(self, data):
self.lq = data['lq'].to(self.device)
if 'gt' in data:
self.gt = data['gt'].to(self.device)
def optimize_parameters(self, current_iter):
self.optimizer_g.zero_grad()
self.output_sr = self.net_sr(self.lq)
self.output_cp = self.net_cp(self.output_sr)
l_total = 0
loss_dict = OrderedDict()
# pixel loss
if self.cri_pix_sr:
l_pix_sr = self.cri_pix_sr(self.output_sr, self.gt)
l_total += l_pix_sr
loss_dict['l_pix_sr'] = l_pix_sr
if self.cri_pix_cp:
l_pix_cp = self.cri_pix_cp(self.output_cp, self.gt)
l_total += l_pix_cp
loss_dict['l_pix_cp'] = l_pix_cp
# perceptual loss
if self.cri_perceptual_sr:
l_percep_sr, l_style_sr = self.cri_perceptual_sr(self.output_sr, self.gt)
if l_percep_sr is not None:
l_total += l_percep_sr
loss_dict['l_percep_sr'] = l_percep_sr
if l_style_sr is not None:
l_total += l_style_sr
loss_dict['l_style_sr'] = l_style_sr
        if self.cri_perceptual_cp:
            l_percep_cp, l_style_cp = self.cri_perceptual_cp(self.output_cp, self.gt)
            if l_percep_cp is not None:
                l_total += l_percep_cp
                loss_dict['l_percep_cp'] = l_percep_cp
            if l_style_cp is not None:
                l_total += l_style_cp
                loss_dict['l_style_cp'] = l_style_cp
l_total.backward()
self.optimizer_g.step()
self.log_dict = self.reduce_loss_dict(loss_dict)
def test(self):
self.net_sr.eval()
self.net_cp.eval()
with torch.no_grad():
self.output_sr = self.net_sr(self.lq)
self.output_cp = self.net_cp(self.output_sr)
self.net_sr.train()
self.net_cp.train()
def dist_validation(self, dataloader, current_iter, tb_logger, save_img):
logger = get_root_logger()
        logger.info('Only single GPU validation is supported.')
self.nondist_validation(dataloader, current_iter, tb_logger, save_img)
def nondist_validation(self, dataloader, current_iter, tb_logger,
save_img):
dataset_name = dataloader.dataset.opt['name']
with_metrics = self.opt['val'].get('metrics') is not None
if with_metrics:
self.metric_results_sr = {
metric: 0
for metric in self.opt['val']['metrics'].keys()
}
self.metric_results_cp = {
metric: 0
for metric in self.opt['val']['metrics'].keys()
}
pbar = tqdm(total=len(dataloader), unit='image')
for idx, val_data in enumerate(dataloader):
clip_name = val_data['key'][0].replace('/', '_')
img_name = 'im4'
self.feed_data(val_data)
self.test()
visuals = self.get_current_visuals()
sr_img = tensor2img([visuals['result_sr']])
cp_img = tensor2img([visuals['result_cp']])
if 'gt' in visuals:
gt_img = tensor2img([visuals['gt']])
del self.gt
            # free tensors to avoid running out of GPU memory
del self.lq
del self.output_sr
del self.output_cp
torch.cuda.empty_cache()
if save_img:
if self.opt['is_train']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{current_iter:08d}.png')
else:
if self.opt['val']['suffix']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{self.opt["val"]["suffix"]}.png')
else:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{self.opt["name"]}.png')
imwrite(sr_img, save_img_path)
if self.opt['is_train']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{current_iter:08d}.png')
else:
if self.opt['val']['suffix']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{self.opt["val"]["suffix"]}.png')
else:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{self.opt["name"]}.png')
imwrite(cp_img, save_img_path)
if with_metrics:
# calculate metrics
for name, opt_ in self.opt['val']['metrics'].items():
metric_data_sr = dict(img1=sr_img, img2=gt_img)
self.metric_results_sr[name] += calculate_metric(metric_data_sr, opt_)
for name, opt_ in self.opt['val']['metrics'].items():
metric_data_cp = dict(img1=cp_img, img2=gt_img)
self.metric_results_cp[name] += calculate_metric(metric_data_cp, opt_)
pbar.update(1)
pbar.set_description(f'Test {clip_name}')
pbar.close()
if with_metrics:
for metric in self.metric_results_sr.keys():
self.metric_results_sr[metric] /= (idx + 1)
for metric in self.metric_results_cp.keys():
self.metric_results_cp[metric] /= (idx + 1)
self._log_validation_metric_values(current_iter, dataset_name, tb_logger)
def _log_validation_metric_values(self, current_iter, dataset_name,
tb_logger):
log_str = f'Validation {dataset_name} SR\n'
for metric, value in self.metric_results_sr.items():
log_str += f'\t # {metric}: {value:.4f}\n'
logger = get_root_logger()
logger.info(log_str)
log_str = f'Validation {dataset_name} CP\n'
for metric, value in self.metric_results_cp.items():
log_str += f'\t # {metric}: {value:.4f}\n'
logger = get_root_logger()
logger.info(log_str)
if tb_logger:
for metric, value in self.metric_results_sr.items():
tb_logger.add_scalar(f'metrics_sr/{metric}', value, current_iter)
for metric, value in self.metric_results_cp.items():
tb_logger.add_scalar(f'metrics_cp/{metric}', value, current_iter)
def get_current_visuals(self):
out_dict = OrderedDict()
out_dict['lq'] = self.lq.detach().cpu()
out_dict['result_sr'] = self.output_sr.detach().cpu()
out_dict['result_cp'] = self.output_cp.detach().cpu()
if hasattr(self, 'gt'):
out_dict['gt'] = self.gt.detach().cpu()
return out_dict
def save(self, epoch, current_iter):
self.save_network(self.net_sr, 'net_sr', current_iter)
self.save_network(self.net_cp, 'net_cp', current_iter)
self.save_training_state(epoch, current_iter)
@MODEL_REGISTRY.register()
class SingleFrameSRMultiQPModel(BaseModel):
    """SR+Compress model with multiple QP levels for compression aware image super-resolution."""
def __init__(self, opt):
super(SingleFrameSRMultiQPModel, self).__init__(opt)
# define network
self.net_sr = build_network(opt['network_sr'])
self.net_cp = build_network(opt['network_cp'])
self.net_sr = self.model_to_device(self.net_sr)
self.net_cp = self.model_to_device(self.net_cp)
self.print_network(self.net_sr)
self.print_network(self.net_cp)
# load pretrained models
load_path = self.opt['path'].get('pretrain_network_sr', None)
if load_path is not None:
self.load_network(self.net_sr, load_path,
self.opt['path'].get('strict_load_sr', True))
load_path = self.opt['path'].get('pretrain_network_cp', None)
if load_path is not None:
self.load_network(self.net_cp, load_path,
self.opt['path'].get('strict_load_cp', True))
if self.is_train:
self.init_training_settings()
def init_training_settings(self):
self.net_sr.train()
self.net_cp.train()
train_opt = self.opt['train']
# define losses
if train_opt.get('pixel_opt_sr'):
self.cri_pix_sr = build_loss(train_opt['pixel_opt_sr']).to(self.device)
else:
self.cri_pix_sr = None
if train_opt.get('perceptual_opt_sr'):
self.cri_perceptual_sr = build_loss(train_opt['perceptual_opt_sr']).to(self.device)
else:
self.cri_perceptual_sr = None
# if self.cri_pix_sr is None and self.cri_perceptual_sr is None:
# raise ValueError('Both pixel and perceptual losses for super-resolution network are None.')
if train_opt.get('pixel_opt_cp'):
self.cri_pix_cp = build_loss(train_opt['pixel_opt_cp']).to(self.device)
else:
self.cri_pix_cp = None
if train_opt.get('perceptual_opt_cp'):
self.cri_perceptual_cp = build_loss(train_opt['perceptual_opt_cp']).to(self.device)
else:
self.cri_perceptual_cp = None
# if self.cri_pix_cp is None and self.cri_perceptual_cp is None:
# raise ValueError('Both pixel and perceptual losses for compression network are None.')
if train_opt.get('rate_opt'):
self.cri_bpp = build_loss(train_opt['rate_opt']).to(self.device)
else:
self.cri_bpp = None
# set up optimizers and schedulers
self.setup_optimizers()
self.setup_schedulers()
def setup_optimizers(self):
train_opt = self.opt['train']
# optimizer g
optim_params = []
for k, v in self.net_sr.named_parameters():
if v.requires_grad:
optim_params.append(v)
else:
logger = get_root_logger()
logger.warning(f'Params {k} will not be optimized.')
for k, v in self.net_cp.named_parameters():
if not k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_g'].pop('type')
if optim_type == 'Adam':
self.optimizer_g = torch.optim.Adam(optim_params,
**train_opt['optim_g'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_g)
# optimizer c
optim_params = []
for k, v in self.net_cp.named_parameters():
if not k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_c'].pop('type')
if optim_type == 'Adam':
self.optimizer_c = torch.optim.Adam(optim_params,
**train_opt['optim_c'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_c)
# optimizer a
optim_params = []
for k, v in self.net_cp.named_parameters():
if k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_a'].pop('type')
if optim_type == 'Adam':
self.optimizer_a = torch.optim.Adam(optim_params,
**train_opt['optim_a'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_a)
def feed_data(self, data):
self.lq = data['lq'].to(self.device)
if 'gt' in data:
self.gt = data['gt'].to(self.device)
def optimize_parameters(self, current_iter):
self.optimizer_g.zero_grad()
self.optimizer_c.zero_grad()
self.optimizer_a.zero_grad()
self.output_sr = self.net_sr(self.lq)
self.output_cp, self.likehihoods = self.net_cp(self.output_sr, qp=random.randint(15, 30), training=True)
l_total = 0
loss_dict = OrderedDict()
# pixel loss
if self.cri_pix_sr:
l_pix_sr = self.cri_pix_sr(self.output_sr, self.gt)
l_total += l_pix_sr
loss_dict['l_pix_sr'] = l_pix_sr
if self.cri_pix_cp:
l_pix_cp = self.cri_pix_cp(self.output_cp, self.gt)
l_total += l_pix_cp
loss_dict['l_pix_cp'] = l_pix_cp
# perceptual loss
if self.cri_perceptual_sr:
l_percep_sr, l_style_sr = self.cri_perceptual_sr(self.output_sr, self.gt)
if l_percep_sr is not None:
l_total += l_percep_sr
loss_dict['l_percep_sr'] = l_percep_sr
if l_style_sr is not None:
l_total += l_style_sr
loss_dict['l_style_sr'] = l_style_sr
        if self.cri_perceptual_cp:
            l_percep_cp, l_style_cp = self.cri_perceptual_cp(self.output_cp, self.gt)
            if l_percep_cp is not None:
                l_total += l_percep_cp
                loss_dict['l_percep_cp'] = l_percep_cp
            if l_style_cp is not None:
                l_total += l_style_cp
                loss_dict['l_style_cp'] = l_style_cp
# rate loss
if self.cri_bpp:
N, _, H, W = self.output_sr.shape
num_pixels = N * H * W
l_bpp = 0
for i in range(len(self.likehihoods)):
l_bpp += self.cri_bpp(self.likehihoods[i], num_pixels)
# l_bpp = self.cri_bpp(self.likehihoods, num_pixels)
loss_dict['l_bpp'] = l_bpp
l_total += l_bpp
l_total.backward()
self.optimizer_g.step()
self.optimizer_c.step()
aux_loss = self.get_bare_model(self.net_cp).aux_loss()
loss_dict['l_aux'] = aux_loss
aux_loss.backward()
self.optimizer_a.step()
self.log_dict = self.reduce_loss_dict(loss_dict)
def test(self):
self.net_sr.eval()
self.net_cp.eval()
with torch.no_grad():
self.output_sr = self.net_sr(self.lq)
self.output_cp, _ = self.net_cp(self.output_sr, qp=20, training=False)
self.net_sr.train()
self.net_cp.train()
def dist_validation(self, dataloader, current_iter, tb_logger, save_img):
logger = get_root_logger()
        logger.info('Only single GPU validation is supported.')
self.nondist_validation(dataloader, current_iter, tb_logger, save_img)
def nondist_validation(self, dataloader, current_iter, tb_logger,
save_img):
dataset_name = dataloader.dataset.opt['name']
with_metrics = self.opt['val'].get('metrics') is not None
if with_metrics:
self.metric_results_sr = {
metric: 0
for metric in self.opt['val']['metrics'].keys()
}
self.metric_results_cp = {
metric: 0
for metric in self.opt['val']['metrics'].keys()
}
pbar = tqdm(total=len(dataloader), unit='image')
for idx, val_data in enumerate(dataloader):
clip_name = val_data['key'][0].replace('/', '_')
img_name = 'im4'
self.feed_data(val_data)
self.test()
visuals = self.get_current_visuals()
sr_img = tensor2img([visuals['result_sr']])
cp_img = tensor2img([visuals['result_cp']])
if 'gt' in visuals:
gt_img = tensor2img([visuals['gt']])
del self.gt
            # free tensors to avoid running out of GPU memory
del self.lq
del self.output_sr
del self.output_cp
torch.cuda.empty_cache()
if save_img:
if self.opt['is_train']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{current_iter:08d}.png')
else:
if self.opt['val']['suffix']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{self.opt["val"]["suffix"]}.png')
else:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'sr_{img_name}_{self.opt["name"]}.png')
imwrite(sr_img, save_img_path)
if self.opt['is_train']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{current_iter:08d}.png')
else:
if self.opt['val']['suffix']:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{self.opt["val"]["suffix"]}.png')
else:
save_img_path = osp.join(
self.opt['path']['visualization'], dataset_name, clip_name,
f'cp_{img_name}_{self.opt["name"]}.png')
imwrite(cp_img, save_img_path)
if with_metrics:
# calculate metrics
for name, opt_ in self.opt['val']['metrics'].items():
metric_data_sr = dict(img1=sr_img, img2=gt_img)
self.metric_results_sr[name] += calculate_metric(metric_data_sr, opt_)
for name, opt_ in self.opt['val']['metrics'].items():
metric_data_cp = dict(img1=cp_img, img2=gt_img)
self.metric_results_cp[name] += calculate_metric(metric_data_cp, opt_)
pbar.update(1)
pbar.set_description(f'Test {clip_name}')
pbar.close()
if with_metrics:
for metric in self.metric_results_sr.keys():
self.metric_results_sr[metric] /= (idx + 1)
for metric in self.metric_results_cp.keys():
self.metric_results_cp[metric] /= (idx + 1)
self._log_validation_metric_values(current_iter, dataset_name, tb_logger)
def _log_validation_metric_values(self, current_iter, dataset_name,
tb_logger):
log_str = f'Validation {dataset_name} SR\n'
for metric, value in self.metric_results_sr.items():
log_str += f'\t # {metric}: {value:.4f}\n'
logger = get_root_logger()
logger.info(log_str)
log_str = f'Validation {dataset_name} CP\n'
for metric, value in self.metric_results_cp.items():
log_str += f'\t # {metric}: {value:.4f}\n'
logger = get_root_logger()
logger.info(log_str)
if tb_logger:
for metric, value in self.metric_results_sr.items():
tb_logger.add_scalar(f'metrics_sr/{metric}', value, current_iter)
for metric, value in self.metric_results_cp.items():
tb_logger.add_scalar(f'metrics_cp/{metric}', value, current_iter)
def get_current_visuals(self):
out_dict = OrderedDict()
out_dict['lq'] = self.lq.detach().cpu()
out_dict['result_sr'] = self.output_sr.detach().cpu()
out_dict['result_cp'] = self.output_cp.detach().cpu()
if hasattr(self, 'gt'):
out_dict['gt'] = self.gt.detach().cpu()
return out_dict
def save(self, epoch, current_iter):
self.save_network(self.net_sr, 'net_sr', current_iter)
self.save_network(self.net_cp, 'net_cp', current_iter)
self.save_training_state(epoch, current_iter)
@MODEL_REGISTRY.register()
class DoubleFrameSRMultiQPModel(BaseModel):
    """Two-frame SR+Compress model with multiple QP levels for compression aware image super-resolution."""
def __init__(self, opt):
super(DoubleFrameSRMultiQPModel, self).__init__(opt)
# define network
self.net_sr = build_network(opt['network_sr'])
self.net_cp = build_network(opt['network_cp'])
self.net_sr = self.model_to_device(self.net_sr)
self.net_cp = self.model_to_device(self.net_cp)
self.print_network(self.net_sr)
self.print_network(self.net_cp)
# load pretrained models
load_path = self.opt['path'].get('pretrain_network_sr', None)
if load_path is not None:
self.load_network(self.net_sr, load_path,
self.opt['path'].get('strict_load_sr', True))
load_path = self.opt['path'].get('pretrain_network_cp', None)
if load_path is not None:
self.load_network(self.net_cp, load_path,
self.opt['path'].get('strict_load_cp', True))
if self.is_train:
self.init_training_settings()
def init_training_settings(self):
self.net_sr.train()
self.net_cp.train()
train_opt = self.opt['train']
# define losses
if train_opt.get('pixel_opt_sr'):
self.cri_pix_sr = build_loss(train_opt['pixel_opt_sr']).to(self.device)
else:
self.cri_pix_sr = None
if train_opt.get('perceptual_opt_sr'):
self.cri_perceptual_sr = build_loss(train_opt['perceptual_opt_sr']).to(self.device)
else:
self.cri_perceptual_sr = None
# if self.cri_pix_sr is None and self.cri_perceptual_sr is None:
# raise ValueError('Both pixel and perceptual losses for super-resolution network are None.')
if train_opt.get('pixel_opt_cp'):
self.cri_pix_cp = build_loss(train_opt['pixel_opt_cp']).to(self.device)
else:
self.cri_pix_cp = None
if train_opt.get('perceptual_opt_cp'):
self.cri_perceptual_cp = build_loss(train_opt['perceptual_opt_cp']).to(self.device)
else:
self.cri_perceptual_cp = None
# if self.cri_pix_cp is None and self.cri_perceptual_cp is None:
# raise ValueError('Both pixel and perceptual losses for compression network are None.')
if train_opt.get('rate_opt'):
self.cri_bpp = build_loss(train_opt['rate_opt']).to(self.device)
else:
self.cri_bpp = None
# set up optimizers and schedulers
self.setup_optimizers()
self.setup_schedulers()
def setup_optimizers(self):
train_opt = self.opt['train']
# optimizer g
optim_params = []
for k, v in self.net_sr.named_parameters():
if v.requires_grad:
optim_params.append(v)
else:
logger = get_root_logger()
logger.warning(f'Params {k} will not be optimized.')
for k, v in self.net_cp.named_parameters():
if not k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_g'].pop('type')
if optim_type == 'Adam':
self.optimizer_g = torch.optim.Adam(optim_params,
**train_opt['optim_g'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_g)
# optimizer c
optim_params = []
for k, v in self.net_cp.named_parameters():
if not k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_c'].pop('type')
if optim_type == 'Adam':
self.optimizer_c = torch.optim.Adam(optim_params,
**train_opt['optim_c'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_c)
# optimizer a
optim_params = []
for k, v in self.net_cp.named_parameters():
if k.endswith(".quantiles"):
optim_params.append(v)
optim_type = train_opt['optim_a'].pop('type')
if optim_type == 'Adam':
self.optimizer_a = torch.optim.Adam(optim_params,
**train_opt['optim_a'])
else:
raise NotImplementedError(
                f'optimizer {optim_type} is not supported yet.')
self.optimizers.append(self.optimizer_a)
def feed_data(self, data):
# [B, 2, C, H, W]
self.lq = data['lq'].to(self.device)
# [B, 2, C, H, W]
if 'gt' in data:
self.gt = data['gt'].to(self.device)
    def optimize_parameters(self, current_iter):
        self.optimizer_g.zero_grad()
        self.optimizer_c.zero_grad()
        self.optimizer_a.zero_grad()

        self.prev_frame_lq, self.curr_frame_lq = self.lq[:, 0, :, :, :], self.lq[:, 1, :, :, :]
        self.prev_frame_gt, self.curr_frame_gt = self.gt[:, 0, :, :, :], self.gt[:, 1, :, :, :]
        self.prev_frame_sr = self.net_sr(self.prev_frame_lq)
        self.curr_frame_sr = self.net_sr(self.curr_frame_lq)

        # every 100th iteration trains an intra frame; otherwise inter coding
        mode = 'intra' if current_iter % 100 == 0 else 'inter'
        min_qp = self.opt['train']['min_qp']
        max_qp = self.opt['train']['max_qp']
        qp = random.randint(min_qp, max_qp)
        self.curr_frame_cp, self.flow, self.likelihoods = self.net_cp(
            self.curr_frame_sr, self.prev_frame_sr,
            qp=qp, training=True, mode=mode, beta=100, debug=False)

        l_total = 0
        loss_dict = OrderedDict()
        # pixel loss
        if self.cri_pix_sr:
            l_pix_sr = self.cri_pix_sr(self.curr_frame_sr, self.curr_frame_gt)
            l_total += l_pix_sr
            loss_dict['l_pix_sr'] = l_pix_sr
        if self.cri_pix_cp:
            l_pix_cp = self.cri_pix_cp(self.curr_frame_cp, self.curr_frame_gt)
            l_total += l_pix_cp
            loss_dict['l_pix_cp'] = l_pix_cp
        # perceptual loss
        if self.cri_perceptual_sr:
            l_percep_sr, l_style_sr = self.cri_perceptual_sr(self.curr_frame_sr, self.curr_frame_gt)
            if l_percep_sr is not None:
                l_total += l_percep_sr
                loss_dict['l_percep_sr'] = l_percep_sr
            if l_style_sr is not None:
                l_total += l_style_sr
                loss_dict['l_style_sr'] = l_style_sr
        if self.cri_perceptual_cp:
            l_percep_cp, l_style_cp = self.cri_perceptual_cp(self.curr_frame_cp, self.curr_frame_gt)
            if l_percep_cp is not None:
                l_total += l_percep_cp
                loss_dict['l_percep_cp'] = l_percep_cp
            if l_style_cp is not None:
                l_total += l_style_cp
                loss_dict['l_style_cp'] = l_style_cp
        # rate loss
        if self.cri_bpp:
            N, _, H, W = self.curr_frame_sr.shape
            num_pixels = N * H * W
            l_bpp = 0
            for likelihood in self.likelihoods:
                l_bpp += self.cri_bpp(likelihood, num_pixels)
            loss_dict['l_bpp'] = l_bpp
            l_total += l_bpp

        l_total.backward()
        self.optimizer_g.step()
        self.optimizer_c.step()

        # auxiliary loss updates the entropy-bottleneck quantiles only
        aux_loss = self.get_bare_model(self.net_cp).aux_loss()
        loss_dict['l_aux'] = aux_loss
        aux_loss.backward()
        self.optimizer_a.step()

        self.log_dict = self.reduce_loss_dict(loss_dict)
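The rate term above sums `cri_bpp` over every likelihood tensor and normalizes by pixel count. The concrete loss is whatever `rate_opt` configures; a common definition is bits per pixel, i.e. the total negative log2-likelihood divided by the number of pixels. A scalar sketch of that definition — an assumption for illustration, not the project's actual `cri_bpp`:

```python
import math

def bpp_loss(likelihoods, num_pixels):
    """Bits-per-pixel: total -log2(likelihood) divided by the pixel count."""
    return sum(-math.log2(p) for p in likelihoods) / num_pixels

# Two symbols, each with probability 0.5, over a single pixel -> 2 bits per pixel.
print(bpp_loss([0.5, 0.5], 1))  # 2.0
```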
    def test(self):
        self.net_sr.eval()
        self.net_cp.eval()
        with torch.no_grad():
            self.prev_frame_lq, self.curr_frame_lq = self.lq[:, 0, :, :, :], self.lq[:, 1, :, :, :]
            self.prev_frame_gt, self.curr_frame_gt = self.gt[:, 0, :, :, :], self.gt[:, 1, :, :, :]
            self.prev_frame_sr = self.net_sr(self.prev_frame_lq)
            self.curr_frame_sr = self.net_sr(self.curr_frame_lq)
            self.curr_frame_cp, _, _ = \
                self.get_bare_model(self.net_cp)(self.curr_frame_sr, self.prev_frame_sr, qp=0, training=False)
        self.net_sr.train()
        self.net_cp.train()
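`test()` above manually flips both networks to eval mode and restores train mode afterwards. The same guard can be expressed as a context manager so the restore happens even if inference raises; the `Net` stub below stands in for `torch.nn.Module` and is illustrative only:

```python
from contextlib import contextmanager

class Net:
    """Minimal stand-in for a torch Module's train/eval switch."""
    def __init__(self):
        self.training = True
    def eval(self):
        self.training = False
    def train(self):
        self.training = True

@contextmanager
def eval_mode(*nets):
    """Switch all nets to eval mode for the duration of the block."""
    for n in nets:
        n.eval()
    try:
        yield
    finally:
        for n in nets:
            n.train()

a, b = Net(), Net()
with eval_mode(a, b):
    print(a.training, b.training)  # False False
print(a.training, b.training)      # True True
```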
    def dist_validation(self, dataloader, current_iter, tb_logger, save_img):
        logger = get_root_logger()
        logger.info('Only single-GPU validation is supported.')
        self.nondist_validation(dataloader, current_iter, tb_logger, save_img)
    def nondist_validation(self, dataloader, current_iter, tb_logger, save_img):
        dataset_name = dataloader.dataset.opt['name']
        with_metrics = self.opt['val'].get('metrics') is not None
        if with_metrics:
            self.metric_results_sr = {
                metric: 0
                for metric in self.opt['val']['metrics'].keys()
            }
            self.metric_results_cp = {
                metric: 0
                for metric in self.opt['val']['metrics'].keys()
            }
        pbar = tqdm(total=len(dataloader), unit='image')

        for idx, val_data in enumerate(dataloader):
            clip_name = val_data['key'][0].replace('/', '_')
            img_name = 'im4'
            self.feed_data(val_data)
            self.test()

            visuals = self.get_current_visuals()
            sr_img = tensor2img([visuals['curr_sr']])
            cp_img = tensor2img([visuals['curr_cp']])
            if 'gt' in visuals:
                gt_img = tensor2img([visuals['curr_gt']])
                del self.gt

            # tentative for out of GPU memory
            del self.lq
            del self.curr_frame_sr
            del self.curr_frame_cp
            torch.cuda.empty_cache()

            if save_img:
                if self.opt['is_train']:
                    save_img_path = osp.join(
                        self.opt['path']['visualization'], dataset_name, clip_name,
                        f'sr_{img_name}_{current_iter:08d}.png')
                else:
                    if self.opt['val']['suffix']:
                        save_img_path = osp.join(
                            self.opt['path']['visualization'], dataset_name, clip_name,
                            f'sr_{img_name}_{self.opt["val"]["suffix"]}.png')
                    else:
                        save_img_path = osp.join(
                            self.opt['path']['visualization'], dataset_name, clip_name,
                            f'sr_{img_name}_{self.opt["name"]}.png')
                imwrite(sr_img, save_img_path)

                if self.opt['is_train']:
                    save_img_path = osp.join(
                        self.opt['path']['visualization'], dataset_name, clip_name,
                        f'cp_{img_name}_{current_iter:08d}.png')
                else:
                    if self.opt['val']['suffix']:
                        save_img_path = osp.join(
                            self.opt['path']['visualization'], dataset_name, clip_name,
                            f'cp_{img_name}_{self.opt["val"]["suffix"]}.png')
                    else:
                        save_img_path = osp.join(
                            self.opt['path']['visualization'], dataset_name, clip_name,
                            f'cp_{img_name}_{self.opt["name"]}.png')
                imwrite(cp_img, save_img_path)

            if with_metrics:
                # calculate metrics for both the SR output and the compressed output
                for name, opt_ in self.opt['val']['metrics'].items():
                    metric_data_sr = dict(img1=sr_img, img2=gt_img)
                    self.metric_results_sr[name] += calculate_metric(metric_data_sr, opt_)
                for name, opt_ in self.opt['val']['metrics'].items():
                    metric_data_cp = dict(img1=cp_img, img2=gt_img)
                    self.metric_results_cp[name] += calculate_metric(metric_data_cp, opt_)
            pbar.update(1)
            pbar.set_description(f'Test {clip_name}')
        pbar.close()

        if with_metrics:
            for metric in self.metric_results_sr.keys():
                self.metric_results_sr[metric] /= (idx + 1)
            for metric in self.metric_results_cp.keys():
                self.metric_results_cp[metric] /= (idx + 1)
            self._log_validation_metric_values(current_iter, dataset_name, tb_logger)
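`nondist_validation` accumulates each metric into a running sum over the loop and divides by `idx + 1` at the end. The same accumulate-then-average logic in isolation, with hypothetical metric names:

```python
def average_metrics(per_image_scores):
    """Average a list of {metric: value} dicts, mirroring the running sums above."""
    totals = {}
    for scores in per_image_scores:
        for name, value in scores.items():
            totals[name] = totals.get(name, 0.0) + value
    n = len(per_image_scores)
    return {name: total / n for name, total in totals.items()}

print(average_metrics([{"psnr": 30.0}, {"psnr": 32.0}]))  # {'psnr': 31.0}
```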
    def _log_validation_metric_values(self, current_iter, dataset_name, tb_logger):
        logger = get_root_logger()
        log_str = f'Validation {dataset_name} SR\n'
        for metric, value in self.metric_results_sr.items():
            log_str += f'\t # {metric}: {value:.4f}\n'
        logger.info(log_str)

        log_str = f'Validation {dataset_name} CP\n'
        for metric, value in self.metric_results_cp.items():
            log_str += f'\t # {metric}: {value:.4f}\n'
        logger.info(log_str)

        if tb_logger:
            for metric, value in self.metric_results_sr.items():
                tb_logger.add_scalar(f'metrics_sr/{metric}', value, current_iter)
            for metric, value in self.metric_results_cp.items():
                tb_logger.add_scalar(f'metrics_cp/{metric}', value, current_iter)
    def get_current_visuals(self):
        out_dict = OrderedDict()
        out_dict['lq'] = self.lq.detach().cpu()
        out_dict['curr_sr'] = self.curr_frame_sr.detach().cpu()
        out_dict['curr_cp'] = self.curr_frame_cp.detach().cpu()
        if hasattr(self, 'gt'):
            out_dict['gt'] = self.gt.detach().cpu()
            out_dict['curr_gt'] = self.curr_frame_gt.detach().cpu()
        return out_dict
    def save(self, epoch, current_iter):
        self.save_network(self.net_sr, 'net_sr', current_iter)
        self.save_network(self.net_cp, 'net_cp', current_iter)
        self.save_training_state(epoch, current_iter)
# ----------------------------------------------------------------------
# File: Protheus_WebApp/Modules/SIGAPLS/PLSA475TESTCASE.py
# Repo: 98llm/tir-script-samples (MIT license)
# ----------------------------------------------------------------------
from tir import Webapp
import unittest
from tir.technologies.apw_internal import ApwInternal
import datetime
import time

DateSystem = datetime.datetime.today().strftime('%d/%m/%Y')
DateVal = datetime.datetime(2120, 5, 17)

"""-------------------------------------------------------------------
/*/{Protheus.doc} PLSA475TestCase
TIR - Test cases for the batch phase change / phase return routine
@author Silvia SantAnna
@since 11/2020
@version 12
-------------------------------------------------------------------"""
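`DateSystem` above is today's date rendered in the dd/mm/YYYY format the Protheus login expects, and `DateVal` is a far-future validity date. The same formatting applied to a fixed date (the example date is arbitrary):

```python
import datetime

stamp = datetime.datetime(2020, 11, 5).strftime('%d/%m/%Y')
print(stamp)  # 05/11/2020
```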
class PLSA475(unittest.TestCase):

    @classmethod
    def setUpClass(inst):
        inst.oHelper = Webapp()
        inst.oHelper.Setup("SIGAPLS", DateSystem, "T1", "M SP 01", "33")

    def _run_batch_process(self, peg, tipo, rep_regras, somente_zero):
        """Fill in the PLS475 parameter screen and run the batch processing.

        All six test cases share the same parameter screen; only the PEG
        number, the processing type ("Tipo ?") and two yes/no options differ.
        """
        self.oHelper.Program("PLSA475")
        # Screen - batch processing of claim forms (guias)
        self.oHelper.SetButton("Param.")                  # opens the PLS475 parameter dialog
        self.oHelper.SetValue("mv_par01", "")             # Carrier from
        self.oHelper.SetValue("mv_par02", "ZZZZ")         # Carrier to
        self.oHelper.SetValue("mv_par03", "")             # Entry location from
        self.oHelper.SetValue("mv_par04", "ZZZZ")         # Entry location to
        self.oHelper.SetValue("mv_par05", peg)            # PEG number from
        self.oHelper.SetValue("mv_par06", peg)            # PEG number to
        self.oHelper.SetValue("mv_par07", "")             # Service network from
        self.oHelper.SetValue("mv_par08", "ZZZZZZ")       # Service network to
        self.oHelper.SetValue("mv_par09", "")             # Year from
        self.oHelper.SetValue("mv_par10", "ZZZZ")         # Year to
        self.oHelper.SetValue("mv_par11", "01")           # Month from
        self.oHelper.SetValue("mv_par12", "12")           # Month to
        # Type = 1-Change phase; 2-Return phase; 3-Revalue payment; 4-Revalue billing; 5-Revalue billing/payment
        self.oHelper.SetValue("Tipo ?", tipo)
        self.oHelper.SetValue("mv_par14", "")             # Company group from
        self.oHelper.SetValue("mv_par15", "ZZZZ")         # Company group to
        self.oHelper.SetValue("mv_par16", "")             # Contract from
        self.oHelper.SetValue("mv_par17", "ZZZZZZZZZZZZ") # Contract to
        self.oHelper.SetValue("mv_par18", "")             # Subcontract from
        self.oHelper.SetValue("mv_par19", "ZZZZZZZZZ")    # Subcontract to
        self.oHelper.SetValue("Rep.Regras Cadastro ?", rep_regras)  # Reapply registration rules = 1-Yes; 2-No
        self.oHelper.SetValue("mv_par21", "")             # BA3/BQC billing group from
        self.oHelper.SetValue("mv_par22", "ZZZZ")         # BA3/BQC billing group to
        self.oHelper.SetValue("mv_par23", "/ /", check_value=False)  # Event date from
        self.oHelper.SetValue("mv_par24", "/ /", check_value=False)  # Event date to
        self.oHelper.SetValue("mv_par25", "")             # Service location from
        self.oHelper.SetValue("mv_par26", "ZZZ")          # Service location to
        self.oHelper.SetValue("mv_par27", "CLI,DEN,HOS,LAB,MED,OPE")  # Provider (RDA) class
        self.oHelper.SetValue("Somente c/ valor zero ?", somente_zero)  # Only zero-value items = 1-No; 2-Yes
        self.oHelper.SetValue("mv_par29", "")             # Protocol from
        self.oHelper.SetValue("mv_par30", "ZZZZZZZZZZZZ") # Protocol to
        self.oHelper.SetValue("mv_par31", "")             # Payment group from
        self.oHelper.SetValue("mv_par32", "ZZZZ")         # Payment group to
        self.oHelper.SetValue("mv_par33", "")             # NFSS number from
        self.oHelper.SetValue("mv_par34", "ZZZZZZZZZZ")   # NFSS number to
        self.oHelper.SetButton("OK")
        self.oHelper.SetButton("Ok")   # starts the processing
        self.oHelper.SetButton("Sim")  # "Confirm the phase change of the PEGs according to the given parameters?"
        time.sleep(30)                 # wait for the processing status
        self.oHelper.CheckResult("Aberta", "Sim", grid=True, line=1)
        self.oHelper.LoadGrid()
        self.oHelper.SetButton("Confirmar")  # closes the processing status window
        self.oHelper.SetButton("Cancelar")   # closes the batch processing window
        self.oHelper.AssertTrue()

    def test_PLSA475_001(self):
        self._run_batch_process("00000417", "Mudar a Fase", "Sim", "Nao")

    def test_PLSA475_002(self):
        self._run_batch_process("00000419", "Retornar a Fase", "Sim", "Nao")

    def test_PLSA475_003(self):
        self._run_batch_process("00000440", "Revalor. Pagto", "Sim", "Nao")

    def test_PLSA475_004(self):
        self._run_batch_process("00000440", "Revalor. Cobr.", "Sim", "Nao")

    def test_PLSA475_005(self):
        self._run_batch_process("00000440", "Rev.Cob./Pagto", "Sim", "Nao")

    def test_PLSA475_006(self):
        self._run_batch_process("00000411", "Mudar a Fase", "Nao", "Sim")

    @classmethod
    def tearDownClass(inst):
        inst.oHelper.TearDown()
if __name__ == '__main__':
    unittest.main()

# ----------------------------------------------------------------------
# File: module_03_numbers/psl_03.01_math_02.py
# Repo: CodingGearsCourses/Python-3-Standard-Library-Essentials (Apache-2.0 license)
# ----------------------------------------------------------------------
# --------------------------------
# CodingGears.io
# --------------------------------
# Math Module
import math
# TODO: Constants
print(round(math.pi, 3))
print(round(math.e, 3))
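Beyond `pi` and `e`, the module exposes a few more constants that can be inspected the same way — a small extension of the lesson above:

```python
import math

print(round(math.tau, 3))    # 6.283  (tau = 2 * pi)
print(math.isinf(math.inf))  # True
```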
# ----------------------------------------------------------------------
# File: simulator_pb2.py
# Repo: ryanshea/worth_simulator (Apache-2.0 license)
# ----------------------------------------------------------------------
# Generated by the protocol buffer compiler. DO NOT EDIT!
from google.protobuf import descriptor
from google.protobuf import message
from google.protobuf import reflection
from google.protobuf import service
from google.protobuf import service_reflection
from google.protobuf import descriptor_pb2
_ACCOUNT_SWEEP = descriptor.Descriptor(
name='Sweep',
full_name='simulator.Account.Sweep',
filename='simulator.proto',
containing_type=None,
fields=[
descriptor.FieldDescriptor(
name='sweep_account', full_name='simulator.Account.Sweep.sweep_account', index=0,
number=1, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='amount', full_name='simulator.Account.Sweep.amount', index=1,
number=2, type=2, cpp_type=6, label=2,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[], # TODO(robinson): Implement.
enum_types=[
],
options=None)
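The `Sweep` message above pairs a destination account name (`sweep_account`) with an `amount`, and `Account` references it through `sweep_out`/`sweep_in`. A plain-Python sketch of what applying such a sweep between two balances could look like — the transfer logic is an assumption for illustration, not taken from the repo:

```python
def apply_sweep(balances, source, sweep_account, amount):
    """Move `amount` from `source` into `sweep_account` (hypothetical helper)."""
    balances[source] -= amount
    balances[sweep_account] += amount
    return balances

print(apply_sweep({"checking": 500.0, "savings": 100.0}, "checking", "savings", 50.0))
# {'checking': 450.0, 'savings': 150.0}
```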
_ACCOUNT = descriptor.Descriptor(
name='Account',
full_name='simulator.Account',
filename='simulator.proto',
containing_type=None,
fields=[
descriptor.FieldDescriptor(
name='name', full_name='simulator.Account.name', index=0,
number=1, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='value', full_name='simulator.Account.value', index=1,
number=2, type=2, cpp_type=6, label=2,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='liquidity', full_name='simulator.Account.liquidity', index=2,
number=3, type=2, cpp_type=6, label=1,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='sweep_out', full_name='simulator.Account.sweep_out', index=3,
number=4, type=11, cpp_type=10, label=1,
default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='sweep_in', full_name='simulator.Account.sweep_in', index=4,
number=5, type=11, cpp_type=10, label=1,
default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='rate', full_name='simulator.Account.rate', index=5,
number=6, type=2, cpp_type=6, label=1,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='stddev', full_name='simulator.Account.stddev', index=6,
number=7, type=2, cpp_type=6, label=1,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='timespec', full_name='simulator.Account.timespec', index=7,
number=8, type=9, cpp_type=9, label=1,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='start_date', full_name='simulator.Account.start_date', index=8,
number=9, type=9, cpp_type=9, label=1,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='loan_months', full_name='simulator.Account.loan_months', index=9,
number=10, type=5, cpp_type=1, label=1,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[], # TODO(robinson): Implement.
enum_types=[
],
options=None)
_CASHFLOW = descriptor.Descriptor(
name='Cashflow',
full_name='simulator.Cashflow',
filename='simulator.proto',
containing_type=None,
fields=[
descriptor.FieldDescriptor(
name='name', full_name='simulator.Cashflow.name', index=0,
number=1, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='account', full_name='simulator.Cashflow.account', index=1,
number=2, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='timespec', full_name='simulator.Cashflow.timespec', index=2,
number=3, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='amount', full_name='simulator.Cashflow.amount', index=3,
number=4, type=2, cpp_type=6, label=2,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='start_date', full_name='simulator.Cashflow.start_date', index=4,
number=5, type=9, cpp_type=9, label=1,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='end_date', full_name='simulator.Cashflow.end_date', index=5,
number=6, type=9, cpp_type=9, label=1,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='category', full_name='simulator.Cashflow.category', index=6,
number=7, type=9, cpp_type=9, label=1,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='stddev', full_name='simulator.Cashflow.stddev', index=7,
number=8, type=2, cpp_type=6, label=1,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[], # TODO(robinson): Implement.
enum_types=[
],
options=None)
_GENERALLEDGER = descriptor.Descriptor(
name='GeneralLedger',
full_name='simulator.GeneralLedger',
filename='simulator.proto',
containing_type=None,
fields=[
descriptor.FieldDescriptor(
name='account', full_name='simulator.GeneralLedger.account', index=0,
number=1, type=11, cpp_type=10, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='cashflow', full_name='simulator.GeneralLedger.cashflow', index=1,
number=2, type=11, cpp_type=10, label=3,
default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='inflation', full_name='simulator.GeneralLedger.inflation', index=2,
number=3, type=2, cpp_type=6, label=2,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='property_tax', full_name='simulator.GeneralLedger.property_tax', index=3,
number=4, type=2, cpp_type=6, label=2,
default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='start_date', full_name='simulator.GeneralLedger.start_date', index=4,
number=5, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
descriptor.FieldDescriptor(
name='end_date', full_name='simulator.GeneralLedger.end_date', index=5,
number=6, type=9, cpp_type=9, label=2,
default_value=unicode("", "utf-8"),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[], # TODO(robinson): Implement.
enum_types=[
],
options=None)
_ACCOUNT.fields_by_name['sweep_out'].message_type = _ACCOUNT_SWEEP
_ACCOUNT.fields_by_name['sweep_in'].message_type = _ACCOUNT_SWEEP
_GENERALLEDGER.fields_by_name['account'].message_type = _ACCOUNT
_GENERALLEDGER.fields_by_name['cashflow'].message_type = _CASHFLOW
class Account(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
class Sweep(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
DESCRIPTOR = _ACCOUNT_SWEEP
DESCRIPTOR = _ACCOUNT
class Cashflow(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
DESCRIPTOR = _CASHFLOW
class GeneralLedger(message.Message):
__metaclass__ = reflection.GeneratedProtocolMessageType
DESCRIPTOR = _GENERALLEDGER
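The descriptors above define the `simulator.Account`, `simulator.Cashflow`, and `simulator.GeneralLedger` message schemas. As a plain-Python orientation sketch (this is NOT the protobuf API; field names and numbers are taken from the descriptors, the message-typed `sweep_in`/`sweep_out` fields are omitted):

```python
from dataclasses import dataclass, field
from typing import List

# Dataclass mirror of the generated message schemas: same field names,
# protobuf-style defaults ("" for strings, 0 for numbers, [] for repeated).

@dataclass
class Account:
    name: str = ""            # field 1, required string
    value: float = 0.0        # field 2, required double
    liquidity: float = 0.0    # field 3, optional double
    rate: float = 0.0         # field 6, optional double
    stddev: float = 0.0       # field 7, optional double
    timespec: str = ""        # field 8, optional string
    start_date: str = ""      # field 9, optional string
    loan_months: int = 0      # field 10, optional int32

@dataclass
class Cashflow:
    name: str = ""            # field 1, required string
    account: str = ""         # field 2, required string
    timespec: str = ""        # field 3, required string
    amount: float = 0.0       # field 4, required double
    start_date: str = ""      # field 5, optional string
    end_date: str = ""        # field 6, optional string
    category: str = ""        # field 7, optional string
    stddev: float = 0.0       # field 8, optional double

@dataclass
class GeneralLedger:
    account: List[Account] = field(default_factory=list)    # field 1, repeated
    cashflow: List[Cashflow] = field(default_factory=list)  # field 2, repeated
    inflation: float = 0.0    # field 3, required double
    property_tax: float = 0.0 # field 4, required double
    start_date: str = ""      # field 5, required string
    end_date: str = ""        # field 6, required string

ledger = GeneralLedger(inflation=0.02, start_date="2024-01-01", end_date="2024-12-31")
ledger.account.append(Account(name="checking", value=1000.0))
print(len(ledger.account), ledger.account[0].name)  # → 1 checking
```

The real generated classes (e.g. `Account` with `DESCRIPTOR = _ACCOUNT`) additionally support protobuf serialization via `reflection.GeneratedProtocolMessageType`.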
| 37.70073 | 87 | 0.706583 | 1,304 | 10,330 | 5.370399 | 0.06365 | 0.093674 | 0.072826 | 0.070541 | 0.850493 | 0.757961 | 0.72369 | 0.716836 | 0.716836 | 0.716836 | 0 | 0.018844 | 0.167764 | 10,330 | 273 | 88 | 37.838828 | 0.795743 | 0.015876 | 0 | 0.707031 | 1 | 0 | 0.117925 | 0.076189 | 0 | 0 | 0 | 0.003663 | 0 | 1 | 0 | false | 0 | 0.023438 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
691e7548e6278b27f6f06f0e12023fb3a48c5008 | 8,519 | py | Python | FourierIdea/a_Scene2b_PhaseMore.py | kolibril13/manim-3b1b-kolibril-backup | 8a8a98a0c310e9660a0c65df5fbdc2175266afce | [
"MIT"
] | 2 | 2021-04-08T08:40:07.000Z | 2021-04-15T15:12:48.000Z | FourierIdea/a_Scene2b_PhaseMore.py | kolibril13/manim-3b1b-kolibril-backup | 8a8a98a0c310e9660a0c65df5fbdc2175266afce | [
"MIT"
] | null | null | null | FourierIdea/a_Scene2b_PhaseMore.py | kolibril13/manim-3b1b-kolibril-backup | 8a8a98a0c310e9660a0c65df5fbdc2175266afce | [
"MIT"
] | null | null | null | from hendrik_old.Image_Processing.FourierIdea.ImProImports import *
from manimlib.imports import *
global k_plane_size
k_plane_size=0.7
# scene = "Scene2PhaseMore" # FULL ANIMATION SCENE phase with real out
class Scene2PhaseMore(ThreeDScene): # with real plane on the right
def construct(self):
self.camera.frame_center.shift(2 * OUT)
self.set_camera_orientation(phi=75 * DEGREES, theta=-60 * DEGREES) # 2.5D
self.clear()
self.add(Image_coordinate_system())
postion_setting = {"preset_position": "LEFT", "center_dist": 2}
        # math preparation:
k_math = FourierMathJuggling.k_from_preset_minimal(**postion_setting, amplitude=0)
pixels = k_math.get_pixels()
print(pixels)
k_disp = KSpace(pixel_len=19)
img_kamp, img_kph = k_math.get_amp_and_ph()
k_disp.amp_max=255
k_disp.fill_k_space_updater(img_kamp)
k_disp.set_shade_in_3d(True)
self.add(k_disp)
real_out = Realspace(pixel_len=pixels)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real) ## why??? something with norm
real_out.scale(9 / pixels * k_plane_size * 0.3).to_edge(UR)
real_text = TextMobject("Real-Space").scale(0.75).next_to(real_out, DOWN)
self.add_fixed_in_frame_mobjects(real_out, real_text)
self.wait(1)
def update_ampli(mob):
val=track.get_value()
print(val)
k_math.img_k_space[11,9]= 255*StepFunctions.linear_step_func(track.get_value(),0,1)
k_math.img_k_space[9,11] = 255*StepFunctions.linear_step_func(track.get_value(),1,2)
if val >= 2:
                k_math.img_k_space[11, 9] = 255 * (1-0.5*StepFunctions.linear_step_func(track.get_value(), 2, 3))
if val >= 3:
                k_math.img_k_space[9, 11] = 255 * (1-0.3*StepFunctions.linear_step_func(track.get_value(), 3, 4))
mob.fill_k_space_updater(k_math.get_amp_k_only())
img_real= k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
start_val=0
track = ValueTracker(start_val)
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=1)
self.wait()
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=2)
self.wait(2)
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=1.5)
self.wait()
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=1.5)
self.wait(2)
# # ##HERE STARTS THE LOOP:
# ####change the phase
def update_phase(mob):
val= my_phase_tracker.get_value()
k_math.img_k_space[9, 11]=255*(0.7*np.exp(1j * np.deg2rad(val)))
img_kamp, img_kph=k_math.get_amp_and_ph()
            mob.set_phase_flowers_updater(img_kph)
mob.set_shade_in_3d(True)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
my_phase_tracker = ValueTracker(0)
for _ in range(0,4):
self.play(my_phase_tracker.increment_value, 90,
UpdateFromFunc(k_disp, update_phase),
rate_func=linear)
self.wait(0.2)
self.wait()
def update_phase2(mob):
val = my_phase_tracker.get_value()
k_math.img_k_space[11, 9] = 255 * (0.5 * np.exp(1j * np.deg2rad(val)))
img_kamp, img_kph = k_math.get_amp_and_ph()
mob.set_phase_flowers_updater(img_kph)
mob.set_shade_in_3d(True)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
my_phase_tracker = ValueTracker(0)
for _ in range(0, 4):
self.play(my_phase_tracker.increment_value, 90,
UpdateFromFunc(k_disp, update_phase2),
rate_func=linear)
self.wait(0.2)
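The amplitude updaters above ramp k-space pixel values with `StepFunctions.linear_step_func(value, start, end)`. The real implementation lives in `ImProImports`; assuming it is a clamped linear ramp from 0 to 1 over `[start, end]` (an assumption inferred from how the updaters use it), a minimal sketch:

```python
def linear_step_func(x, x_min, x_max):
    # Hypothetical reimplementation (assumption): 0 below x_min, 1 above
    # x_max, linear in between. Assumes x_min < x_max.
    t = (x - x_min) / (x_max - x_min)
    return max(0.0, min(1.0, t))

print(linear_step_func(-1, 0, 2))  # → 0.0 (clamped below the ramp)
print(linear_step_func(1, 0, 2))   # → 0.5 (midway up the ramp)
print(linear_step_func(3, 0, 2))   # → 1.0 (clamped above the ramp)
```

Driving such a ramp with a `ValueTracker` is what produces the gradual fade-in/fade-out of individual k-space amplitudes in the scene.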
scene = "Scene2PhaseMoreExtend" # FULL ANIMATION SCENE phase with real out
class Scene2PhaseMoreExtend(ThreeDScene): # with real plane on the right
def construct(self):
self.camera.frame_center.shift(2 * OUT)
self.set_camera_orientation(phi=75 * DEGREES, theta=-60 * DEGREES) # 2.5D
self.clear()
self.add(Image_coordinate_system())
postion_setting = {"preset_position": "LEFT", "center_dist": 2}
        # math preparation:
k_math = FourierMathJuggling.k_from_preset_minimal(**postion_setting, amplitude=0)
pixels = k_math.get_pixels()
print(pixels)
k_disp = KSpace(pixel_len=19)
img_kamp, img_kph = k_math.get_amp_and_ph()
k_disp.amp_max=255
k_disp.fill_k_space_updater(img_kamp)
k_disp.set_shade_in_3d(True)
self.add(k_disp)
real_out = Realspace(pixel_len=pixels)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real) ## why??? something with norm
real_out.scale(9 / pixels * k_plane_size * 0.3).to_edge(UR)
real_text = TextMobject("Real-Space").scale(0.75).next_to(real_out, DOWN)
self.add_fixed_in_frame_mobjects(real_out, real_text)
self.wait(1)
def update_ampli(mob):
val=track.get_value()
print(val)
k_math.img_k_space[5,9]= 255/3*StepFunctions.linear_step_func(track.get_value(),0,0.3)
k_math.img_k_space[11,9]= 255*StepFunctions.linear_step_func(track.get_value(),0.7,1)
k_math.img_k_space[12,12] = 255*StepFunctions.linear_step_func(track.get_value(),1,2)
if val >= 2:
                k_math.img_k_space[11, 9] = 255 * (1-0.5*StepFunctions.linear_step_func(track.get_value(), 2, 3))
if val >= 3:
                k_math.img_k_space[12, 12] = 255 * (1-0.3*StepFunctions.linear_step_func(track.get_value(), 3, 4))
mob.fill_k_space_updater(k_math.get_amp_k_only())
img_real= k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
start_val=0
track = ValueTracker(start_val)
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=4)
self.wait()
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=2)
self.wait(2)
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=1.5)
self.wait()
self.play(track.increment_value, 1,UpdateFromFunc(k_disp, update_ampli),rate_func=linear,run_time=1.5)
self.wait(2)
# # ##HERE STARTS THE LOOP:
# ####change the phase
def update_phase(mob):
val= my_phase_tracker.get_value()
k_math.img_k_space[12, 12]=255*(0.7*np.exp(1j * np.deg2rad(val)))
img_kamp, img_kph=k_math.get_amp_and_ph()
            mob.set_phase_flowers_updater(img_kph)
mob.set_shade_in_3d(True)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
my_phase_tracker = ValueTracker(0)
for _ in range(0,4):
self.play(my_phase_tracker.increment_value, 90,
UpdateFromFunc(k_disp, update_phase),
rate_func=linear)
self.wait(0.2)
self.wait()
def update_phase2(mob):
val = my_phase_tracker.get_value()
k_math.img_k_space[11, 9] = 255 * (0.5 * np.exp(1j * np.deg2rad(val)))
img_kamp, img_kph = k_math.get_amp_and_ph()
mob.set_phase_flowers_updater(img_kph)
mob.set_shade_in_3d(True)
img_real = k_math.get_real_out()
real_out.fill_real_space(pixels ** 2 * img_real)
return mob
my_phase_tracker = ValueTracker(0)
for _ in range(0, 4):
self.play(my_phase_tracker.increment_value, 90,
UpdateFromFunc(k_disp, update_phase2),
rate_func=linear)
self.wait(0.2)
if __name__ == "__main__":
module_name = os.path.basename(__file__)
command_A = "manim -p -c '#1C758A' --video_dir ~/Downloads/ "
command_B = module_name +" " + scene
    os.system(command_A + command_B)
| 46.298913 | 110 | 0.627069 | 1,244 | 8,519 | 3.95418 | 0.127814 | 0.033543 | 0.029274 | 0.023785 | 0.93393 | 0.93393 | 0.93332 | 0.93332 | 0.899167 | 0.899167 | 0 | 0.040846 | 0.261416 | 8,519 | 184 | 111 | 46.298913 | 0.740941 | 0.040732 | 0 | 0.860606 | 0 | 0 | 0.019899 | 0.00258 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048485 | false | 0 | 0.012121 | 0 | 0.109091 | 0.024242 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
691f1cd29804afea801a5fba340da63e657d29bd | 550 | py | Python | ex017.py | ChrysWillians/exercicios-python3 | 354c8684cbc9de0c734e6a40e76e2f613845de96 | [
"MIT"
] | 1 | 2021-09-26T01:08:36.000Z | 2021-09-26T01:08:36.000Z | ex017.py | ChrysWillians/exercicios-python3 | 354c8684cbc9de0c734e6a40e76e2f613845de96 | [
"MIT"
] | null | null | null | ex017.py | ChrysWillians/exercicios-python3 | 354c8684cbc9de0c734e6a40e76e2f613845de96 | [
"MIT"
] | null | null | null | # from math import hypot
# cateto1 = float(input('Digite o cumprimento do cateto oposto: '))
# cateto2 = float(input('Digite o cumprimento do cateto adjacente: '))
# h = hypot(cateto1, cateto2)
# print('A hipotenusa dos catetos é igual a {:.2f}'.format(h))
print ('=' * 40)
cateto1 = float(input('Digite o cumprimento do cateto oposto: '))
cateto2 = float(input('Digite o cumprimento do cateto adjacente: '))
h = (cateto1 **2 + cateto2 ** 2) **(1/2)
print('A hipotenusa dos catetos é igual a {:.2f}'.format(h))
print('=' * 40)
| 30.555556 | 71 | 0.650909 | 77 | 550 | 4.649351 | 0.363636 | 0.111732 | 0.178771 | 0.189944 | 0.843575 | 0.843575 | 0.843575 | 0.843575 | 0.843575 | 0.843575 | 0 | 0.040359 | 0.189091 | 550 | 17 | 72 | 32.352941 | 0.762332 | 0.447273 | 0 | 0.333333 | 0 | 0 | 0.447653 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
694c47008aeb6769a6aeb0230b9d48624b864cdd | 11,200 | py | Python | Scripts/UnitTests.py | camillevleonard/CS5010_FinalProject | 22a3cdd3251f9280c2c92012e5b0a22c64ab1dee | [
"MIT"
] | null | null | null | Scripts/UnitTests.py | camillevleonard/CS5010_FinalProject | 22a3cdd3251f9280c2c92012e5b0a22c64ab1dee | [
"MIT"
] | null | null | null | Scripts/UnitTests.py | camillevleonard/CS5010_FinalProject | 22a3cdd3251f9280c2c92012e5b0a22c64ab1dee | [
"MIT"
] | null | null | null | #CS 5010 Semester Project
#Matt Dakolios, Darren Frye, Paul Hicks, Camille Leonard, Sudhanshu Luthra
import unittest
from Module import *
#Query 1 Unit Testing
class topStateTestCase(unittest.TestCase):
def test_n_states_null(self): #validate that the null returns a df with 5 rows
dfNull = topState()
lengthNull = len(dfNull)
self.assertEqual(lengthNull, 5)
def test_n_states(self):
#number of states falls within 1-50 range
df = topState(10)
length_df = len(df)
self.assertEqual(length_df, 10)
def test_n_states_exception(self): #test exception handling for edge cases of number of states
self.assertEqual(topState(-1), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
self.assertEqual(topState(0), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
self.assertEqual(topState(51), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
def test_measure_exception(self): #test edge cases of measure exception (correct options are "Rate" or "Deaths")
self.assertEqual(topState(10, " "), "Please pass the correct value for Measure.")
self.assertEqual(topState(10, "rate"), "Please pass the correct value for Measure.")
self.assertEqual(topState(10, "test"), "Please pass the correct value for Measure.")
def test_topState_return_df(self): #test that a data frame object is returned by topState
df_topState = topState()
self.assertIsInstance(df_topState, object)
def test_gender_exception(self): #test exception handling of edge cases of gender parameter
self.assertEqual(topState(10, "Rate", "male" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(topState(10, "Rate", "female" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(topState(10, "Rate", "both" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(topState(10, "Rate", "test" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
def test_drugType_exception(self): #test exception handling for edge cases of drug type parameter
self.assertEqual(topState(10, "Rate", "Both", "Beer"), "Please pass the correct value for DrugType. Options are All, Opioid, Cocaine or Amphetamine")
def test_startYear_exception(self): #test exception handling for edge cases of startYear parameter
self.assertEqual(topState(10, "Rate", "Both", "All", 1999), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
self.assertEqual(topState(10, "Rate", "Both", "All", 2018), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
def test_endYear_exception(self): #test exception handling for edge cases of endYear parameter
self.assertEqual(topState(10, "Rate", "Both", "All", 2000, 2018), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
self.assertEqual(topState(10, "Rate", "Both", "All", 2005, 2004), "End year cannot be less than Start year. Please correct one of the values")
self.assertEqual(topState(10, "Rate", "Both", "All", 2000, 1999), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
#Query 2 Unit Testing
class bottomStateTestCase(unittest.TestCase):
def test_n_states_null(self): #test exception handling for edge cases of number of states
dfNull = bottomState()
lengthNull = len(dfNull)
self.assertEqual(lengthNull, 5)
def test_n_states(self): #number of states falls within 1-50 range
df = bottomState(10)
length_df = len(df)
self.assertEqual(length_df, 10)
def test_n_states_exception(self): #test exception handling for edge cases of number of states
self.assertEqual(bottomState(-1), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
self.assertEqual(bottomState(0), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
self.assertEqual(bottomState(51), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
self.assertEqual(bottomState(60), "Please pass the correct value for Number of States.Pick a number from 1 to 50.")
def test_measure_exception(self): #test edge cases of measure exception (correct options are "Rate" or "Deaths")
self.assertEqual(bottomState(10, " "), "Please pass the correct value for Measure.")
self.assertEqual(bottomState(10, "rate"), "Please pass the correct value for Measure.")
self.assertEqual(bottomState(10, "test"), "Please pass the correct value for Measure.")
def test_bottomState_return_df(self): #test that a data frame object is returned by bottomState
df_bottomState = bottomState()
self.assertIsInstance(df_bottomState, object)
def test_gender_exception(self): #test exception handling of edge cases of gender parameter
self.assertEqual(bottomState(10, "Rate", "male" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(bottomState(10, "Rate", "female" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(bottomState(10, "Rate", "both" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(bottomState(10, "Rate", "test" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
def test_drugType_exception(self): #test exception handling for edge cases of drug type parameter
self.assertEqual(bottomState(10, "Rate", "Both", "Beer"), "Please pass the correct value for DrugType. Options are All, Opioid, Cocaine or Amphetamine")
def test_startYear_exception(self): #test exception handling for edge cases of startYear parameter
self.assertEqual(bottomState(10, "Rate", "Both", "All", 1999), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
self.assertEqual(bottomState(10, "Rate", "Both", "All", 2018), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
def test_endYear_exception(self): #test exception handling for edge cases of endYear parameter
self.assertEqual(bottomState(10, "Rate", "Both", "All", 2000, 2018), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
self.assertEqual(bottomState(10, "Rate", "Both", "All", 2005, 2004), "End year cannot be less than Start year. Please correct one of the values")
self.assertEqual(bottomState(10, "Rate", "Both", "All", 2000, 1999), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
#Query 3 Unit Testing
class stateHeatMapTestCase(unittest.TestCase):
def test_measure_exception(self): #test edge cases of measure exception (correct options are "Rate" or "Deaths")
self.assertEqual(stateHeatMap(2000, " "), "Please pass the correct value for Measure.")
self.assertEqual(stateHeatMap(2000, "rate"), "Please pass the correct value for Measure.")
self.assertEqual(stateHeatMap(2000, "test"), "Please pass the correct value for Measure.")
def test_gender_exception(self): #test exception handling of edge cases of gender parameter
self.assertEqual(stateHeatMap(2000, "Rate", "male" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateHeatMap(2000, "Rate", "female" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateHeatMap(2000, "Rate", "both" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateHeatMap(2000, "Rate", "test" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
def test_drugType_exception(self): #test exception handling for edge cases of drug type parameter
self.assertEqual(stateHeatMap(2000, "Rate", "Both", "Beer"), "Please pass the correct value for DrugType. Options are All, Opioid, Cocaine or Amphetamine")
#Query 4 Unit Testing
class stateBarChartTestCase(unittest.TestCase):
def test_state_exception(self): #test state exception handling, must pass full state name as string
self.assertEqual(stateBarChart("test"), "Please pass the correct value for State.")
def test_measure_exception(self): #test edge cases of measure exception (correct options are "Rate" or "Deaths")
self.assertEqual(stateBarChart("Virginia", " "), "Please pass the correct value for Measure.")
self.assertEqual(stateBarChart("Virginia", "rate"), "Please pass the correct value for Measure.")
self.assertEqual(stateBarChart("Virginia", "test"), "Please pass the correct value for Measure.")
def test_gender_exception(self): #test exception handling of edge cases of gender parameter
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 2001, "male" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 2001, "female" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 2001, "both" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 2001, "test" ), "Please pass the correct value for Gender. Options are : Male, Female or Both")
def test_startYear_exception(self): #test exception handling for edge cases of startYear parameter
self.assertEqual(stateBarChart("Virginia", "Rate", 1999, 2001, "Both"), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
self.assertEqual(stateBarChart("Virginia", "Rate", 2018, 2001, "Both"), "Please pass the correct value for Start Year. Pick a year from 2000 to 2017.")
def test_endYear_exception(self): #test exception handling for edge cases of endYear parameter
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 2018, "Both"), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
self.assertEqual(stateBarChart("Virginia", "Rate", 2005, 2004, "Both"), "End year cannot be less than Start year. Please correct one of the values")
self.assertEqual(stateBarChart("Virginia", "Rate", 2000, 1999, "Both"), "Please pass the correct value for End Year.Pick a year from 2000 to 2017.")
if __name__ == '__main__':
unittest.main()
| 77.241379 | 165 | 0.695357 | 1,519 | 11,200 | 5.078341 | 0.085583 | 0.112782 | 0.085948 | 0.132227 | 0.900441 | 0.900441 | 0.886829 | 0.87218 | 0.855458 | 0.855458 | 0 | 0.041835 | 0.206071 | 11,200 | 145 | 166 | 77.241379 | 0.825686 | 0.155 | 0 | 0.298077 | 0 | 0 | 0.447626 | 0 | 0 | 0 | 0 | 0 | 0.576923 | 1 | 0.25 | false | 0.490385 | 0.019231 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
694c9a22737ca5272e5382ef5fe7cc20b9cae874 | 5,939 | py | Python | tests/api/test_httpclient.py | nhausman1/python-client | b15f76977dc3178634ee8e007b53f613ddd2ac7c | [
"Apache-2.0"
] | 13 | 2017-03-17T15:15:20.000Z | 2022-03-14T22:24:10.000Z | tests/api/test_httpclient.py | nhausman1/python-client | b15f76977dc3178634ee8e007b53f613ddd2ac7c | [
"Apache-2.0"
] | 81 | 2017-01-12T23:06:48.000Z | 2022-02-21T18:20:23.000Z | tests/api/test_httpclient.py | nhausman1/python-client | b15f76977dc3178634ee8e007b53f613ddd2ac7c | [
"Apache-2.0"
] | 14 | 2017-05-25T10:49:13.000Z | 2021-12-27T16:39:20.000Z | """HTTPClient test module."""
from splitio.api import client
class HttpClientTests(object):
"""Http Client test cases."""
def test_get(self, mocker):
"""Test HTTP GET verb requests."""
response_mock = mocker.Mock()
response_mock.status_code = 200
response_mock.text = 'ok'
get_mock = mocker.Mock()
get_mock.return_value = response_mock
mocker.patch('splitio.api.client.requests.get', new=get_mock)
httpclient = client.HttpClient()
response = httpclient.get('sdk', '/test1', 'some_api_key', {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
client.HttpClient.SDK_URL + '/test1',
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
get_mock.reset_mock()
response = httpclient.get('events', '/test1', 'some_api_key', {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
client.HttpClient.EVENTS_URL + '/test1',
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert get_mock.mock_calls == [call]
assert response.status_code == 200
assert response.body == 'ok'
def test_get_custom_urls(self, mocker):
"""Test HTTP GET verb requests."""
response_mock = mocker.Mock()
response_mock.status_code = 200
response_mock.text = 'ok'
get_mock = mocker.Mock()
get_mock.return_value = response_mock
mocker.patch('splitio.api.client.requests.get', new=get_mock)
httpclient = client.HttpClient(sdk_url='https://sdk.com', events_url='https://events.com')
response = httpclient.get('sdk', '/test1', 'some_api_key', {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
'https://sdk.com/test1',
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert get_mock.mock_calls == [call]
assert response.status_code == 200
assert response.body == 'ok'
get_mock.reset_mock()
response = httpclient.get('events', '/test1', 'some_api_key', {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
'https://events.com/test1',
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
def test_post(self, mocker):
"""Test HTTP GET verb requests."""
response_mock = mocker.Mock()
response_mock.status_code = 200
response_mock.text = 'ok'
get_mock = mocker.Mock()
get_mock.return_value = response_mock
mocker.patch('splitio.api.client.requests.post', new=get_mock)
httpclient = client.HttpClient()
response = httpclient.post('sdk', '/test1', 'some_api_key', {'p1': 'a'}, {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
client.HttpClient.SDK_URL + '/test1',
json={'p1': 'a'},
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
get_mock.reset_mock()
response = httpclient.post('events', '/test1', 'some_api_key', {'p1': 'a'}, {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
client.HttpClient.EVENTS_URL + '/test1',
json={'p1': 'a'},
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
def test_post_custom_urls(self, mocker):
"""Test HTTP GET verb requests."""
response_mock = mocker.Mock()
response_mock.status_code = 200
response_mock.text = 'ok'
get_mock = mocker.Mock()
get_mock.return_value = response_mock
mocker.patch('splitio.api.client.requests.post', new=get_mock)
httpclient = client.HttpClient(sdk_url='https://sdk.com', events_url='https://events.com')
response = httpclient.post('sdk', '/test1', 'some_api_key', {'p1': 'a'}, {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
'https://sdk.com' + '/test1',
json={'p1': 'a'},
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
get_mock.reset_mock()
response = httpclient.post('events', '/test1', 'some_api_key', {'p1': 'a'}, {'param1': 123}, {'h1': 'abc'})
call = mocker.call(
'https://events.com' + '/test1',
json={'p1': 'a'},
headers={'Authorization': 'Bearer some_api_key', 'h1': 'abc', 'Content-Type': 'application/json'},
params={'param1': 123},
timeout=None
)
assert response.status_code == 200
assert response.body == 'ok'
assert get_mock.mock_calls == [call]
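The call-recording assertions above follow the standard `unittest.mock` pattern; here is a minimal standalone sketch of the same idea (the `post_json` helper and URL are hypothetical, not part of splitio):

```python
from unittest import mock

def post_json(session, url, payload):
    # Hypothetical helper, used only for this illustration.
    return session.post(url, json=payload, timeout=None)

session = mock.Mock()
session.post.return_value = "ok"

result = post_json(session, "https://sdk.com/test1", {"p1": "a"})

# mock_calls records every invocation, so a test can compare it against
# an expected mock.call(...), exactly as get_mock.mock_calls is compared
# against mocker.call(...) in the tests above.
assert result == "ok"
assert session.post.mock_calls == [
    mock.call("https://sdk.com/test1", json={"p1": "a"}, timeout=None)
]
```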
# File: pines/estimators.py (repo: dmitru/decision-trees, license: MIT)
# coding=utf-8
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin, RegressorMixin
from sklearn.utils import check_X_y, check_array
from sklearn.utils.validation import NotFittedError
from pines.tree_builders import TreeType, ProblemType
class DecisionTreeClassifier(BaseEstimator, ClassifierMixin):
def __init__(self, tree_type=TreeType.CART, **kwargs):
"""
Builds a decision tree for a classification problem.
Args:
tree_type (string): One of 'cart' or 'oblivious', default is 'cart'
**kwargs: arguments to pass to a `TreeBuilder` instance
"""
super(DecisionTreeClassifier, self).__init__()
self.tree_ = None
self.tree_type = tree_type
self._tree_builder_kwargs = kwargs
self._tree_builder_class = TreeType.get_tree_builder(tree_type)
def fit(self, X, y, **kwargs):
X, y = check_X_y(X, y, dtype=np.float64)
data_size, n_features = X.shape
self._n_features = n_features
self._tree_builder = self._tree_builder_class(
problem=ProblemType.CLASSIFICATION,
**self._tree_builder_kwargs
)
self.tree_ = self._tree_builder.build_tree(X, y)
return self
def predict(self, X, check_input=True):
if check_input:
X = self._validate_X_predict(X, check_input=True)
return self.tree_.predict(X)
def _validate_X_predict(self, X, check_input):
"""Validate X whenever one tries to predict, apply, predict_proba"""
if self.tree_ is None:
raise NotFittedError("Estimator not fitted, "
"call `fit` before exploiting the model.")
if check_input:
X = check_array(X, dtype='f')
n_features = X.shape[1]
if self._n_features != n_features:
raise ValueError("Number of features of the model must "
" match the input. Model n_features is %s and "
" input n_features is %s "
% (self._n_features, n_features))
return X
class DecisionTreeRegressor(BaseEstimator, RegressorMixin):
def __init__(self, tree_type=TreeType.CART, **kwargs):
"""
        Builds a decision tree for a regression problem.
Args:
tree_type (string): One of 'cart' or 'oblivious', default is 'cart'
**kwargs: arguments to pass to a `TreeBuilder` instance
"""
super(DecisionTreeRegressor, self).__init__()
self._tree = None
self.tree_type = tree_type
self._tree_builder_kwargs = kwargs
self._tree_builder_class = TreeType.get_tree_builder(tree_type)
def fit(self, X, y, **kwargs):
X, y = check_X_y(X, y, dtype=np.float64)
data_size, n_features = X.shape
self._n_features = n_features
self._tree_builder = self._tree_builder_class(
problem=ProblemType.REGRESSION,
**self._tree_builder_kwargs
)
self._tree = self._tree_builder.build_tree(X, y)
return self
def predict(self, X, check_input=True):
if check_input:
X = self._validate_X_predict(X, check_input=True)
return self._tree.predict(X)
def _validate_X_predict(self, X, check_input):
"""Validate X whenever one tries to predict, apply, predict_proba"""
if self._tree is None:
raise NotFittedError("Estimator not fitted, "
"call `fit` before exploiting the model.")
if check_input:
X = check_array(X, dtype='f')
n_features = X.shape[1]
if self._n_features != n_features:
raise ValueError("Number of features of the model must "
" match the input. Model n_features is %s and "
" input n_features is %s "
% (self._n_features, n_features))
return X
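Both estimators guard `predict` with the same feature-count check; the contract can be exercised without sklearn. This is a simplified, dependency-free restatement of `_validate_X_predict`, not the pines implementation itself:

```python
def validate_n_features(n_features_fitted, X):
    # Standalone restatement of the check in _validate_X_predict: the
    # prediction input must have as many columns as the data the model
    # was fitted on, otherwise a ValueError is raised.
    n_features = len(X[0])
    if n_features_fitted != n_features:
        raise ValueError(
            "Number of features of the model must match the input. "
            "Model n_features is %s and input n_features is %s"
            % (n_features_fitted, n_features)
        )
    return X

ok = validate_n_features(2, [[0.5, 1.0]])      # matching width passes through
try:
    validate_n_features(2, [[0.5, 1.0, 2.0]])  # 3 columns, expected 2
    raised = False
except ValueError:
    raised = True
```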
# File: tests/core_tests_chain_collection.py (repo: gf712/AbPyTools, license: MIT)
import unittest
from abpytools import ChainCollection, Chain
import operator
import os
from glob import glob
from . import read_sequence, check_connection, ABNUM_URL, IGBLAST_URL
class ChainCollectionCore(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.antibody_collection_1_name = 'test'
cls.chain_test_sequence = read_sequence('./tests/Data/chain_collection_fasta_test.fasta')
def test_ChainCollection_length_0(self):
antibody_collection = ChainCollection()
self.assertEqual(len(antibody_collection), 0)
def test_ChainCollection_input_exception_1(self):
# when ChainCollection.object_list is instantiated with
# something other than a list it throws an error
self.assertRaises(ValueError, ChainCollection, 0)
def test_ChainCollection_input_exception_2(self):
# when ChainCollection.object_list is instantiated with
# a list with non Chain objects it throws an error
self.assertRaises(ValueError, ChainCollection, [Chain(sequence=""), 0])
def test_ChainCollection_input_exception_3(self):
# when ChainCollection is instantiated with static method
# .load_from_fasta with an invalid file path it throws an error
self.assertRaises(ValueError, ChainCollection.load_from_fasta, './tests/Data/NonExistentFile.fasta')
def test_ChainCollection_input_exception_4(self):
# when ChainCollection is instantiated with .load_from_file with
# path to a file that does not have a .fasta or .json extension
# it throws an error
self.assertRaises(ValueError, ChainCollection.load_from_file, './tests/Data/__init__.py')
def test_ChainCollection_input_1(self):
# instantiate ChainCollection with a Chain (empty) object
# this doesn't make any sense and raises an error
self.assertRaises(ValueError, ChainCollection, [Chain(sequence="")])
def test_ChainCollection_input_2(self):
# instantiate ChainCollection with a loaded Chain object
test_chain = Chain(sequence=self.chain_test_sequence)
test_collection = ChainCollection(antibody_objects=[test_chain])
self.assertIsInstance(test_collection, ChainCollection)
def test_ChainCollection_load_fasta_exception(self):
# throws error when reading a file with fasta extention,
# but with the wrong format
self.assertRaises(ValueError, ChainCollection.load_from_file, './tests/Data/NotAFASTAFile.fasta')
def test_ChainCollection_chain(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json')
self.assertEqual(antibody_collection_1.chain, 'heavy')
@unittest.skipUnless(check_connection(URL=IGBLAST_URL), 'No internet connection, skipping test.')
def test_ChainCollection_chain_2(self):
# checks if the chain type is read properly from a Chain object
test_chain = Chain(sequence=self.chain_test_sequence)
test_chain.load()
test_collection = ChainCollection(antibody_objects=[test_chain])
self.assertEqual(test_collection.chain, 'heavy')
def test_ChainCollection_proto_io_1(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.save(file_format='pb2', path='./tests/chain_collection_1_heavy')
self.assertTrue(os.path.isfile('./tests/chain_collection_1_heavy.pb2'))
def test_ChainCollection_proto_io_2(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/chain_collection_1_heavy.pb2',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.names[0], 'test')
def test_ChainCollection_n_ab(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.n_ab, 1)
def test_ChainCollection_numbering_scheme(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_scheme, 'chothia')
def test_ChainCollection_numbering_scheme_kabat(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.set_numbering_scheme('kabat', realign=False)
self.assertEqual(antibody_collection_1.numbering_scheme, 'kabat')
def test_ChainCollection_Hmatrix_shape(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
# if this fails it means that abysis has been updated
self.assertEqual(antibody_collection_1.hydrophobicity_matrix().shape, (1, 158))
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_ChainCollection_Hmatrix_calculation(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='chothia')
self.assertEqual(antibody_collection_1.hydrophobicity_matrix().shape, (1, 158))
def test_ChainCollection_sequence_length(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(len(antibody_collection_1.sequences), 1)
def test_ChainCollection_obj_length(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(len(antibody_collection_1), 1)
def test_ChainCollection_slicing_1_obj(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
# if returning a single chain abpytools automatically creates a new Chain object
self.assertIsInstance(antibody_collection_1[0], Chain)
def test_ChainCollection_slicing_2_obj(self):
antibody_collection_1 = ChainCollection.load_from_file(
path='./tests/Data/chain_collection_heavy_2_sequences.json', show_progressbar=False, verbose=False)
# slicing multiple sequences returns a ChainCollection object
self.assertIsInstance(antibody_collection_1[[0, 1]], ChainCollection)
def test_ChainCollection_cdr_regions_part1(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertCountEqual(antibody_collection_1.ab_region_index().keys(),
[self.antibody_collection_1_name])
def test_ChainCollection_cdr_regions_part2(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertCountEqual(antibody_collection_1.ab_region_index()[self.antibody_collection_1_name],
['CDR', 'FR'])
def test_ChainCollection_cdr_regions_part3_cdr(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertCountEqual(antibody_collection_1.ab_region_index()[self.antibody_collection_1_name]['CDR'],
['CDR1', 'CDR2', 'CDR3'])
def test_ChainCollection_cdr_regions_part3_fr(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertCountEqual(antibody_collection_1.ab_region_index()[self.antibody_collection_1_name]['FR'],
['FR1', 'FR2', 'FR3', 'FR4'])
def test_ChainCollection_total_charge(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.total_charge[self.antibody_collection_1_name], 1.3278508)
def test_ChainCollection_igblast_parser_germline(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.igblast_local_query('tests/Data/chain_collection_1_igblast.html')
self.assertEqual(antibody_collection_1.germline[self.antibody_collection_1_name][0], 'IGHV4-34*01')
def test_ChainCollection_igblast_parser_germline_score(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.igblast_local_query('tests/Data/chain_collection_1_igblast.html')
self.assertAlmostEqual(antibody_collection_1.germline[self.antibody_collection_1_name][1], 9.11e-69,
delta=10e-9)
@unittest.skipUnless(check_connection(URL=IGBLAST_URL), 'No internet connection, skipping test.')
def test_ChainCollection_igblast_server_query_germline(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.igblast_server_query(show_progressbar=False)
self.assertEqual(antibody_collection_1.germline[self.antibody_collection_1_name][0], 'IGHV4-34*01')
@unittest.skipUnless(check_connection(URL=IGBLAST_URL), 'No internet connection, skipping test.')
def test_ChainCollection_igblast_server_query_score(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.igblast_server_query(show_progressbar=False)
self.assertAlmostEqual(antibody_collection_1.germline[self.antibody_collection_1_name][1], 9.11e-69,
delta=10e-9)
@unittest.skipUnless(check_connection(URL=IGBLAST_URL), 'No internet connection, skipping test.')
def test_ChainCollection_igblast_server_query_identity(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.igblast_server_query(show_progressbar=False)
self.assertEqual(antibody_collection_1.germline_identity[self.antibody_collection_1_name]['Total'], 96.9)
def test_ChainCollection_slicing(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertIsInstance(antibody_collection_1.get_object('test'), Chain)
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_Chain_abysis_parser(self):
antibody = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='chothia', verbose=False, show_progressbar=False)
self.assertEqual(antibody.chain, 'heavy')
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_Chain_abysis_parser_chothia(self):
antibody = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='chothia', verbose=False, show_progressbar=False)
self.assertEqual(antibody.numbering_table(as_array=True)[0][-1], '-')
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_Chain_abysis_parser_kabat(self):
antibody = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='kabat', verbose=False, show_progressbar=False)
self.assertEqual(antibody.numbering_table(as_array=True)[0][-1], '-')
def test_ChainCollection_numbering_tableDataFrame(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_table(as_array=False)['CDR1']['H32'].values[0], 'Y')
def test_ChainCollection_numbering_table_shape_np(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_table(as_array=True).shape, (1, 158))
def test_ChainCollection_numbering_table_shape_pd(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_table(as_array=False).shape, (1, 158))
def test_ChainCollection_numbering_table_region_pd(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(
antibody_collection_1.numbering_table(region='CDR1').loc[self.antibody_collection_1_name].values[-1], 'Y')
def test_ChainCollection_numbering_table_region_np(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_table(as_array=True, region='CDR1')[0][-1], 'Y')
def test_ChainCollection_numbering_table_fr_region(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.numbering_table(region='FR1').loc['test'].values[0], 'Q')
def test_ChainCollection_molecular_weight(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.molecular_weights(monoisotopic=False)[0], 20029.85217699999)
def test_ChainCollection_molecular_weight_monoisotopic(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.molecular_weights(monoisotopic=True)[0], 20042.1121)
def test_ChainCollection_ec(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.extinction_coefficients(reduced=False)[0], 52410.0)
def test_ChainCollection_ec_reduced(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.extinction_coefficients(reduced=True)[0], 52160.0)
def test_ChainCollection_charge(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertAlmostEqual(antibody_collection_1.charge.sum(), 1.7497642167513607)
def test_ChainCollection_get_object_exception(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertRaises(ValueError, antibody_collection_1.get_object, 'foo')
def test_ChainCollection_get_object_1(self):
# check if get_object returns a Chain object
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertIsInstance(antibody_collection_1.get_object('test'), Chain)
def test_ChainCollection_get_object_2(self):
# check if get_object returns a Chain object and keeps the information (i.e. name)
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.get_object('test').name, 'test')
def test_ChainCollection_add(self):
# check if adding two ChainCollection objects with one sequence each
# results in a ChainCollection object with two sequences
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_2 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_2_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_3 = antibody_collection_1 + antibody_collection_2
self.assertEqual(antibody_collection_3.n_ab, 2)
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_ChainCollection_add_exception_1(self):
        # adding two ChainCollection objects with different numbering
        # schemes raises a ValueError
antibody_chothia = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='chothia',
show_progressbar=False, verbose=False)
antibody_kabat = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='kabat',
show_progressbar=False, verbose=False)
self.assertRaises(ValueError, operator.add, antibody_chothia, antibody_kabat)
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_ChainCollection_add_exception_2(self):
antibody_chothia = ChainCollection.load_from_file(path='./tests/Data/chain_collection_fasta_test.fasta',
numbering_scheme='chothia', show_progressbar=False,
verbose=False)
antibody_kabat = Chain(sequence=read_sequence('./tests/Data/chain_collection_fasta_test.fasta'),
numbering_scheme='kabat')
antibody_kabat.load()
self.assertRaises(ValueError, operator.add, antibody_chothia, antibody_kabat)
def test_ChainCollection_add_exception_3(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
self.assertRaises(ValueError, operator.add, antibody_collection_1, 0)
@unittest.skipUnless(check_connection(URL=ABNUM_URL), 'No internet connection, skipping test.')
def test_ChainCollection_fasta(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.save(file_format='fasta', path='./tests/SaveTest')
antibody_collection_2 = ChainCollection.load_from_file(path='./tests/SaveTest.fasta',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.sequences[0], antibody_collection_2.sequences[0])
def test_ChainCollection_json(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.save(file_format='json', path='./tests/SaveTest')
antibody_collection_2 = ChainCollection.load_from_file(path='./tests/SaveTest.json',
show_progressbar=False, verbose=False)
self.assertEqual(antibody_collection_1.sequences[0], antibody_collection_2.sequences[0])
def test_ChainCollection_append_1(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_2 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_2_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.append(antibody_collection_2)
self.assertEqual(antibody_collection_1.n_ab, 2)
def test_ChainCollection_append_2(self):
antibody_collection_1 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_1_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_2 = ChainCollection.load_from_file(path='./tests/Data/chain_collection_2_heavy.json',
show_progressbar=False, verbose=False)
antibody_collection_1.append(antibody_collection_2)
self.assertEqual(antibody_collection_1.hydrophobicity_matrix().shape, (2, 158))
@classmethod
def tearDownClass(cls):
for name in glob('./tests/*'):
if name.split('.')[-1] != 'py' and os.path.isfile(name):
os.remove(name)
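The `tearDownClass` hook deletes every non-`.py` artifact the tests wrote into `./tests/`. The same pattern can be shown in isolation against a temporary directory (a sketch of the cleanup idiom, not part of the abpytools suite):

```python
import os
import tempfile
from glob import glob

def cleanup_non_py(directory):
    # Mirrors tearDownClass: remove every regular file whose extension
    # is not .py, leaving test modules untouched.
    removed = []
    for name in glob(os.path.join(directory, "*")):
        if name.split(".")[-1] != "py" and os.path.isfile(name):
            os.remove(name)
            removed.append(name)
    return removed

tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "keep.py"), "w").close()
open(os.path.join(tmp, "artifact.json"), "w").close()

removed = cleanup_non_py(tmp)
```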
# File: onnx/model/mlp.py (repo: pxssw/GCN_ActionRecongtionTools, license: MIT)
import torch
import torch.nn as nn
import torch.nn.functional as F
from model.activation import activation_factory
import ipdb
from ptflops import get_model_complexity_info
class MLP(nn.Module):
def __init__(self, in_channels, out_channels, activation='relu', dropout=0):
super().__init__()
channels = [in_channels] + out_channels
self.layers = nn.ModuleList()
for i in range(1, len(channels)):
if dropout > 0.001:
self.layers.append(nn.Dropout(p=dropout))
# self.layers.append(nn.Conv2d(channels[i-1], channels[i], kernel_size=1))
            # 1x1 bottleneck: channels[i-1] -> 16 -> channels[i]
self.layers.append(nn.Conv2d(channels[i-1], 16, kernel_size=1))
self.layers.append(nn.BatchNorm2d(16))
self.layers.append(activation_factory(activation))
# 16-32
# self.layers.append(nn.Conv2d(16, 32, kernel_size=1))
# self.layers.append(nn.BatchNorm2d(32))
# self.layers.append(activation_factory(activation))
            # 16 -> channels[i] (out_channels)
self.layers.append(nn.Conv2d(16, channels[i], kernel_size=1))
self.layers.append(nn.BatchNorm2d(channels[i]))
self.layers.append(activation_factory(activation))
def forward(self, x):
# Input shape: (N,C,T,V)
# ipdb.set_trace()
for layer in self.layers:
x = layer(x)
return x
class MLP2(nn.Module):
def __init__(self, in_channels, out_channels, activation='relu', dropout=0):
super().__init__()
channels = [in_channels] + out_channels
self.layers = nn.ModuleList()
for i in range(1, len(channels)):
if dropout > 0.001:
self.layers.append(nn.Dropout(p=dropout))
# self.layers.append(nn.Conv2d(channels[i-1], channels[i], kernel_size=1))
#1*1 in_channels-16-32-out_channels
self.layers.append(nn.Conv2d(channels[i-1], 16, kernel_size=1))
self.layers.append(nn.BatchNorm2d(16))
self.layers.append(activation_factory(activation))
# 16-32
# self.layers.append(nn.Conv2d(64, 128, kernel_size=1))
# self.layers.append(nn.BatchNorm2d(128))
# self.layers.append(activation_factory(activation))
            # 16 -> channels[i] (out_channels)
self.layers.append(nn.Conv2d(16, channels[i], kernel_size=1))
self.layers.append(nn.BatchNorm2d(channels[i]))
self.layers.append(activation_factory(activation))
def forward(self, x):
# Input shape: (N,C,T,V)
for layer in self.layers:
x = layer(x)
return x
if __name__ == "__main__":
    # NOTE: running on CPU raises errors, so the model is moved to CUDA
msgcn = MLP2(3 * 13, [128]).cuda()
msgcn.forward(torch.randn(1,3*13,10,25).cuda())
flops, params = get_model_complexity_info(msgcn, (3*13, 10, 25), as_strings=True, print_per_layer_stat=True)
print("%s |%s |%s" % ('MSG3D', flops, params))
# File: run_game.py (repo: trascen/u2665, license: Unlicense)
#! /usr/bin/env python
from gamelib.depcheck import check_dependencies
from gamelib.depcheck import check_version
if check_version() and check_dependencies():
from gamelib import main
    main.main()
# File: podite/types/array.py (repo: nimily/pod, license: MIT)
from .atomic import U32
from ..bytes import BYTES_CATALOG
from .._utils import _GetitemToCall, get_concrete_type, get_calling_module
from ..json import JSON_CATALOG
from ..decorators import pod
def _fixed_len_array(name, type_, length, autopad=False):
module = get_calling_module()
@pod(dataclass_fn=None)
class _ArrayPod:
@classmethod
def _is_static(cls) -> bool:
return BYTES_CATALOG.is_static(get_concrete_type(module, type_))
@classmethod
def _calc_size(cls, obj, **kwargs):
return cls._calc_max_size()
@classmethod
def _calc_max_size(cls):
return (
BYTES_CATALOG.calc_max_size(get_concrete_type(module, type_)) * length
)
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
result = []
for _ in range(length):
value = BYTES_CATALOG.unpack_partial(
get_concrete_type(module, type_), buffer
)
result.append(value)
return result
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
if len(obj) != length:
raise ValueError("Length of array does not equal fixed length")
for elem in obj:
BYTES_CATALOG.pack_partial(
get_concrete_type(module, type_), buffer, elem
)
@classmethod
def _to_dict(cls, obj):
return [JSON_CATALOG.pack(get_concrete_type(module, type_), e) for e in obj]
@classmethod
def _from_dict(cls, raw):
return [
JSON_CATALOG.unpack(get_concrete_type(module, type_), e) for e in raw
]
_ArrayPod.__name__ = f"{name}[{type_}, {length}]"
_ArrayPod.__qualname__ = _ArrayPod.__name__
return _ArrayPod
def _fixed_len_bytes(name, length):
@pod(dataclass_fn=None)
class _BytesPod:
@classmethod
def _is_static(cls) -> bool:
return True
@classmethod
def _calc_size(cls, obj, **kwargs):
return cls._calc_max_size()
@classmethod
def _calc_max_size(cls):
return length
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
val = buffer.read(length)
if len(val) != length:
raise ValueError(f"Buffer length is {len(val)}, but expected {length}")
return val
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
if len(obj) > length:
raise ValueError(f"len(obj) is {len(obj)}, but expected at most {length}")
buffer.write(obj.ljust(length, b"\x00"))
@classmethod
def _to_dict(cls, obj):
return list(obj)
@classmethod
def _from_dict(cls, raw):
return bytes(raw)
_BytesPod.__name__ = f"{name}[{length}]"
_BytesPod.__qualname__ = _BytesPod.__name__
return _BytesPod
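The fixed-length bytes pod above right-pads short values with NUL bytes on write and reads back exactly `length` bytes, rejecting short buffers. A standalone sketch of that round trip (plain Python, not the podite API; `LENGTH` is an arbitrary value chosen for illustration):

```python
from io import BytesIO

# Illustrative fixed length; podite would take this from FixedLenBytes[n].
LENGTH = 8

def pack_fixed(obj: bytes) -> bytes:
    # Right-pad with NUL bytes up to the fixed length, as _to_bytes_partial does.
    if len(obj) > LENGTH:
        raise ValueError("value longer than fixed length")
    return obj.ljust(LENGTH, b"\x00")

def unpack_fixed(buffer: BytesIO) -> bytes:
    # Read exactly LENGTH bytes, failing on a short buffer.
    val = buffer.read(LENGTH)
    if len(val) != LENGTH:
        raise ValueError("buffer too short")
    return val

packed = pack_fixed(b"ab")
assert packed == b"ab" + b"\x00" * 6
assert unpack_fixed(BytesIO(packed)) == packed
```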
def _fixed_len_str(name, length, encoding="UTF-8", autopad=True):
@pod(dataclass_fn=None)
class _StrPod:
@classmethod
def _is_static(cls) -> bool:
return True
@classmethod
def _calc_size(cls, obj, **kwargs):
return length
@classmethod
def _calc_max_size(cls):
return length
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
encoded = buffer.read(length)
if autopad:
last = 0
while last < len(encoded) and encoded[last] != 0:
last += 1
encoded = encoded[:last]
return encoded.decode(encoding)
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
encoded = obj.encode(encoding)
if len(encoded) > length:
raise ValueError(f"Encoded length {len(encoded)} exceeds fixed length {length}")
elif len(encoded) < length and not autopad:
raise ValueError(f"Encoded length {len(encoded)} is below fixed length {length} and autopad is disabled")
buffer.write(encoded.ljust(length, b"\x00"))
@classmethod
def _to_dict(cls, obj):
return obj
@classmethod
def _from_dict(cls, raw):
return raw
_StrPod.__name__ = f"{name}[{length}, encoding={encoding}]"
_StrPod.__qualname__ = _StrPod.__name__
return _StrPod
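The autopad branch of `_from_bytes_partial` above scans for the first NUL byte and drops everything from there on before decoding. A minimal standalone version of that scan (an illustration of the logic, not podite's public API):

```python
def strip_autopad(encoded: bytes) -> bytes:
    # Truncate at the first NUL byte, mirroring the while-loop in
    # _from_bytes_partial; bytes after the padding boundary are discarded.
    last = 0
    while last < len(encoded) and encoded[last] != 0:
        last += 1
    return encoded[:last]

# "abc" padded out to five bytes decodes back to "abc".
assert strip_autopad(b"abc\x00\x00").decode("UTF-8") == "abc"
```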
def _var_len_array(name, type_, max_length=None, length_type=None):
module = get_calling_module()
if length_type is None:
length_type = U32
if max_length is None:
max_length = 2 ** (BYTES_CATALOG.calc_max_size(length_type) * 8)
@pod(dataclass_fn=None)
class _ArrayPod:
@classmethod
def _is_static(cls) -> bool:
return False
@classmethod
def _calc_size(cls, obj, **kwargs):
len_size = BYTES_CATALOG.calc_max_size(length_type)
ty = get_concrete_type(module, type_)
body_size = sum(
(BYTES_CATALOG.calc_size(ty, elem, **kwargs) for elem in obj)
)
return len_size + body_size
@classmethod
def _calc_max_size(cls):
len_size = BYTES_CATALOG.calc_max_size(length_type)
body_size = (
BYTES_CATALOG.calc_max_size(get_concrete_type(module, type_))
* max_length
)
return len_size + body_size
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
length = BYTES_CATALOG.unpack_partial(length_type, buffer, **kwargs)
if length > max_length:
raise RuntimeError("actual_length > max_length")
result = []
for _ in range(length):
value = BYTES_CATALOG.unpack_partial(
get_concrete_type(module, type_), buffer
)
result.append(value)
return result
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
if len(obj) > max_length:
raise RuntimeError("actual_length > max_length")
BYTES_CATALOG.pack_partial(length_type, buffer, len(obj), **kwargs)
for elem in obj:
BYTES_CATALOG.pack_partial(
get_concrete_type(module, type_), buffer, elem
)
@classmethod
def _to_dict(cls, obj):
return [JSON_CATALOG.pack(get_concrete_type(module, type_), e) for e in obj]
@classmethod
def _from_dict(cls, raw):
return [
JSON_CATALOG.unpack(get_concrete_type(module, type_), e) for e in raw
]
_ArrayPod.__name__ = (
f"{name}[{type_}, length_type={length_type}, max_length={max_length}]"
)
_ArrayPod.__qualname__ = _ArrayPod.__name__
return _ArrayPod
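`Vec` above serializes a length prefix (U32 by default) followed by the elements in order. A self-contained sketch of that wire format using `struct` with little-endian u32 elements (plain Python, not podite's catalogs; the element type is fixed here purely for illustration):

```python
import struct
from io import BytesIO

def pack_vec_u32(values):
    # Length prefix first, then each element, as _to_bytes_partial does.
    buf = BytesIO()
    buf.write(struct.pack("<I", len(values)))
    for v in values:
        buf.write(struct.pack("<I", v))
    return buf.getvalue()

def unpack_vec_u32(data):
    # Read the prefix, then exactly that many elements, as _from_bytes_partial does.
    buf = BytesIO(data)
    (length,) = struct.unpack("<I", buf.read(4))
    return [struct.unpack("<I", buf.read(4))[0] for _ in range(length)]

assert unpack_vec_u32(pack_vec_u32([1, 2, 3])) == [1, 2, 3]
```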
def _var_len_bytes(name, max_length=None, length_type=None):
if length_type is None:
length_type = U32
if max_length is None:
max_length = 2 ** (BYTES_CATALOG.calc_max_size(length_type) * 8)
@pod(dataclass_fn=None)
class _BytesPod:
@classmethod
def _is_static(cls) -> bool:
return False
@classmethod
def _calc_size(cls, obj, **kwargs):
len_size = BYTES_CATALOG.calc_max_size(length_type)
return len_size + len(obj)
@classmethod
def _calc_max_size(cls):
len_size = BYTES_CATALOG.calc_max_size(length_type)
body_size = max_length
return len_size + body_size
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
length = BYTES_CATALOG.unpack_partial(length_type, buffer, **kwargs)
if length > max_length:
raise RuntimeError("actual_length > max_length")
return buffer.read(length)
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
if len(obj) > max_length:
raise RuntimeError("actual_length > max_length")
BYTES_CATALOG.pack_partial(length_type, buffer, len(obj), **kwargs)
buffer.write(obj)
@classmethod
def _to_dict(cls, obj):
return list(obj)
@classmethod
def _from_dict(cls, raw):
return bytes(raw)
_BytesPod.__name__ = f"{name}[length_type={length_type}, max_length={max_length}]"
_BytesPod.__qualname__ = _BytesPod.__name__
return _BytesPod
def _var_len_str(name, max_length=None, length_type=None, encoding="UTF-8"):
if length_type is None:
length_type = U32
if max_length is None:
max_length = 2 ** (BYTES_CATALOG.calc_max_size(length_type) * 8)
@pod(dataclass_fn=None)
class _StrPod:
@classmethod
def _is_static(cls) -> bool:
return False
@classmethod
def _calc_size(cls, obj, **kwargs):
len_size = BYTES_CATALOG.calc_max_size(length_type)
return len_size + len(obj.encode(encoding))
@classmethod
def _calc_max_size(cls):
len_size = BYTES_CATALOG.calc_max_size(length_type)
body_size = max_length
return len_size + body_size
@classmethod
def _from_bytes_partial(cls, buffer, **kwargs):
length = BYTES_CATALOG.unpack_partial(length_type, buffer, **kwargs)
if length > max_length:
raise RuntimeError("actual_length > max_length")
return buffer.read(length).decode(encoding)
@classmethod
def _to_bytes_partial(cls, buffer, obj, **kwargs):
encoded = obj.encode(encoding)
if len(encoded) > max_length:
raise RuntimeError("actual_length > max_length")
BYTES_CATALOG.pack_partial(length_type, buffer, len(encoded), **kwargs)
buffer.write(encoded)
@classmethod
def _to_dict(cls, obj):
return obj
@classmethod
def _from_dict(cls, raw):
return raw
_StrPod.__name__ = f"{name}[max_length={max_length}, length_type={length_type}, encoding={encoding}]"
_StrPod.__qualname__ = _StrPod.__name__
return _StrPod
FixedLenArray = _GetitemToCall("FixedLenArray", _fixed_len_array)
FixedLenBytes = _GetitemToCall("FixedLenBytes", _fixed_len_bytes)
FixedLenStr = _GetitemToCall("FixedLenStr", _fixed_len_str)
Vec = _GetitemToCall("Vec", _var_len_array)
Bytes = _GetitemToCall("Bytes", _var_len_bytes)
Str = _GetitemToCall("Str", _var_len_str)
| 30.048023 | 105 | 0.594246 | 1,204 | 10,637 | 4.866279 | 0.083056 | 0.105137 | 0.035672 | 0.043011 | 0.804916 | 0.79092 | 0.786312 | 0.772145 | 0.71275 | 0.696194 | 0 | 0.003294 | 0.315126 | 10,637 | 353 | 106 | 30.133144 | 0.800961 | 0 | 0 | 0.739777 | 0 | 0 | 0.059979 | 0.015418 | 0 | 0 | 0 | 0 | 0 | 1 | 0.185874 | false | 0 | 0.018587 | 0.096654 | 0.390335 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d68d2060a108f5b35279a06a24abed421fb37011 | 49,265 | py | Python | haas/tests/test_result.py | pzahemszky/haas | d124292862b82813d6e22788a5ca477d2e248936 | [
"BSD-3-Clause"
] | null | null | null | haas/tests/test_result.py | pzahemszky/haas | d124292862b82813d6e22788a5ca477d2e248936 | [
"BSD-3-Clause"
] | null | null | null | haas/tests/test_result.py | pzahemszky/haas | d124292862b82813d6e22788a5ca477d2e248936 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (c) 2013-2014 Simon Jagoe
# All rights reserved.
#
# This software may be modified and distributed under the terms
# of the 3-clause BSD license. See the LICENSE.txt file for details.
from __future__ import absolute_import, unicode_literals
from datetime import datetime, timedelta
from time import ctime
import sys
from mock import Mock, patch
from six.moves import StringIO
from testfixtures import ShouldWarn
import six
from ..plugins.i_result_handler_plugin import IResultHandlerPlugin
from ..plugins.result_handler import (
QuietTestResultHandler, StandardTestResultHandler,
VerboseTestResultHandler)
from ..result import (
ResultCollector, TestResult, TestCompletionStatus, TestDuration,
ResultCollecter,
)
from ..testing import unittest
from . import _test_cases, _test_case_data
from .fixtures import ExcInfoFixture, MockDateTime
class TestTextTestResult(ExcInfoFixture, unittest.TestCase):
def test_result_collector_calls_handlers_start_stop_methods(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
case = _test_cases.TestCase('test_method')
# When
handler.reset_mock()
collector.startTestRun()
# Then
handler.start_test_run.assert_called_once_with()
self.assertFalse(handler.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
# When
handler.reset_mock()
collector.stopTestRun()
# Then
handler.stop_test_run.assert_called_once_with()
self.assertFalse(handler.called)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
# When
handler.reset_mock()
collector.startTest(case)
# Then
handler.start_test.assert_called_once_with(case)
self.assertFalse(handler.called)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.stop_test.called)
# When
handler.reset_mock()
collector.stopTest(case)
# Then
handler.stop_test.assert_called_once_with(case)
self.assertFalse(handler.called)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
def test_unicode_traceback(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
msg = '\N{GREEK SMALL LETTER PHI}'.encode('utf-8')
with self.failure_exc_info(msg) as exc_info:
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addError(case, exc_info)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertFalse(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_error(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# When
with self.exc_info(RuntimeError) as exc_info:
# Given
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addError(case, exc_info)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertFalse(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_failure(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
with self.failure_exc_info() as exc_info:
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addFailure(case, exc_info)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertFalse(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_success(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.success, expected_duration)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addSuccess(case)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertTrue(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_skip(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.skipped, expected_duration,
message='reason')
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addSkip(case, 'reason')
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertTrue(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_expected_fail(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
with self.exc_info(RuntimeError) as exc_info:
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.expected_failure, expected_duration,
exception=exc_info)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addExpectedFailure(case, exc_info)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertTrue(collector.wasSuccessful())
def test_result_collector_calls_handlers_on_unexpected_success(self):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector()
collector.add_result_handler(handler)
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# Given
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.unexpected_success, expected_duration)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addUnexpectedSuccess(case)
# Then
handler.assert_called_once_with(expected_result)
self.assertFalse(handler.start_test_run.called)
self.assertFalse(handler.stop_test_run.called)
self.assertFalse(handler.start_test.called)
self.assertFalse(handler.stop_test.called)
self.assertFalse(collector.wasSuccessful())
def test_result_collector_should_stop(self):
# Given
collector = ResultCollector()
# Then
self.assertFalse(collector.shouldStop)
# When
collector.stop()
# Then
self.assertTrue(collector.shouldStop)
def test_multiple_errors_from_one_test(self):
# Given
collector = ResultCollector()
case = _test_case_data.TestWithTwoErrors('test_with_two_errors')
start_time = datetime(2016, 4, 12, 8, 17, 32)
test_end_time = datetime(2016, 4, 12, 8, 17, 38)
tear_down_end_time = datetime(2016, 4, 12, 8, 17, 39)
# When
with patch('haas.result.datetime',
new=MockDateTime([start_time, test_end_time,
tear_down_end_time])):
case.run(collector)
# Then
self.assertEqual(len(collector.errors), 2)
class TestFailfast(ExcInfoFixture, unittest.TestCase):
def test_failfast_enabled_on_error(self):
# Given
collector = ResultCollector(failfast=True)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
with self.exc_info(RuntimeError) as exc_info:
collector.addError(case, exc_info)
# Then
self.assertTrue(collector.shouldStop)
def test_failfast_enabled_on_failure(self):
# Given
collector = ResultCollector(failfast=True)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
with self.failure_exc_info() as exc_info:
collector.addFailure(case, exc_info)
# Then
self.assertTrue(collector.shouldStop)
def test_failfast_enabled_on_unexpected_success(self):
# Given
collector = ResultCollector(failfast=False)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
collector.addUnexpectedSuccess(case)
# Then
self.assertFalse(collector.shouldStop)
def test_failfast_disabled_on_error(self):
# Given
collector = ResultCollector(failfast=False)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
with self.exc_info(RuntimeError) as exc_info:
collector.addError(case, exc_info)
# Then
self.assertFalse(collector.shouldStop)
def test_failfast_disabled_on_failure(self):
# Given
collector = ResultCollector(failfast=False)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
with self.failure_exc_info() as exc_info:
collector.addFailure(case, exc_info)
# Then
self.assertFalse(collector.shouldStop)
def test_failfast_disabled_on_unexpected_success(self):
# Given
collector = ResultCollector(failfast=False)
self.assertFalse(collector.shouldStop)
case = _test_cases.TestCase('test_method')
collector.startTest(case)
# When
collector.addUnexpectedSuccess(case)
# Then
self.assertFalse(collector.shouldStop)
class TestBuffering(ExcInfoFixture, unittest.TestCase):
@patch('sys.stderr', new_callable=StringIO)
def test_buffering_stderr(self, stderr):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector(buffer=True)
collector.add_result_handler(handler)
test_stderr = 'My Test Output'
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# When
sys.stderr.write(test_stderr)
# Then
self.assertEqual(stderr.getvalue(), '')
# Given
with self.exc_info(RuntimeError) as exc_info:
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info, stderr=test_stderr)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addError(case, exc_info)
collector.stopTest(case)
# Then
self.assertIn(test_stderr, expected_result.exception)
handler.assert_called_once_with(expected_result)
@patch('sys.stdout', new_callable=StringIO)
def test_buffering_stdout(self, stdout):
# Given
handler = Mock(spec=IResultHandlerPlugin)
collector = ResultCollector(buffer=True)
collector.add_result_handler(handler)
test_stdout = 'My Test Output'
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
# When
with patch('haas.result.datetime', new=MockDateTime(start_time)):
collector.startTest(case)
# Then
self.assertTrue(handler.start_test.called)
handler.start_test.reset_mock()
# When
sys.stdout.write(test_stdout)
# Then
self.assertEqual(stdout.getvalue(), '')
# Given
with self.exc_info(RuntimeError) as exc_info:
expected_result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info, stdout=test_stdout)
# When
with patch('haas.result.datetime', new=MockDateTime(end_time)):
collector.addError(case, exc_info)
collector.stopTest(case)
# Then
self.assertIn(test_stdout, expected_result.exception)
handler.assert_called_once_with(expected_result)
class TestQuietResultHandler(ExcInfoFixture, unittest.TestCase):
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_start_test_run(self, stderr):
# Given
handler = QuietTestResultHandler(test_count=1)
# When
handler.start_test_run()
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_output_stop_test_run(self, stderr):
# Given
handler = QuietTestResultHandler(test_count=1)
handler.start_test_run()
# When
handler.stop_test_run()
# Then
output = stderr.getvalue()
self.assertTrue(output.startswith('\n' + handler.separator2))
self.assertTrue(output.endswith('OK\n'))
self.assertRegexpMatches(
output.replace('\n', ''), r'--+.*?Ran 0 tests.*?OK')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_start_test(self, stderr):
# Given
handler = QuietTestResultHandler(test_count=1)
case = _test_cases.TestCase('test_method')
# When
handler.start_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_stop_test(self, stderr):
# Given
handler = QuietTestResultHandler(test_count=1)
case = _test_cases.TestCase('test_method')
# When
handler.stop_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_error(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_failure(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_skip(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.skipped, expected_duration,
message='reason')
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_expected_fail(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.expected_failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_on_unexpected_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.unexpected_success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_output_with_error_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
handler.start_test_run()
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case,).replace('(', r'\(').replace(')', r'\)')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?RuntimeError'.format(
description))
@patch('sys.stderr', new_callable=StringIO)
def test_output_with_failure_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = QuietTestResultHandler(test_count=1)
handler.start_test_run()
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case,).replace('(', r'\(').replace(')', r'\)').replace('\n', '')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?AssertionError'.format(
description))
# The contents of unittest.TestCase should not be in the traceback
self.assertNotIn('raise', output)
class TestStandardResultHandler(ExcInfoFixture, unittest.TestCase):
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_start_test_run(self, stderr):
# Given
handler = StandardTestResultHandler(test_count=1)
# When
handler.start_test_run()
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_output_stop_test_run(self, stderr):
# Given
handler = StandardTestResultHandler(test_count=1)
handler.start_test_run()
# When
handler.stop_test_run()
# Then
output = stderr.getvalue()
self.assertTrue(output.startswith('\n' + handler.separator2))
self.assertTrue(output.endswith('OK\n'))
self.assertRegexpMatches(
output.replace('\n', ''), r'--+.*?Ran 0 tests.*?OK')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_start_test(self, stderr):
# Given
handler = StandardTestResultHandler(test_count=1)
case = _test_cases.TestCase('test_method')
# When
handler.start_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_stop_test(self, stderr):
# Given
handler = StandardTestResultHandler(test_count=1)
case = _test_cases.TestCase('test_method')
# When
handler.stop_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_error(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'E')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_failure(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'F')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, '.')
@patch('sys.stderr', new_callable=StringIO)
    def test_output_on_skip(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.skipped, expected_duration,
message='reason')
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 's')
@patch('sys.stderr', new_callable=StringIO)
    def test_output_on_expected_fail(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.expected_failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'x')
@patch('sys.stderr', new_callable=StringIO)
    def test_output_on_unexpected_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.unexpected_success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'u')
@patch('sys.stderr', new_callable=StringIO)
    def test_output_with_error_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
handler.start_test_run()
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case).replace('(', r'\(').replace(')', r'\)')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?RuntimeError'.format(
description))
@patch('sys.stderr', new_callable=StringIO)
    def test_output_with_failure_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = StandardTestResultHandler(test_count=1)
handler.start_test_run()
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case).replace('(', r'\(').replace(')', r'\)').replace('\n', '')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?AssertionError'.format(
description))
# The contents of unittest.TestCase should not be in the traceback
self.assertNotIn('raise', output)
class TestVerboseResultHandler(ExcInfoFixture, unittest.TestCase):
@patch('sys.stderr', new_callable=StringIO)
    def test_no_output_start_test_run(self, stderr):
# Given
handler = VerboseTestResultHandler(test_count=1)
# When
handler.start_test_run()
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
    def test_output_stop_test_run(self, stderr):
# Given
handler = VerboseTestResultHandler(test_count=1)
handler.start_test_run()
# When
handler.stop_test_run()
# Then
output = stderr.getvalue()
self.assertTrue(output.startswith('\n' + handler.separator2))
self.assertTrue(output.endswith('OK\n'))
self.assertRegexpMatches(
output.replace('\n', ''), r'--+.*?Ran 0 tests.*?OK')
@patch('time.ctime')
@patch('sys.stderr', new_callable=StringIO)
def test_output_start_test(self, stderr, mock_ctime):
# Given
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
mock_ctime.return_value = expected_time = ctime()
expected_description = handler.get_test_description(case)
# When
handler.start_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(
output, '[{0}] (1/1) {1} ... '.format(
expected_time, expected_description))
@patch('sys.stderr', new_callable=StringIO)
def test_no_output_stop_test(self, stderr):
# Given
handler = VerboseTestResultHandler(test_count=1)
case = _test_cases.TestCase('test_method')
# When
handler.stop_test(case)
# Then
output = stderr.getvalue()
self.assertEqual(output, '')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_error(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'ERROR\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_failure(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'FAIL\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'ok\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_skip(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.skipped, expected_duration,
message='reason')
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'skipped \'reason\'\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_expected_fail(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.expected_failure, expected_duration,
exception=exc_info)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'expected failure\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_on_unexpected_success(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
result = TestResult.from_test_case(
case, TestCompletionStatus.unexpected_success, expected_duration)
# When
handler(result)
# Then
output = stderr.getvalue()
self.assertEqual(output, 'unexpected success\n')
@patch('sys.stderr', new_callable=StringIO)
def test_output_with_error_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
handler.start_test_run()
with self.exc_info(RuntimeError) as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.error, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case).replace('(', r'\(').replace(')', r'\)')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?RuntimeError'.format(
description))
@patch('sys.stderr', new_callable=StringIO)
def test_output_with_failure_on_stop_test_run(self, stderr):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
expected_duration = TestDuration(start_time, end_time)
case = _test_cases.TestCase('test_method')
handler = VerboseTestResultHandler(test_count=1)
handler.start_test_run()
with self.failure_exc_info() as exc_info:
result = TestResult.from_test_case(
case, TestCompletionStatus.failure, expected_duration,
exception=exc_info)
# When
handler(result)
handler.stop_test_run()
# Then
output = stderr.getvalue().replace('\n', '')
description = handler.get_test_description(
case).replace('(', r'\(').replace(')', r'\)').replace('\n', '')
self.assertRegexpMatches(
output, '{0}.*?Traceback.*?AssertionError'.format(
description))
# The contents of unittest.TestCase should not be in the traceback
self.assertNotIn('raise', output)
class TestTestDurationOrdering(unittest.TestCase):
@unittest.skipIf(sys.version_info < (3,),
'Python 2 does not raise on unorderable types')
def test_unorderable_types(self):
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
duration = TestDuration(start_time, end_time)
# Then
self.assertNotEqual(duration, object())
with self.assertRaises(TypeError):
duration < object()
with self.assertRaises(TypeError):
duration > object()
def test_hash_equal(self):
# Given
start_time1 = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time1 = start_time1 + duration
duration1 = TestDuration(start_time1, end_time1)
start_time2 = datetime(2015, 12, 23, 8, 14, 12)
end_time2 = start_time2 + duration
duration2 = TestDuration(start_time2, end_time2)
# Then
self.assertEqual(hash(duration1), hash(duration2))
def test_hash_not_equal(self):
# Given
start_time1 = datetime(2015, 12, 23, 8, 14, 12)
duration1 = timedelta(seconds=10)
end_time1 = start_time1 + duration1
duration1 = TestDuration(start_time1, end_time1)
start_time2 = datetime(2015, 12, 23, 8, 14, 12)
duration2 = timedelta(seconds=15)
end_time2 = start_time2 + duration2
duration2 = TestDuration(start_time2, end_time2)
# Then
self.assertNotEqual(hash(duration1), hash(duration2))
def test_equality(self):
# Given
start_time = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time = start_time + duration
duration1 = TestDuration(start_time, end_time)
duration2 = TestDuration(start_time, end_time)
self.assertIsNot(duration1, duration2)
# When/Then
self.assertEqual(duration1, duration2)
self.assertLessEqual(duration1, duration2)
self.assertGreaterEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not less than'):
self.assertLess(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not greater than'):
self.assertGreater(duration1, duration2)
self.assertNotEqual(duration1, object())
# Given
start_time1 = datetime(2015, 12, 23, 8, 14, 12)
duration = timedelta(seconds=10)
end_time1 = start_time1 + duration
duration1 = TestDuration(start_time1, end_time1)
start_time2 = datetime(2015, 12, 23, 8, 14, 12)
end_time2 = start_time2 + duration
duration2 = TestDuration(start_time2, end_time2)
# When/Then
self.assertEqual(duration1, duration2)
self.assertLessEqual(duration1, duration2)
self.assertGreaterEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not less than'):
self.assertLess(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not greater than'):
self.assertGreater(duration1, duration2)
def test_lessthan(self):
# Given
start_time1 = datetime(2015, 12, 23, 8, 14, 12)
duration1 = timedelta(seconds=10)
end_time1 = start_time1 + duration1
duration1 = TestDuration(start_time1, end_time1)
start_time2 = datetime(2014, 12, 23, 8, 14, 12)
duration2 = timedelta(seconds=15)
end_time2 = start_time2 + duration2
duration2 = TestDuration(start_time2, end_time2)
# When/Then
self.assertNotEqual(duration1, duration2)
self.assertLess(duration1, duration2)
self.assertLessEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not greater than or equal to'):
self.assertGreaterEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not greater than'):
self.assertGreater(duration1, duration2)
def test_greaterthan(self):
# Given
start_time1 = datetime(2015, 12, 23, 8, 14, 12)
duration1 = timedelta(seconds=15)
end_time1 = start_time1 + duration1
duration1 = TestDuration(start_time1, end_time1)
start_time2 = datetime(2014, 12, 23, 8, 14, 12)
duration2 = timedelta(seconds=5)
end_time2 = start_time2 + duration2
duration2 = TestDuration(start_time2, end_time2)
# When/Then
self.assertNotEqual(duration1, duration2)
self.assertGreater(duration1, duration2)
self.assertGreaterEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not less than or equal to'):
self.assertLessEqual(duration1, duration2)
with six.assertRaisesRegex(
self, self.failureException, 'not less than'):
self.assertLess(duration1, duration2)
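The ordering and hashing behaviour exercised by the tests above (equality, `<`/`>` on durations, `TypeError` against unrelated types, hash consistency) can be implemented compactly with `functools.total_ordering`. This `TestDuration` sketch is an assumed reconstruction for illustration, not the actual class under test: comparisons are by elapsed time, and returning `NotImplemented` for foreign types is what makes `<` and `>` raise `TypeError` on Python 3.

```python
from datetime import datetime, timedelta
from functools import total_ordering

@total_ordering
class TestDuration:
    """Assumed sketch: orders and hashes by elapsed time (end - start)."""

    def __init__(self, start_time, end_time):
        self.start_time = start_time
        self.end_time = end_time

    @property
    def duration(self):
        return self.end_time - self.start_time

    def __eq__(self, other):
        if not isinstance(other, TestDuration):
            return NotImplemented  # -> TypeError for < and > on Python 3
        return self.duration == other.duration

    def __lt__(self, other):
        if not isinstance(other, TestDuration):
            return NotImplemented
        return self.duration < other.duration

    def __hash__(self):
        return hash(self.duration)

start = datetime(2015, 12, 23, 8, 14, 12)
short = TestDuration(start, start + timedelta(seconds=10))
long_ = TestDuration(start, start + timedelta(seconds=15))
print(short < long_)  # True
```

`total_ordering` derives `<=`, `>`, and `>=` from `__eq__` and `__lt__`, which is why the tests only need those two comparisons (plus `__hash__`) to pin down the whole ordering contract.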
class TestResultCollecterDeprecated(unittest.TestCase):
def test_deprecation_warning(self):
# Given
expected_warning = DeprecationWarning(
'ResultCollecter is deprecated in favour of ResultCollector and '
'will be removed in the next release.',
)
# When/Then
with ShouldWarn(expected_warning):
ResultCollecter()
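Every test above leans on the same capture pattern: patch `sys.stderr` with a `StringIO`, drive the handler, then assert on `getvalue()`. A minimal, stdlib-only sketch of that pattern; `DotHandler` is a hypothetical stand-in, not part of the library under test:

```python
import sys
from io import StringIO
from unittest.mock import patch

class DotHandler:
    """Writes one progress character per result, like the standard handler."""
    markers = {"success": ".", "failure": "F", "error": "E", "skipped": "s"}

    def __call__(self, status):
        # sys.stderr is looked up at call time, so the patch below intercepts it.
        sys.stderr.write(self.markers[status])

@patch("sys.stderr", new_callable=StringIO)
def collect_output(statuses, stderr):
    # The patch decorator appends the replacement StringIO as the last argument.
    handler = DotHandler()
    for status in statuses:
        handler(status)
    return stderr.getvalue()

print(collect_output(["success", "failure", "error", "skipped"]))  # .FEs
```

Because the lookup of `sys.stderr` happens inside the call, nothing in the handler needs to know it is being tested; the same object writes to the real stream in production and to the `StringIO` under `@patch`.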
# --- test/dev_test/recv.py (repo: loopyme/knight-bus, license: MIT) ---
from knight_bus.Receiver import Receiver
key = b"-----BEGIN RSA PRIVATE KEY-----\nMIICXQIBAAKBgQDZopcygBiwWt0LE7FeWohNHtq/bQG8gWHdYnvjrJDYDGyl4vAy\nrc7Ce96k1V+JAsLYCaSMabw5Y0c18Dt7lw8cXiAEylZbycECpRvssKM+wEkf6Hml\n3Lyc7xOLTTad9zpYePxjgjPRAyLzbG7ADKMPBBmLW9kNxNv1llZa6wMrpQIDAQAB\nAoGALUTkrlx2xjggQm2WN0odj+0bEzZZZhyDfsk9e94pQsdS0i6iR+hfWZTqet7n\nQFiSrt1SnOJhhI5iAZY2yT5ipeGD31s+ldF0b/k2lH3dSHGHkRYn7wJ+esmo+KMw\nUyhwk8P+otnydxHr9ABVu05F9Nk0B5NsYGv269WUHsAK8AECQQDcxfLooTiVILop\nbD5tg4wcG7PMKjbX4anqyasbPtoNt0DJMfPZmRqwkMqlg+6oU2eC2w9qQP47Q7D0\n7qdK2b/xAkEA/Fx1DZ1hZumSJG1R70JvDtgKM0bsa1zNmw29HMREDEvgnWrCIGRO\n4O/LZpWRBCMgTzMNOyKBQj7h132N/SAa9QJADwEM/y5l0AzHWiVXIM496XMghxGf\nZJCboa9PB6z/2MrJhmL0tacoHzPX8ePDhoEUmdoVdB0yqghxsFO/3uBpoQJBANin\nZ7awjpaTn+u2Dsmh90Z/IwKyuPXTTpD3UowH04PbAJMkvFSiyTVDqRQBA+bRYUOd\nSJakIOSGp80g9W2CyqECQQCuA9w9YwZm/KioEx+Wi9l9VaejU/lGSU1CE0RWOGlT\n2VYP0tcZ2/NSXFNnGHCk70SmJnEG3pVuh0kPnTl1lC8U\n-----END RSA PRIVATE KEY-----"
r = Receiver(key=key, encrypt_method='none')
print(r.recv())
| 144.857143 | 909 | 0.905325 | 65 | 1,014 | 14.092308 | 0.861538 | 0.024017 | 0.028384 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12173 | 0.019724 | 1,014 | 6 | 910 | 169 | 0.799799 | 0 | 0 | 0 | 0 | 0.25 | 0.891519 | 0.844181 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# --- app/controllers/feed/routes.py (repo: Igor-Felipy/Tcc-back, license: Apache-2.0) ---
from . import feed
from ..auth.authenticate import jwt_required
from app.models.tables import Follow, Post
from flask import jsonify
from app import db
@feed.route("/neutral", methods=["POST"])
@jwt_required
def feed_neutral(current_user):
    # Fetch posts authored by users the current user follows. The original
    # query also joined `users` on a condition that never referenced it,
    # duplicating every row once per user; a WHERE filter expresses the
    # intent directly.
    posts = db.session.connection().execute(f"""
        SELECT posts.* FROM posts
        INNER JOIN follow
        ON posts.user_id = follow.user_id
        WHERE follow.follower_id = {current_user.id};
    """)
    profile_image = db.session.connection().execute(f"""
        SELECT profile_image FROM users
        WHERE id = {current_user.id};
    """).scalar()
    post_converted = list()
    for post in posts:
        post_converted.append({
            "id": post.id,
            "caption": post.caption,
            "image": post.image,
            "date": post.date,
            "user_id": post.user_id,
            "profile_image": profile_image
        })
    return jsonify(post_converted)
@feed.route("/happy", methods=["POST"])
@jwt_required
def feed_happy(current_user):
    posts = db.session.connection().execute(f"""
        SELECT posts.* FROM posts
        INNER JOIN follow
        ON posts.user_id = follow.user_id
        WHERE follow.follower_id = {current_user.id};
    """)
    post_converted = list()
    for post in posts:
        # The per-post sentiment query was left incomplete ("SELECET"); the
        # feelings table and pos/neg columns are assumed from how the row is
        # used below.
        post_feeling = db.session.connection().execute(f"""
            SELECT pos, neg FROM feelings
            WHERE post_id = {post.id};
        """).fetchone()
        if post_feeling is not None and post_feeling.pos >= 0.1:
            post_converted.append({
                "id": post.id,
                "caption": post.caption,
                "image": post.image,
                "date": post.date,
                "user_id": post.user_id
            })
return jsonify(post_converted)
@feed.route("/sad", methods=["POST"])
@jwt_required
def feed_sad(current_user):
    posts = db.session.connection().execute(f"""
        SELECT posts.* FROM posts
        INNER JOIN follow
        ON posts.user_id = follow.user_id
        WHERE follow.follower_id = {current_user.id};
    """)
    post_converted = list()
    for post in posts:
        # The per-post sentiment query was left incomplete ("SELECET"); the
        # feelings table and pos/neg columns are assumed from how the row is
        # used below.
        post_feeling = db.session.connection().execute(f"""
            SELECT pos, neg FROM feelings
            WHERE post_id = {post.id};
        """).fetchone()
        if post_feeling is not None and post_feeling.neg >= 0.1:
            post_converted.append({
                "id": post.id,
                "caption": post.caption,
                "image": post.image,
                "date": post.date,
                "user_id": post.user_id
            })
    return jsonify(post_converted)
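The route handlers above interpolate ids into SQL with f-strings. A safer sketch of the same follower-feed query using placeholder parameters; `sqlite3` is used here only to make the demo self-contained (SQLAlchemy's `text()` with bound parameters works the same way), and the tiny schema is assumed for illustration:

```python
import sqlite3

def followed_posts(conn, follower_id):
    # Parameters travel separately from the SQL text, so user-supplied values
    # can never change the shape of the query.
    rows = conn.execute(
        """
        SELECT posts.id, posts.caption FROM posts
        INNER JOIN follow ON posts.user_id = follow.user_id
        WHERE follow.follower_id = ?;
        """,
        (follower_id,),
    )
    return [{"id": r[0], "caption": r[1]} for r in rows]

# Minimal in-memory fixture mirroring the posts/follow tables used above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE posts (id INTEGER, caption TEXT, user_id INTEGER);
    CREATE TABLE follow (user_id INTEGER, follower_id INTEGER);
    INSERT INTO posts VALUES (1, 'hello', 10), (2, 'world', 11);
    INSERT INTO follow VALUES (10, 99);
""")
print(followed_posts(conn, 99))  # [{'id': 1, 'caption': 'hello'}]
```

Even when the interpolated value is a trusted integer like `current_user.id`, placeholders keep the query plan cacheable and make the habit uniform across all call sites.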
# --- lists/views.py (repo: alexanderrq/to-do-tdd, license: MIT) ---
from django.shortcuts import render
def home_page():
    pass
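`lists/views.py` above is a deliberate first TDD increment: the smallest stub that lets an import-level test pass. A self-contained sketch of the next red/green step using plain `unittest` (no Django); the page title is an assumption from the to-do theme, not taken from the repo:

```python
import unittest

def home_page(request=None):
    # Smallest implementation that satisfies the test below; later cycles
    # would swap this for a real render() of a template.
    return "<html><title>To-Do lists</title></html>"

class HomePageTest(unittest.TestCase):
    def test_home_page_returns_correct_html(self):
        html = home_page()
        self.assertTrue(html.startswith("<html>"))
        self.assertIn("<title>To-Do lists</title>", html)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(HomePageTest))
print(result.wasSuccessful())  # True
```

The point of the cycle is that the assertion is written first and fails against `pass`; the returned string is then the cheapest way to turn the bar green before refactoring toward templates.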
# --- py_proto/modules/drivers/gnss/proto/gnss_raw_observation_pb2.py (repo: yujianyi/fusion_localization, license: Apache-2.0) ---
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: modules/drivers/gnss/proto/gnss_raw_observation.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='modules/drivers/gnss/proto/gnss_raw_observation.proto',
package='apollo.drivers.gnss',
syntax='proto2',
  serialized_pb=_b('\n5modules/drivers/gnss/proto/gnss_raw_observation.proto\x12\x13\x61pollo.drivers.gnss\"\x92\x02\n\x0f\x42\x61ndObservation\x12>\n\x07\x62\x61nd_id\x18\x01 \x01(\x0e\x32\x1f.apollo.drivers.gnss.GnssBandID:\x0c\x42\x41ND_UNKNOWN\x12\x17\n\x0f\x66requency_value\x18\x02 \x01(\x01\x12\x42\n\x0bpseudo_type\x18\x03 \x01(\x0e\x32\x1f.apollo.drivers.gnss.PseudoType:\x0c\x43ODE_UNKNOWN\x12\x14\n\x0cpseudo_range\x18\x04 \x01(\x01\x12\x15\n\rcarrier_phase\x18\x05 \x01(\x01\x12\x17\n\x0floss_lock_index\x18\x06 \x01(\r\x12\x0f\n\x07\x64oppler\x18\x07 \x01(\x01\x12\x0b\n\x03snr\x18\x08 \x01(\x02\"\xae\x01\n\x14SatelliteObservation\x12\x0f\n\x07sat_prn\x18\x01 \x01(\r\x12\x37\n\x07sat_sys\x18\x02 \x01(\x0e\x32\x1d.apollo.drivers.gnss.GnssType:\x07GPS_SYS\x12\x14\n\x0c\x62\x61nd_obs_num\x18\x03 \x01(\r\x12\x36\n\x08\x62\x61nd_obs\x18\x04 \x03(\x0b\x32$.apollo.drivers.gnss.BandObservation\"\xbb\x02\n\x10\x45pochObservation\x12\x13\n\x0breceiver_id\x18\x01 \x01(\r\x12\x43\n\x0egnss_time_type\x18\x02 \x01(\x0e\x32!.apollo.drivers.gnss.GnssTimeType:\x08GPS_TIME\x12\x11\n\tgnss_week\x18\x03 \x01(\r\x12\x15\n\rgnss_second_s\x18\x04 \x01(\x01\x12\x12\n\nposition_x\x18\x05 \x01(\x01\x12\x12\n\nposition_y\x18\x06 \x01(\x01\x12\x12\n\nposition_z\x18\x07 \x01(\x01\x12\x16\n\x0bhealth_flag\x18\x08 \x01(\r:\x01\x30\x12\x13\n\x0bsat_obs_num\x18\t \x01(\r\x12:\n\x07sat_obs\x18\n \x03(\x0b\x32).apollo.drivers.gnss.SatelliteObservation\"\xa7\x05\n\x0cKepplerOrbit\x12\x39\n\tgnss_type\x18\x01 \x01(\x0e\x32\x1d.apollo.drivers.gnss.GnssType:\x07GPS_SYS\x12\x0f\n\x07sat_prn\x18\x02 \x01(\r\x12\x43\n\x0egnss_time_type\x18\x03 \x01(\x0e\x32!.apollo.drivers.gnss.GnssTimeType:\x08GPS_TIME\x12\x0c\n\x04year\x18\x04 \x01(\r\x12\r\n\x05month\x18\x05 \x01(\r\x12\x0b\n\x03\x64\x61y\x18\x06 \x01(\r\x12\x0c\n\x04hour\x18\x07 \x01(\r\x12\x0e\n\x06minute\x18\x08 \x01(\r\x12\x10\n\x08second_s\x18\t \x01(\x01\x12\x10\n\x08week_num\x18\n \x01(\r\x12\x10\n\x08reserved\x18\x0b \x01(\x01\x12\x0b\n\x03\x61\x66\x30\x18\x0c \x01(\x01\x12\x0b\n\x03\x61\x66\x31\x18\r \x01(\x01\x12\x0b\n\x03\x61\x66\x32\x18\x0e \x01(\x01\x12\x0c\n\x04iode\x18\x0f \x01(\x01\x12\x0e\n\x06\x64\x65ltan\x18\x10 \x01(\x01\x12\n\n\x02m0\x18\x11 \x01(\x01\x12\t\n\x01\x65\x18\x12 \x01(\x01\x12\r\n\x05roota\x18\x13 \x01(\x01\x12\x0b\n\x03toe\x18\x14 \x01(\x01\x12\x0b\n\x03toc\x18\x15 \x01(\x01\x12\x0b\n\x03\x63ic\x18\x16 \x01(\x01\x12\x0b\n\x03\x63rc\x18\x17 \x01(\x01\x12\x0b\n\x03\x63is\x18\x18 \x01(\x01\x12\x0b\n\x03\x63rs\x18\x19 \x01(\x01\x12\x0b\n\x03\x63uc\x18\x1a \x01(\x01\x12\x0b\n\x03\x63us\x18\x1b \x01(\x01\x12\x0e\n\x06omega0\x18\x1c \x01(\x01\x12\r\n\x05omega\x18\x1d \x01(\x01\x12\n\n\x02i0\x18\x1e \x01(\x01\x12\x10\n\x08omegadot\x18\x1f \x01(\x01\x12\x0c\n\x04idot\x18 \x01(\x01\x12\x18\n\x10\x63odesonL2channel\x18! \x01(\x01\x12\x13\n\x0bL2Pdataflag\x18\" \x01(\r\x12\x10\n\x08\x61\x63\x63uracy\x18# \x01(\r\x12\x0e\n\x06health\x18$ \x01(\r\x12\x0b\n\x03tgd\x18% \x01(\x01\x12\x0c\n\x04iodc\x18& \x01(\x01\"\xda\x04\n\x0cGlonassOrbit\x12\x39\n\tgnss_type\x18\x01 \x01(\x0e\x32\x1d.apollo.drivers.gnss.GnssType:\x07GLO_SYS\x12\x10\n\x08slot_prn\x18\x02 \x01(\r\x12\x43\n\x0egnss_time_type\x18\x03 \x01(\x0e\x32!.apollo.drivers.gnss.GnssTimeType:\x08GLO_TIME\x12\x0b\n\x03toe\x18\x04 \x01(\x01\x12\x0c\n\x04year\x18\x05 \x01(\r\x12\r\n\x05month\x18\x06 \x01(\r\x12\x0b\n\x03\x64\x61y\x18\x07 \x01(\r\x12\x0c\n\x04hour\x18\x08 \x01(\r\x12\x0e\n\x06minute\x18\t \x01(\r\x12\x10\n\x08second_s\x18\n \x01(\x01\x12\x14\n\x0c\x66requency_no\x18\x0b \x01(\x05\x12\x10\n\x08week_num\x18\x0c \x01(\r\x12\x15\n\rweek_second_s\x18\r \x01(\x01\x12\n\n\x02tk\x18\x0e \x01(\x01\x12\x14\n\x0c\x63lock_offset\x18\x0f \x01(\x01\x12\x13\n\x0b\x63lock_drift\x18\x10 \x01(\x01\x12\x0e\n\x06health\x18\x11 \x01(\r\x12\x12\n\nposition_x\x18\x12 \x01(\x01\x12\x12\n\nposition_y\x18\x13 \x01(\x01\x12\x12\n\nposition_z\x18\x14 \x01(\x01\x12\x12\n\nvelocity_x\x18\x15 \x01(\x01\x12\x12\n\nvelocity_y\x18\x16 \x01(\x01\x12\x12\n\nvelocity_z\x18\x17 \x01(\x01\x12\x14\n\x0c\x61\x63\x63\x65lerate_x\x18\x18 \x01(\x01\x12\x14\n\x0c\x61\x63\x63\x65lerate_y\x18\x19 \x01(\x01\x12\x14\n\x0c\x61\x63\x63\x65lerate_z\x18\x1a \x01(\x01\x12\x11\n\tinfor_age\x18\x1b \x01(\x01\"\xbe\x01\n\rGnssEphemeris\x12\x39\n\tgnss_type\x18\x01 \x01(\x0e\x32\x1d.apollo.drivers.gnss.GnssType:\x07GLO_SYS\x12\x38\n\rkeppler_orbit\x18\x02 \x01(\x0b\x32!.apollo.drivers.gnss.KepplerOrbit\x12\x38\n\rglonass_orbit\x18\x03 \x01(\x0b\x32!.apollo.drivers.gnss.GlonassOrbit*\x8a\x01\n\nGnssBandID\x12\x10\n\x0c\x42\x41ND_UNKNOWN\x10\x00\x12\n\n\x06GPS_L1\x10\x01\x12\n\n\x06GPS_L2\x10\x02\x12\n\n\x06GPS_L5\x10\x03\x12\n\n\x06\x42\x44S_B1\x10\x04\x12\n\n\x06\x42\x44S_B2\x10\x05\x12\n\n\x06\x42\x44S_B3\x10\x06\x12\n\n\x06GLO_G1\x10\x07\x12\n\n\x06GLO_G2\x10\x08\x12\n\n\x06GLO_G3\x10\t*X\n\x0cGnssTimeType\x12\x10\n\x0cTIME_UNKNOWN\x10\x00\x12\x0c\n\x08GPS_TIME\x10\x01\x12\x0c\n\x08\x42\x44S_TIME\x10\x02\x12\x0c\n\x08GLO_TIME\x10\x03\x12\x0c\n\x08GAL_TIME\x10\x04*O\n\x08GnssType\x12\x0f\n\x0bSYS_UNKNOWN\x10\x00\x12\x0b\n\x07GPS_SYS\x10\x01\x12\x0b\n\x07\x42\x44S_SYS\x10\x02\x12\x0b\n\x07GLO_SYS\x10\x03\x12\x0b\n\x07GAL_SYS\x10\x04*B\n\nPseudoType\x12\x10\n\x0c\x43ODE_UNKNOWN\x10\x00\x12\x0e\n\nCORSE_CODE\x10\x01\x12\x12\n\x0ePRECISION_CODE\x10\x02')
)
_GNSSBANDID = _descriptor.EnumDescriptor(
name='GnssBandID',
full_name='apollo.drivers.gnss.GnssBandID',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='BAND_UNKNOWN', index=0, number=0,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GPS_L1', index=1, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GPS_L2', index=2, number=2,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GPS_L5', index=3, number=3,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BDS_B1', index=4, number=4,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BDS_B2', index=5, number=5,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BDS_B3', index=6, number=6,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GLO_G1', index=7, number=7,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GLO_G2', index=8, number=8,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GLO_G3', index=9, number=9,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=2331,
serialized_end=2469,
)
_sym_db.RegisterEnumDescriptor(_GNSSBANDID)
GnssBandID = enum_type_wrapper.EnumTypeWrapper(_GNSSBANDID)
_GNSSTIMETYPE = _descriptor.EnumDescriptor(
name='GnssTimeType',
full_name='apollo.drivers.gnss.GnssTimeType',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='TIME_UNKNOWN', index=0, number=0,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GPS_TIME', index=1, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BDS_TIME', index=2, number=2,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GLO_TIME', index=3, number=3,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GAL_TIME', index=4, number=4,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=2471,
serialized_end=2559,
)
_sym_db.RegisterEnumDescriptor(_GNSSTIMETYPE)
GnssTimeType = enum_type_wrapper.EnumTypeWrapper(_GNSSTIMETYPE)
_GNSSTYPE = _descriptor.EnumDescriptor(
name='GnssType',
full_name='apollo.drivers.gnss.GnssType',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='SYS_UNKNOWN', index=0, number=0,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GPS_SYS', index=1, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='BDS_SYS', index=2, number=2,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GLO_SYS', index=3, number=3,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='GAL_SYS', index=4, number=4,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=2561,
serialized_end=2640,
)
_sym_db.RegisterEnumDescriptor(_GNSSTYPE)
GnssType = enum_type_wrapper.EnumTypeWrapper(_GNSSTYPE)
_PSEUDOTYPE = _descriptor.EnumDescriptor(
name='PseudoType',
full_name='apollo.drivers.gnss.PseudoType',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='CODE_UNKNOWN', index=0, number=0,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='CORSE_CODE', index=1, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='PRECISION_CODE', index=2, number=2,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=2642,
serialized_end=2708,
)
_sym_db.RegisterEnumDescriptor(_PSEUDOTYPE)
PseudoType = enum_type_wrapper.EnumTypeWrapper(_PSEUDOTYPE)
BAND_UNKNOWN = 0
GPS_L1 = 1
GPS_L2 = 2
GPS_L5 = 3
BDS_B1 = 4
BDS_B2 = 5
BDS_B3 = 6
GLO_G1 = 7
GLO_G2 = 8
GLO_G3 = 9
TIME_UNKNOWN = 0
GPS_TIME = 1
BDS_TIME = 2
GLO_TIME = 3
GAL_TIME = 4
SYS_UNKNOWN = 0
GPS_SYS = 1
BDS_SYS = 2
GLO_SYS = 3
GAL_SYS = 4
CODE_UNKNOWN = 0
CORSE_CODE = 1
PRECISION_CODE = 2
_BANDOBSERVATION = _descriptor.Descriptor(
name='BandObservation',
full_name='apollo.drivers.gnss.BandObservation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='band_id', full_name='apollo.drivers.gnss.BandObservation.band_id', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='frequency_value', full_name='apollo.drivers.gnss.BandObservation.frequency_value', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pseudo_type', full_name='apollo.drivers.gnss.BandObservation.pseudo_type', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pseudo_range', full_name='apollo.drivers.gnss.BandObservation.pseudo_range', index=3,
number=4, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='carrier_phase', full_name='apollo.drivers.gnss.BandObservation.carrier_phase', index=4,
number=5, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='loss_lock_index', full_name='apollo.drivers.gnss.BandObservation.loss_lock_index', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='doppler', full_name='apollo.drivers.gnss.BandObservation.doppler', index=6,
number=7, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='snr', full_name='apollo.drivers.gnss.BandObservation.snr', index=7,
number=8, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=79,
serialized_end=353,
)
_SATELLITEOBSERVATION = _descriptor.Descriptor(
name='SatelliteObservation',
full_name='apollo.drivers.gnss.SatelliteObservation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='sat_prn', full_name='apollo.drivers.gnss.SatelliteObservation.sat_prn', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sat_sys', full_name='apollo.drivers.gnss.SatelliteObservation.sat_sys', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='band_obs_num', full_name='apollo.drivers.gnss.SatelliteObservation.band_obs_num', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='band_obs', full_name='apollo.drivers.gnss.SatelliteObservation.band_obs', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=356,
serialized_end=530,
)
_EPOCHOBSERVATION = _descriptor.Descriptor(
name='EpochObservation',
full_name='apollo.drivers.gnss.EpochObservation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='receiver_id', full_name='apollo.drivers.gnss.EpochObservation.receiver_id', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gnss_time_type', full_name='apollo.drivers.gnss.EpochObservation.gnss_time_type', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gnss_week', full_name='apollo.drivers.gnss.EpochObservation.gnss_week', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gnss_second_s', full_name='apollo.drivers.gnss.EpochObservation.gnss_second_s', index=3,
number=4, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_x', full_name='apollo.drivers.gnss.EpochObservation.position_x', index=4,
number=5, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_y', full_name='apollo.drivers.gnss.EpochObservation.position_y', index=5,
number=6, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_z', full_name='apollo.drivers.gnss.EpochObservation.position_z', index=6,
number=7, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='health_flag', full_name='apollo.drivers.gnss.EpochObservation.health_flag', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=True, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sat_obs_num', full_name='apollo.drivers.gnss.EpochObservation.sat_obs_num', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sat_obs', full_name='apollo.drivers.gnss.EpochObservation.sat_obs', index=9,
number=10, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=533,
serialized_end=848,
)
_KEPPLERORBIT = _descriptor.Descriptor(
name='KepplerOrbit',
full_name='apollo.drivers.gnss.KepplerOrbit',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='gnss_type', full_name='apollo.drivers.gnss.KepplerOrbit.gnss_type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sat_prn', full_name='apollo.drivers.gnss.KepplerOrbit.sat_prn', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gnss_time_type', full_name='apollo.drivers.gnss.KepplerOrbit.gnss_time_type', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='year', full_name='apollo.drivers.gnss.KepplerOrbit.year', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='month', full_name='apollo.drivers.gnss.KepplerOrbit.month', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='day', full_name='apollo.drivers.gnss.KepplerOrbit.day', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='hour', full_name='apollo.drivers.gnss.KepplerOrbit.hour', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='minute', full_name='apollo.drivers.gnss.KepplerOrbit.minute', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='second_s', full_name='apollo.drivers.gnss.KepplerOrbit.second_s', index=8,
number=9, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='week_num', full_name='apollo.drivers.gnss.KepplerOrbit.week_num', index=9,
number=10, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='reserved', full_name='apollo.drivers.gnss.KepplerOrbit.reserved', index=10,
number=11, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='af0', full_name='apollo.drivers.gnss.KepplerOrbit.af0', index=11,
number=12, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='af1', full_name='apollo.drivers.gnss.KepplerOrbit.af1', index=12,
number=13, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='af2', full_name='apollo.drivers.gnss.KepplerOrbit.af2', index=13,
number=14, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='iode', full_name='apollo.drivers.gnss.KepplerOrbit.iode', index=14,
number=15, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='deltan', full_name='apollo.drivers.gnss.KepplerOrbit.deltan', index=15,
number=16, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='m0', full_name='apollo.drivers.gnss.KepplerOrbit.m0', index=16,
number=17, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='e', full_name='apollo.drivers.gnss.KepplerOrbit.e', index=17,
number=18, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='roota', full_name='apollo.drivers.gnss.KepplerOrbit.roota', index=18,
number=19, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='toe', full_name='apollo.drivers.gnss.KepplerOrbit.toe', index=19,
number=20, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='toc', full_name='apollo.drivers.gnss.KepplerOrbit.toc', index=20,
number=21, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cic', full_name='apollo.drivers.gnss.KepplerOrbit.cic', index=21,
number=22, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='crc', full_name='apollo.drivers.gnss.KepplerOrbit.crc', index=22,
number=23, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cis', full_name='apollo.drivers.gnss.KepplerOrbit.cis', index=23,
number=24, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='crs', full_name='apollo.drivers.gnss.KepplerOrbit.crs', index=24,
number=25, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cuc', full_name='apollo.drivers.gnss.KepplerOrbit.cuc', index=25,
number=26, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cus', full_name='apollo.drivers.gnss.KepplerOrbit.cus', index=26,
number=27, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='omega0', full_name='apollo.drivers.gnss.KepplerOrbit.omega0', index=27,
number=28, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='omega', full_name='apollo.drivers.gnss.KepplerOrbit.omega', index=28,
number=29, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='i0', full_name='apollo.drivers.gnss.KepplerOrbit.i0', index=29,
number=30, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='omegadot', full_name='apollo.drivers.gnss.KepplerOrbit.omegadot', index=30,
number=31, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='idot', full_name='apollo.drivers.gnss.KepplerOrbit.idot', index=31,
number=32, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='codesonL2channel', full_name='apollo.drivers.gnss.KepplerOrbit.codesonL2channel', index=32,
number=33, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='L2Pdataflag', full_name='apollo.drivers.gnss.KepplerOrbit.L2Pdataflag', index=33,
number=34, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='accuracy', full_name='apollo.drivers.gnss.KepplerOrbit.accuracy', index=34,
number=35, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='health', full_name='apollo.drivers.gnss.KepplerOrbit.health', index=35,
number=36, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='tgd', full_name='apollo.drivers.gnss.KepplerOrbit.tgd', index=36,
number=37, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='iodc', full_name='apollo.drivers.gnss.KepplerOrbit.iodc', index=37,
number=38, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=851,
serialized_end=1530,
)
_GLONASSORBIT = _descriptor.Descriptor(
name='GlonassOrbit',
full_name='apollo.drivers.gnss.GlonassOrbit',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='gnss_type', full_name='apollo.drivers.gnss.GlonassOrbit.gnss_type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='slot_prn', full_name='apollo.drivers.gnss.GlonassOrbit.slot_prn', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='gnss_time_type', full_name='apollo.drivers.gnss.GlonassOrbit.gnss_time_type', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='toe', full_name='apollo.drivers.gnss.GlonassOrbit.toe', index=3,
number=4, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='year', full_name='apollo.drivers.gnss.GlonassOrbit.year', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='month', full_name='apollo.drivers.gnss.GlonassOrbit.month', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='day', full_name='apollo.drivers.gnss.GlonassOrbit.day', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='hour', full_name='apollo.drivers.gnss.GlonassOrbit.hour', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='minute', full_name='apollo.drivers.gnss.GlonassOrbit.minute', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='second_s', full_name='apollo.drivers.gnss.GlonassOrbit.second_s', index=9,
number=10, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='frequency_no', full_name='apollo.drivers.gnss.GlonassOrbit.frequency_no', index=10,
number=11, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='week_num', full_name='apollo.drivers.gnss.GlonassOrbit.week_num', index=11,
number=12, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='week_second_s', full_name='apollo.drivers.gnss.GlonassOrbit.week_second_s', index=12,
number=13, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='tk', full_name='apollo.drivers.gnss.GlonassOrbit.tk', index=13,
number=14, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clock_offset', full_name='apollo.drivers.gnss.GlonassOrbit.clock_offset', index=14,
number=15, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clock_drift', full_name='apollo.drivers.gnss.GlonassOrbit.clock_drift', index=15,
number=16, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='health', full_name='apollo.drivers.gnss.GlonassOrbit.health', index=16,
number=17, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_x', full_name='apollo.drivers.gnss.GlonassOrbit.position_x', index=17,
number=18, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_y', full_name='apollo.drivers.gnss.GlonassOrbit.position_y', index=18,
number=19, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='position_z', full_name='apollo.drivers.gnss.GlonassOrbit.position_z', index=19,
number=20, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='velocity_x', full_name='apollo.drivers.gnss.GlonassOrbit.velocity_x', index=20,
number=21, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='velocity_y', full_name='apollo.drivers.gnss.GlonassOrbit.velocity_y', index=21,
number=22, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='velocity_z', full_name='apollo.drivers.gnss.GlonassOrbit.velocity_z', index=22,
number=23, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='accelerate_x', full_name='apollo.drivers.gnss.GlonassOrbit.accelerate_x', index=23,
number=24, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='accelerate_y', full_name='apollo.drivers.gnss.GlonassOrbit.accelerate_y', index=24,
number=25, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='accelerate_z', full_name='apollo.drivers.gnss.GlonassOrbit.accelerate_z', index=25,
number=26, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='infor_age', full_name='apollo.drivers.gnss.GlonassOrbit.infor_age', index=26,
number=27, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=1533,
serialized_end=2135,
)
_GNSSEPHEMERIS = _descriptor.Descriptor(
name='GnssEphemeris',
full_name='apollo.drivers.gnss.GnssEphemeris',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='gnss_type', full_name='apollo.drivers.gnss.GnssEphemeris.gnss_type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='keppler_orbit', full_name='apollo.drivers.gnss.GnssEphemeris.keppler_orbit', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='glonass_orbit', full_name='apollo.drivers.gnss.GnssEphemeris.glonass_orbit', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=2138,
serialized_end=2328,
)
_BANDOBSERVATION.fields_by_name['band_id'].enum_type = _GNSSBANDID
_BANDOBSERVATION.fields_by_name['pseudo_type'].enum_type = _PSEUDOTYPE
_SATELLITEOBSERVATION.fields_by_name['sat_sys'].enum_type = _GNSSTYPE
_SATELLITEOBSERVATION.fields_by_name['band_obs'].message_type = _BANDOBSERVATION
_EPOCHOBSERVATION.fields_by_name['gnss_time_type'].enum_type = _GNSSTIMETYPE
_EPOCHOBSERVATION.fields_by_name['sat_obs'].message_type = _SATELLITEOBSERVATION
_KEPPLERORBIT.fields_by_name['gnss_type'].enum_type = _GNSSTYPE
_KEPPLERORBIT.fields_by_name['gnss_time_type'].enum_type = _GNSSTIMETYPE
_GLONASSORBIT.fields_by_name['gnss_type'].enum_type = _GNSSTYPE
_GLONASSORBIT.fields_by_name['gnss_time_type'].enum_type = _GNSSTIMETYPE
_GNSSEPHEMERIS.fields_by_name['gnss_type'].enum_type = _GNSSTYPE
_GNSSEPHEMERIS.fields_by_name['keppler_orbit'].message_type = _KEPPLERORBIT
_GNSSEPHEMERIS.fields_by_name['glonass_orbit'].message_type = _GLONASSORBIT
DESCRIPTOR.message_types_by_name['BandObservation'] = _BANDOBSERVATION
DESCRIPTOR.message_types_by_name['SatelliteObservation'] = _SATELLITEOBSERVATION
DESCRIPTOR.message_types_by_name['EpochObservation'] = _EPOCHOBSERVATION
DESCRIPTOR.message_types_by_name['KepplerOrbit'] = _KEPPLERORBIT
DESCRIPTOR.message_types_by_name['GlonassOrbit'] = _GLONASSORBIT
DESCRIPTOR.message_types_by_name['GnssEphemeris'] = _GNSSEPHEMERIS
DESCRIPTOR.enum_types_by_name['GnssBandID'] = _GNSSBANDID
DESCRIPTOR.enum_types_by_name['GnssTimeType'] = _GNSSTIMETYPE
DESCRIPTOR.enum_types_by_name['GnssType'] = _GNSSTYPE
DESCRIPTOR.enum_types_by_name['PseudoType'] = _PSEUDOTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
BandObservation = _reflection.GeneratedProtocolMessageType('BandObservation', (_message.Message,), dict(
  DESCRIPTOR = _BANDOBSERVATION,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.BandObservation)
  ))
_sym_db.RegisterMessage(BandObservation)

SatelliteObservation = _reflection.GeneratedProtocolMessageType('SatelliteObservation', (_message.Message,), dict(
  DESCRIPTOR = _SATELLITEOBSERVATION,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.SatelliteObservation)
  ))
_sym_db.RegisterMessage(SatelliteObservation)

EpochObservation = _reflection.GeneratedProtocolMessageType('EpochObservation', (_message.Message,), dict(
  DESCRIPTOR = _EPOCHOBSERVATION,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.EpochObservation)
  ))
_sym_db.RegisterMessage(EpochObservation)

KepplerOrbit = _reflection.GeneratedProtocolMessageType('KepplerOrbit', (_message.Message,), dict(
  DESCRIPTOR = _KEPPLERORBIT,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.KepplerOrbit)
  ))
_sym_db.RegisterMessage(KepplerOrbit)

GlonassOrbit = _reflection.GeneratedProtocolMessageType('GlonassOrbit', (_message.Message,), dict(
  DESCRIPTOR = _GLONASSORBIT,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.GlonassOrbit)
  ))
_sym_db.RegisterMessage(GlonassOrbit)

GnssEphemeris = _reflection.GeneratedProtocolMessageType('GnssEphemeris', (_message.Message,), dict(
  DESCRIPTOR = _GNSSEPHEMERIS,
  __module__ = 'modules.drivers.gnss.proto.gnss_raw_observation_pb2'
  # @@protoc_insertion_point(class_scope:apollo.drivers.gnss.GnssEphemeris)
  ))
_sym_db.RegisterMessage(GnssEphemeris)
# @@protoc_insertion_point(module_scope)
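The registration block above materializes each message class at import time via dynamic class creation (`_reflection.GeneratedProtocolMessageType` is invoked like a metaclass with a name, bases, and a namespace dict). A minimal stdlib-only sketch of that idiom, using illustrative names rather than the real protobuf API:

```python
# Illustrative sketch of dynamic message-class creation, the same idiom
# GeneratedProtocolMessageType uses above (names here are hypothetical).

class MessageBase:
    """Stand-in for _message.Message: stores fields as attributes."""
    def __init__(self, **fields):
        for name, value in fields.items():
            setattr(self, name, value)

def generated_message_type(name, descriptor):
    # type(name, bases, namespace) creates a new class at runtime,
    # just as GeneratedProtocolMessageType does for each message.
    return type(name, (MessageBase,), {"DESCRIPTOR": descriptor})

BandObservation = generated_message_type(
    "BandObservation", {"full_name": "apollo.drivers.gnss.BandObservation"})

obs = BandObservation(band_id=3, frequency_value=1575.42)
```

The real generated module does the same thing once per message, then records each class in a symbol database so other generated modules can resolve cross-file references.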
| 46.015296 | 5,296 | 0.730491 | 6,686 | 48,132 | 5.018546 | 0.058181 | 0.072242 | 0.060798 | 0.062586 | 0.819336 | 0.779132 | 0.706116 | 0.672886 | 0.63444 | 0.628211 | 0 | 0.05638 | 0.13918 | 48,132 | 1,045 | 5,297 | 46.05933 | 0.753457 | 0.013151 | 0 | 0.712575 | 1 | 0.000998 | 0.234649 | 0.202788 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006986 | 0 | 0.006986 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
241e63b95a62d7e8c6e062760c667bcb666ca0ad | 2,905 | py | Python | hidparser/ItemLocal.py | NZSmartie/PyHIDParser | a2758929c82a4316a665a779b9a391740103b318 | [
"MIT"
] | 22 | 2016-04-28T10:29:11.000Z | 2022-02-02T17:30:08.000Z | hidparser/ItemLocal.py | NZSmartie/PyHIDParser | a2758929c82a4316a665a779b9a391740103b318 | [
"MIT"
] | 12 | 2016-04-24T03:29:00.000Z | 2018-11-26T22:34:37.000Z | hidparser/ItemLocal.py | NZSmartie/PyHIDParser | a2758929c82a4316a665a779b9a391740103b318 | [
"MIT"
] | 5 | 2017-02-21T13:01:25.000Z | 2021-10-04T07:13:53.000Z | from hidparser.DeviceBuilder import DeviceBuilder
from hidparser.Item import Item, ItemType, ValueItem


class UsageItem(ValueItem):
    def __init__(self, *args, **kwargs):
        kwargs["signed"] = False
        super(UsageItem, self).__init__(*args, **kwargs)

    def visit(self, descriptor: DeviceBuilder):
        descriptor.add_usage(self.value)

    @classmethod
    def _get_tag(cls):
        return 0x08

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class UsageMinimumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_usage_range(minimum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x18

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class UsageMaximumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_usage_range(maximum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x28

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class DesignatorIndexItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_designator_range(minimum=self.value, maximum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x38

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL
class DesignatorMaximumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_designator_range(maximum=self.value)

    @classmethod
    def _get_tag(cls):
        # HID 1.11, section 6.2.2.8: Designator Maximum is tag 0x58
        return 0x58

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class DesignatorMinimumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_designator_range(minimum=self.value)

    @classmethod
    def _get_tag(cls):
        # HID 1.11, section 6.2.2.8: Designator Minimum is tag 0x48
        return 0x48

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL
class StringIndexItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_string_range(minimum=self.value, maximum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x78

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class StringMinimumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_string_range(minimum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x88

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class StringMaximumItem(ValueItem):
    def visit(self, descriptor: DeviceBuilder):
        descriptor.set_string_range(maximum=self.value)

    @classmethod
    def _get_tag(cls):
        return 0x98

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL


class DelimiterItem(Item):
    @classmethod
    def _get_tag(cls):
        return 0xA8

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL
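Every item class above identifies itself by a one-byte tag plus `ItemType.LOCAL`, which lets a report-descriptor parser dispatch raw tag bytes to the right class. A self-contained sketch of such a tag registry (an assumed pattern for illustration, not hidparser's actual parser):

```python
from enum import Enum

class ItemType(Enum):
    MAIN = 0
    GLOBAL = 1
    LOCAL = 2

class Item:
    """Base class; subclasses register themselves by their tag byte."""
    _registry = {}

    def __init_subclass__(cls, **kwargs):
        # Runs after each subclass body executes, so _get_tag is defined.
        super().__init_subclass__(**kwargs)
        Item._registry[cls._get_tag()] = cls

    @classmethod
    def _get_tag(cls):
        raise NotImplementedError

    @classmethod
    def _get_type(cls):
        raise NotImplementedError

class UsageItem(Item):
    @classmethod
    def _get_tag(cls):
        return 0x08  # HID local item tag for Usage

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL

class UsageMinimumItem(Item):
    @classmethod
    def _get_tag(cls):
        return 0x18  # HID local item tag for Usage Minimum

    @classmethod
    def _get_type(cls):
        return ItemType.LOCAL

def item_class_for(tag):
    """Look up the item class registered for a raw tag byte."""
    return Item._registry[tag]
```

With this pattern, a parser walking descriptor bytes only needs `item_class_for(tag)` to instantiate the matching item; adding a new item kind is just defining another subclass.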
| 21.518519 | 79 | 0.683993 | 318 | 2,905 | 6.044025 | 0.169811 | 0.145682 | 0.176899 | 0.104058 | 0.797607 | 0.797607 | 0.759105 | 0.759105 | 0.716962 | 0.492196 | 0 | 0.01301 | 0.232702 | 2,905 | 134 | 80 | 21.679104 | 0.84926 | 0 | 0 | 0.634409 | 0 | 0 | 0.002066 | 0 | 0 | 0 | 0.013774 | 0 | 0 | 1 | 0.322581 | false | 0 | 0.021505 | 0.215054 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
24396864aa6e17e77d727a798ee6b7b891ff2e0a | 37,317 | py | Python | nova/tests/unit/api/openstack/compute/test_access_ips.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/api/openstack/compute/test_access_ips.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/tests/unit/api/openstack/compute/test_access_ips.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2013 IBM Corp.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
op|'.'
name|'compute'
name|'import'
name|'access_ips'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
op|'.'
name|'compute'
name|'import'
name|'extension_info'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
op|'.'
name|'compute'
name|'import'
name|'servers'
name|'as'
name|'servers_v21'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
name|'import'
name|'wsgi'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'test'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
op|'.'
name|'api'
op|'.'
name|'openstack'
name|'import'
name|'fakes'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'tests'
op|'.'
name|'unit'
op|'.'
name|'image'
name|'import'
name|'fake'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|AccessIPsExtTestV21
name|'class'
name|'AccessIPsExtTestV21'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'AccessIPsExtTestV21'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'access_ips_ext'
op|'='
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'('
name|'None'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test
dedent|''
name|'def'
name|'_test'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'1.1.1.1'"
op|','
nl|'\n'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'fe80::'"
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v4'"
op|':'
string|"'1.1.1.1'"
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
string|"'fe80::'"
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv4_only
dedent|''
name|'def'
name|'_test_with_ipv4_only'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'1.1.1.1'"
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v4'"
op|':'
string|"'1.1.1.1'"
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv6_only
dedent|''
name|'def'
name|'_test_with_ipv6_only'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'fe80::'"
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v6'"
op|':'
string|"'fe80::'"
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_without_ipv4_and_ipv6
dedent|''
name|'def'
name|'_test_without_ipv4_and_ipv6'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv4_null
dedent|''
name|'def'
name|'_test_with_ipv4_null'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
name|'None'
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v4'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv6_null
dedent|''
name|'def'
name|'_test_with_ipv6_null'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
name|'None'
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v6'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv4_blank
dedent|''
name|'def'
name|'_test_with_ipv4_blank'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"''"
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v4'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_ipv6_blank
dedent|''
name|'def'
name|'_test_with_ipv6_blank'
op|'('
name|'self'
op|','
name|'func'
op|')'
op|':'
newline|'\n'
indent|' '
name|'server_dict'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"''"
op|'}'
newline|'\n'
name|'create_kwargs'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'func'
op|'('
name|'server_dict'
op|','
name|'create_kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'create_kwargs'
op|','
op|'{'
string|"'access_ip_v6'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create
dedent|''
name|'def'
name|'test_server_create'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv4_only
dedent|''
name|'def'
name|'test_server_create_with_ipv4_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv6_only
dedent|''
name|'def'
name|'test_server_create_with_ipv6_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_without_ipv4_and_ipv6
dedent|''
name|'def'
name|'test_server_create_without_ipv4_and_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_ipv4_and_ipv6'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv4_null
dedent|''
name|'def'
name|'test_server_create_with_ipv4_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv6_null
dedent|''
name|'def'
name|'test_server_create_with_ipv6_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv4_blank
dedent|''
name|'def'
name|'test_server_create_with_ipv4_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_create_with_ipv6_blank
dedent|''
name|'def'
name|'test_server_create_with_ipv6_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_create'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update
dedent|''
name|'def'
name|'test_server_update'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv4_only
dedent|''
name|'def'
name|'test_server_update_with_ipv4_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv6_only
dedent|''
name|'def'
name|'test_server_update_with_ipv6_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_without_ipv4_and_ipv6
dedent|''
name|'def'
name|'test_server_update_without_ipv4_and_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_ipv4_and_ipv6'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv4_null
dedent|''
name|'def'
name|'test_server_update_with_ipv4_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv6_null
dedent|''
name|'def'
name|'test_server_update_with_ipv6_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv4_blank
dedent|''
name|'def'
name|'test_server_update_with_ipv4_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_update_with_ipv6_blank
dedent|''
name|'def'
name|'test_server_update_with_ipv6_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_update'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild
dedent|''
name|'def'
name|'test_server_rebuild'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv4_only
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv4_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv6_only
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv6_only'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_only'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_without_ipv4_and_ipv6
dedent|''
name|'def'
name|'test_server_rebuild_without_ipv4_and_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_ipv4_and_ipv6'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv4_null
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv4_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv6_null
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv6_null'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_null'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv4_blank
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv4_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv4_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_server_rebuild_with_ipv6_blank
dedent|''
name|'def'
name|'test_server_rebuild_with_ipv6_blank'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_ipv6_blank'
op|'('
name|'self'
op|'.'
name|'access_ips_ext'
op|'.'
name|'server_rebuild'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|AccessIPsExtAPIValidationTestV21
dedent|''
dedent|''
name|'class'
name|'AccessIPsExtAPIValidationTestV21'
op|'('
name|'test'
op|'.'
name|'TestCase'
op|')'
op|':'
newline|'\n'
DECL|variable|validation_error
indent|' '
name|'validation_error'
op|'='
name|'exception'
op|'.'
name|'ValidationError'
newline|'\n'
nl|'\n'
DECL|member|setUp
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'AccessIPsExtAPIValidationTestV21'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|function|fake_save
name|'def'
name|'fake_save'
op|'('
name|'context'
op|','
op|'**'
name|'kwargs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|function|fake_rebuild
dedent|''
name|'def'
name|'fake_rebuild'
op|'('
op|'*'
name|'args'
op|','
op|'**'
name|'kwargs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'_set_up_controller'
op|'('
op|')'
newline|'\n'
name|'fake'
op|'.'
name|'stub_out_image_service'
op|'('
name|'self'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'stub_out'
op|'('
string|"'nova.db.instance_get_by_uuid'"
op|','
nl|'\n'
name|'fakes'
op|'.'
name|'fake_instance_get'
op|'('
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'stub_out'
op|'('
string|"'nova.objects.instance.Instance.save'"
op|','
name|'fake_save'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'stub_out'
op|'('
string|"'nova.compute.api.API.rebuild'"
op|','
name|'fake_rebuild'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'req'
op|'='
name|'fakes'
op|'.'
name|'HTTPRequest'
op|'.'
name|'blank'
op|'('
string|"''"
op|')'
newline|'\n'
nl|'\n'
DECL|member|_set_up_controller
dedent|''
name|'def'
name|'_set_up_controller'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ext_info'
op|'='
name|'extension_info'
op|'.'
name|'LoadedExtensionInfo'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'='
name|'servers_v21'
op|'.'
name|'ServersController'
op|'('
nl|'\n'
name|'extension_info'
op|'='
name|'ext_info'
op|')'
newline|'\n'
nl|'\n'
comment|'# Note(gmann): V2.1 has Access IP as separate extension. This class tests'
nl|'\n'
comment|'# calls controller directly so Access IPs will not be present in server'
nl|'\n'
comment|'# response. Those are being tested in AccessIPsExtTest class.'
nl|'\n'
DECL|member|_verify_update_access_ip
dedent|''
name|'def'
name|'_verify_update_access_ip'
op|'('
name|'self'
op|','
name|'res_dict'
op|','
name|'params'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|member|_test_create
dedent|''
name|'def'
name|'_test_create'
op|'('
name|'self'
op|','
name|'params'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
nl|'\n'
string|"'server'"
op|':'
op|'{'
nl|'\n'
string|"'name'"
op|':'
string|"'server_test'"
op|','
nl|'\n'
string|"'imageRef'"
op|':'
string|"'76fa36fc-c930-4bf3-8c8a-ea2a2420deb6'"
op|','
nl|'\n'
string|"'flavorRef'"
op|':'
string|"'http://localhost/123/flavors/3'"
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'body'
op|'['
string|"'server'"
op|']'
op|'.'
name|'update'
op|'('
name|'params'
op|')'
newline|'\n'
name|'res_dict'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'create'
op|'('
name|'self'
op|'.'
name|'req'
op|','
name|'body'
op|'='
name|'body'
op|')'
op|'.'
name|'obj'
newline|'\n'
name|'return'
name|'res_dict'
newline|'\n'
nl|'\n'
DECL|member|_test_update
dedent|''
name|'def'
name|'_test_update'
op|'('
name|'self'
op|','
name|'params'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
nl|'\n'
string|"'server'"
op|':'
op|'{'
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'body'
op|'['
string|"'server'"
op|']'
op|'.'
name|'update'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
name|'res_dict'
op|'='
name|'self'
op|'.'
name|'controller'
op|'.'
name|'update'
op|'('
name|'self'
op|'.'
name|'req'
op|','
name|'fakes'
op|'.'
name|'FAKE_UUID'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_verify_update_access_ip'
op|'('
name|'res_dict'
op|','
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_rebuild
dedent|''
name|'def'
name|'_test_rebuild'
op|'('
name|'self'
op|','
name|'params'
op|')'
op|':'
newline|'\n'
indent|' '
name|'body'
op|'='
op|'{'
nl|'\n'
string|"'rebuild'"
op|':'
op|'{'
nl|'\n'
string|"'imageRef'"
op|':'
string|"'76fa36fc-c930-4bf3-8c8a-ea2a2420deb6'"
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'body'
op|'['
string|"'rebuild'"
op|']'
op|'.'
name|'update'
op|'('
name|'params'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'_action_rebuild'
op|'('
name|'self'
op|'.'
name|'req'
op|','
name|'fakes'
op|'.'
name|'FAKE_UUID'
op|','
name|'body'
op|'='
name|'body'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_server_with_access_ipv4
dedent|''
name|'def'
name|'test_create_server_with_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'192.168.0.10'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_create'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_server_with_access_ip_pass_disabled
dedent|''
name|'def'
name|'test_create_server_with_access_ip_pass_disabled'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# test with admin passwords disabled See lp bug 921814'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'flags'
op|'('
name|'enable_instance_password'
op|'='
name|'False'
op|')'
newline|'\n'
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'192.168.0.10'"
op|','
nl|'\n'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'2001:db8::9abc'"
op|'}'
newline|'\n'
name|'res'
op|'='
name|'self'
op|'.'
name|'_test_create'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
name|'server'
op|'='
name|'res'
op|'['
string|"'server'"
op|']'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|'"admin_password"'
op|','
name|'server'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_server_with_invalid_access_ipv4
dedent|''
name|'def'
name|'test_create_server_with_invalid_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'1.1.1.1.1.1'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_create'
op|','
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_server_with_access_ipv6
dedent|''
name|'def'
name|'test_create_server_with_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'2001:db8::9abc'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_create'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_server_with_invalid_access_ipv6
dedent|''
name|'def'
name|'test_create_server_with_invalid_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'fe80:::::::'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_create'
op|','
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update_server_with_access_ipv4
dedent|''
name|'def'
name|'test_update_server_with_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'192.168.0.10'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_update'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update_server_with_invalid_access_ipv4
dedent|''
name|'def'
name|'test_update_server_with_invalid_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'1.1.1.1.1.1'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_update'
op|','
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update_server_with_access_ipv6
dedent|''
name|'def'
name|'test_update_server_with_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'2001:db8::9abc'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_update'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update_server_with_invalid_access_ipv6
dedent|''
name|'def'
name|'test_update_server_with_invalid_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'fe80:::::::'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_update'
op|','
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild_server_with_access_ipv4
dedent|''
name|'def'
name|'test_rebuild_server_with_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'192.168.0.10'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_rebuild'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild_server_with_invalid_access_ipv4
dedent|''
name|'def'
name|'test_rebuild_server_with_invalid_access_ipv4'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|':'
string|"'1.1.1.1.1.1'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_rebuild'
op|','
nl|'\n'
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild_server_with_access_ipv6
dedent|''
name|'def'
name|'test_rebuild_server_with_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'2001:db8::9abc'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'_test_rebuild'
op|'('
name|'params'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild_server_with_invalid_access_ipv6
dedent|''
name|'def'
name|'test_rebuild_server_with_invalid_access_ipv6'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'='
op|'{'
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|':'
string|"'fe80:::::::'"
op|'}'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'self'
op|'.'
name|'validation_error'
op|','
name|'self'
op|'.'
name|'_test_rebuild'
op|','
nl|'\n'
name|'params'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|AccessIPsControllerTestV21
dedent|''
dedent|''
name|'class'
name|'AccessIPsControllerTestV21'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'AccessIPsControllerTestV21'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'='
name|'access_ips'
op|'.'
name|'AccessIPsController'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_with_access_ips
dedent|''
name|'def'
name|'_test_with_access_ips'
op|'('
name|'self'
op|','
name|'func'
op|','
name|'kwargs'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|'}'
op|')'
op|':'
newline|'\n'
indent|' '
name|'req'
op|'='
name|'wsgi'
op|'.'
name|'Request'
op|'('
op|'{'
string|"'nova.context'"
op|':'
nl|'\n'
name|'fakes'
op|'.'
name|'FakeRequestContext'
op|'('
string|"'fake_user'"
op|','
string|"'fake'"
op|','
nl|'\n'
name|'is_admin'
op|'='
name|'True'
op|')'
op|'}'
op|')'
newline|'\n'
name|'instance'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
string|"'1.1.1.1'"
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
string|"'fe80::'"
op|'}'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance'
op|')'
newline|'\n'
name|'resp_obj'
op|'='
name|'wsgi'
op|'.'
name|'ResponseObject'
op|'('
nl|'\n'
op|'{'
string|'"server"'
op|':'
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|'}'
op|'}'
op|')'
newline|'\n'
name|'func'
op|'('
name|'req'
op|','
name|'resp_obj'
op|','
op|'**'
name|'kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'server'"
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
nl|'\n'
string|"'1.1.1.1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'server'"
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
nl|'\n'
string|"'fe80::'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|_test_without_access_ips
dedent|''
name|'def'
name|'_test_without_access_ips'
op|'('
name|'self'
op|','
name|'func'
op|','
name|'kwargs'
op|'='
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|'}'
op|')'
op|':'
newline|'\n'
indent|' '
name|'req'
op|'='
name|'wsgi'
op|'.'
name|'Request'
op|'('
op|'{'
string|"'nova.context'"
op|':'
nl|'\n'
name|'fakes'
op|'.'
name|'FakeRequestContext'
op|'('
string|"'fake_user'"
op|','
string|"'fake'"
op|','
nl|'\n'
name|'is_admin'
op|'='
name|'True'
op|')'
op|'}'
op|')'
newline|'\n'
name|'instance'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
name|'None'
op|'}'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance'
op|')'
newline|'\n'
name|'resp_obj'
op|'='
name|'wsgi'
op|'.'
name|'ResponseObject'
op|'('
nl|'\n'
op|'{'
string|'"server"'
op|':'
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|'}'
op|'}'
op|')'
newline|'\n'
name|'func'
op|'('
name|'req'
op|','
name|'resp_obj'
op|','
op|'**'
name|'kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'server'"
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
nl|'\n'
string|"''"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'server'"
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
nl|'\n'
string|"''"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_show
dedent|''
name|'def'
name|'test_show'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'show'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_show_without_access_ips
dedent|''
name|'def'
name|'test_show_without_access_ips'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'show'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_detail
dedent|''
name|'def'
name|'test_detail'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'req'
op|'='
name|'wsgi'
op|'.'
name|'Request'
op|'('
op|'{'
string|"'nova.context'"
op|':'
nl|'\n'
name|'fakes'
op|'.'
name|'FakeRequestContext'
op|'('
string|"'fake_user'"
op|','
string|"'fake'"
op|','
nl|'\n'
name|'is_admin'
op|'='
name|'True'
op|')'
op|'}'
op|')'
newline|'\n'
name|'instance1'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake1'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
string|"'1.1.1.1'"
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
string|"'fe80::'"
op|'}'
newline|'\n'
name|'instance2'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake2'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
string|"'1.1.1.2'"
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
string|"'fe81::'"
op|'}'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance1'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance2'
op|')'
newline|'\n'
name|'resp_obj'
op|'='
name|'wsgi'
op|'.'
name|'ResponseObject'
op|'('
nl|'\n'
op|'{'
string|'"servers"'
op|':'
op|'['
op|'{'
string|"'id'"
op|':'
string|"'fake1'"
op|'}'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake2'"
op|'}'
op|']'
op|'}'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'detail'
op|'('
name|'req'
op|','
name|'resp_obj'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'0'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
nl|'\n'
string|"'1.1.1.1'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'0'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
nl|'\n'
string|"'fe80::'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'1'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
nl|'\n'
string|"'1.1.1.2'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'1'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
nl|'\n'
string|"'fe81::'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_detail_without_access_ips
dedent|''
name|'def'
name|'test_detail_without_access_ips'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'req'
op|'='
name|'wsgi'
op|'.'
name|'Request'
op|'('
op|'{'
string|"'nova.context'"
op|':'
nl|'\n'
name|'fakes'
op|'.'
name|'FakeRequestContext'
op|'('
string|"'fake_user'"
op|','
string|"'fake'"
op|','
nl|'\n'
name|'is_admin'
op|'='
name|'True'
op|')'
op|'}'
op|')'
newline|'\n'
name|'instance1'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake1'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
name|'None'
op|'}'
newline|'\n'
name|'instance2'
op|'='
op|'{'
string|"'uuid'"
op|':'
string|"'fake2'"
op|','
nl|'\n'
string|"'access_ip_v4'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
name|'None'
op|'}'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance1'
op|')'
newline|'\n'
name|'req'
op|'.'
name|'cache_db_instance'
op|'('
name|'instance2'
op|')'
newline|'\n'
name|'resp_obj'
op|'='
name|'wsgi'
op|'.'
name|'ResponseObject'
op|'('
nl|'\n'
op|'{'
string|'"servers"'
op|':'
op|'['
op|'{'
string|"'id'"
op|':'
string|"'fake1'"
op|'}'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake2'"
op|'}'
op|']'
op|'}'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'controller'
op|'.'
name|'detail'
op|'('
name|'req'
op|','
name|'resp_obj'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'0'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
string|"''"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'0'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
string|"''"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'1'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v4_key'
op|']'
op|','
string|"''"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
nl|'\n'
name|'resp_obj'
op|'.'
name|'obj'
op|'['
string|"'servers'"
op|']'
op|'['
number|'1'
op|']'
op|'['
name|'access_ips'
op|'.'
name|'AccessIPs'
op|'.'
name|'v6_key'
op|']'
op|','
string|"''"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update
dedent|''
name|'def'
name|'test_update'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'update'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'body'"
op|':'
op|'{'
op|'}'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_update_without_access_ips
dedent|''
name|'def'
name|'test_update_without_access_ips'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'update'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'body'"
op|':'
op|'{'
op|'}'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild
dedent|''
name|'def'
name|'test_rebuild'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_with_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'rebuild'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'body'"
op|':'
op|'{'
op|'}'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_rebuild_without_access_ips
dedent|''
name|'def'
name|'test_rebuild_without_access_ips'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_without_access_ips'
op|'('
name|'self'
op|'.'
name|'controller'
op|'.'
name|'rebuild'
op|','
op|'{'
string|"'id'"
op|':'
string|"'fake'"
op|','
nl|'\n'
string|"'body'"
op|':'
op|'{'
op|'}'
op|'}'
op|')'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 12.766678 | 88 | 0.613769 | 5,641 | 37,317 | 3.887431 | 0.039709 | 0.1576 | 0.098956 | 0.088103 | 0.915363 | 0.902412 | 0.890647 | 0.876374 | 0.857267 | 0.804186 | 0 | 0.012166 | 0.0969 | 37,317 | 2,922 | 89 | 12.771047 | 0.638527 | 0 | 0 | 0.945927 | 0 | 0 | 0.352386 | 0.059062 | 0 | 0 | 0 | 0 | 0.00924 | 0 | null | null | 0.002738 | 0.002738 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2462487649823815011fe38fe5e61625418f2760 | 96202 | py | Python | k8sclient/apis/autoscaling_v1_api.py | Arvinhub/client-python | d67df30f635231d68dc4c20b9b7e234c616c1e6a | ["Apache-2.0"] | 1 | 2021-06-16T02:57:18.000Z | 2021-06-16T02:57:18.000Z | k8sclient/apis/autoscaling_v1_api.py | Arvinhub/client-python | d67df30f635231d68dc4c20b9b7e234c616c1e6a | ["Apache-2.0"] | null | null | null | k8sclient/apis/autoscaling_v1_api.py | Arvinhub/client-python | d67df30f635231d68dc4c20b9b7e234c616c1e6a | ["Apache-2.0"] | null | null | null
] | null | null | null | # coding: utf-8
"""
Kubernetes
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen)
OpenAPI spec version: unversioned
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class AutoscalingV1Api(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
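The constructor above falls back to a lazily created default client cached on `Configuration` when no `api_client` is passed. A minimal runnable sketch of that fallback, using stand-in classes (treating the real `Configuration` as a process-wide singleton is an assumption about this generation of swagger-codegen clients):

```python
# Stand-ins for the real k8sclient classes; only the constructor logic
# shown in the source above is reproduced.
class ApiClient:                      # stand-in for k8sclient.api_client.ApiClient
    pass

class Configuration:                  # stand-in: assumed to behave as a singleton
    _instance = None
    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.api_client = None
        return cls._instance

class AutoscalingV1Api:
    def __init__(self, api_client=None):
        config = Configuration()
        if api_client:
            self.api_client = api_client
        else:
            if not config.api_client:     # create the default client once
                config.api_client = ApiClient()
            self.api_client = config.api_client

a = AutoscalingV1Api()
b = AutoscalingV1Api()
assert a.api_client is b.api_client   # the default client is shared
```

The consequence of this design is that every API object built without an explicit `api_client` shares one client (and therefore one connection pool and auth configuration).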
def create_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, namespace, body, **kwargs):
"""
create a HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_autoscaling_v1_namespaced_horizontal_pod_autoscaler(namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1HorizontalPodAutoscaler body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1HorizontalPodAutoscaler
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.create_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, body, **kwargs)
else:
(data) = self.create_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, body, **kwargs)
return data
def create_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, namespace, body, **kwargs):
"""
create a HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1HorizontalPodAutoscaler body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1HorizontalPodAutoscaler
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['namespace', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `create_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `create_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers'.replace('{format}', 'json')
path_params = {}
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1HorizontalPodAutoscaler',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
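Every public method in this class pairs a thin wrapper with a `*_with_http_info` variant: a synchronous call returns the deserialized data, while passing `callback=` returns the request thread and delivers the response to the callback. A minimal sketch of that dispatch pattern, with illustrative stand-in names (no real HTTP is performed):

```python
import threading

def call_with_http_info(callback=None):
    # Pretend response; the real method performs the HTTP request and
    # deserializes the body into a model object.
    result = {"kind": "HorizontalPodAutoscaler"}
    if callback:
        t = threading.Thread(target=callback, args=(result,))
        t.start()
        return t          # async: caller gets the request thread
    return result         # sync: caller gets the data

def create_hpa(**kwargs):  # illustrative wrapper, mirroring the pattern above
    if kwargs.get("callback"):
        return call_with_http_info(**kwargs)
    data = call_with_http_info(**kwargs)
    return data

out = []
thread = create_hpa(callback=out.append)
thread.join()
assert out == [{"kind": "HorizontalPodAutoscaler"}]
assert create_hpa() == {"kind": "HorizontalPodAutoscaler"}
```

This is why the docstrings say the method "returns the request thread" when called asynchronously: the data only ever reaches the caller through the callback in that mode.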
def delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler(self, namespace, **kwargs):
"""
delete collection of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler(namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, **kwargs)
else:
(data) = self.delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, **kwargs)
return data
def delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler_with_http_info(self, namespace, **kwargs):
"""
delete collection of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['namespace', 'pretty', 'field_selector', 'label_selector', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `delete_autoscaling_v1_collection_namespaced_horizontal_pod_autoscaler`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers'.replace('{format}', 'json')
path_params = {}
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedStatus',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, name, namespace, body, **kwargs):
"""
delete a HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler(name, namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1DeleteOptions body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
else:
(data) = self.delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
return data
def delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, name, namespace, body, **kwargs):
"""
delete a HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1DeleteOptions body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'namespace', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `delete_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedStatus',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def get_autoscaling_v1_api_resources(self, **kwargs):
"""
get available resources
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_autoscaling_v1_api_resources(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: UnversionedAPIResourceList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_autoscaling_v1_api_resources_with_http_info(**kwargs)
else:
(data) = self.get_autoscaling_v1_api_resources_with_http_info(**kwargs)
return data
def get_autoscaling_v1_api_resources_with_http_info(self, **kwargs):
"""
get available resources
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_autoscaling_v1_api_resources_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: UnversionedAPIResourceList
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_autoscaling_v1_api_resources" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/autoscaling/v1/'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedAPIResourceList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)

    def list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces(self, **kwargs):
        """
        list or watch objects of kind HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
        :param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
        :param str pretty: If 'true', then the output is pretty printed.
        :param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
        :param int timeout_seconds: Timeout for the list/watch call.
        :param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
        :return: V1HorizontalPodAutoscalerList
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces_with_http_info(**kwargs)
        else:
            (data) = self.list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces_with_http_info(**kwargs)
            return data
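
    # Example (illustrative, not part of the generated client): listing
    # HorizontalPodAutoscalers across all namespaces. This sketch assumes a
    # configured AutoscalingV1Api instance named `api` (client configuration,
    # e.g. via kubeconfig loading, happens elsewhere and is an assumption here):
    #
    #     hpa_list = api.list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces(
    #         label_selector='app=web', timeout_seconds=30)
    #     for hpa in hpa_list.items:
    #         print(hpa.metadata.namespace, hpa.metadata.name)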

    def list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces_with_http_info(self, **kwargs):
        """
        list or watch objects of kind HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces_with_http_info(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
        :param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
        :param str pretty: If 'true', then the output is pretty printed.
        :param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
        :param int timeout_seconds: Timeout for the list/watch call.
        :param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
        :return: V1HorizontalPodAutoscalerList
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list_autoscaling_v1_horizontal_pod_autoscaler_for_all_namespaces" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/horizontalpodautoscalers'.replace('{format}', 'json')
        path_params = {}

        query_params = {}
        if 'field_selector' in params:
            query_params['fieldSelector'] = params['field_selector']
        if 'label_selector' in params:
            query_params['labelSelector'] = params['label_selector']
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']
        if 'resource_version' in params:
            query_params['resourceVersion'] = params['resource_version']
        if 'timeout_seconds' in params:
            query_params['timeoutSeconds'] = params['timeout_seconds']
        if 'watch' in params:
            query_params['watch'] = params['watch']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['*/*'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscalerList',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def list_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, namespace, **kwargs):
        """
        list or watch objects of kind HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.list_autoscaling_v1_namespaced_horizontal_pod_autoscaler(namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
        :param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
        :param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
        :param int timeout_seconds: Timeout for the list/watch call.
        :param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
        :return: V1HorizontalPodAutoscalerList
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.list_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, **kwargs)
        else:
            (data) = self.list_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, **kwargs)
            return data

    def list_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, namespace, **kwargs):
        """
        list or watch objects of kind HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.list_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
        :param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
        :param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
        :param int timeout_seconds: Timeout for the list/watch call.
        :param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
        :return: V1HorizontalPodAutoscalerList
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['namespace', 'pretty', 'field_selector', 'label_selector', 'resource_version', 'timeout_seconds', 'watch']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'namespace' is set
        if ('namespace' not in params) or (params['namespace'] is None):
            raise ValueError("Missing the required parameter `namespace` when calling `list_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers'.replace('{format}', 'json')
        path_params = {}
        if 'namespace' in params:
            path_params['namespace'] = params['namespace']

        query_params = {}
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']
        if 'field_selector' in params:
            query_params['fieldSelector'] = params['field_selector']
        if 'label_selector' in params:
            query_params['labelSelector'] = params['label_selector']
        if 'resource_version' in params:
            query_params['resourceVersion'] = params['resource_version']
        if 'timeout_seconds' in params:
            query_params['timeoutSeconds'] = params['timeout_seconds']
        if 'watch' in params:
            query_params['watch'] = params['watch']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['*/*'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscalerList',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, name, namespace, body, **kwargs):
        """
        partially update the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler(name, namespace, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param UnversionedPatch body: (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
        else:
            (data) = self.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
            return data
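
    # Example (illustrative, not part of the generated client): the accepted
    # patch content types below include `application/merge-patch+json`, for
    # which the patch `body` can be a plain dict. The names `api`, the
    # namespace 'default', and the HPA 'web' are assumptions for the sketch:
    #
    #     patch = {'spec': {'maxReplicas': 10}}
    #     updated = api.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler(
    #         'web', 'default', patch)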

    def patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, name, namespace, body, **kwargs):
        """
        partially update the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param UnversionedPatch body: (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['name', 'namespace', 'body', 'pretty']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'name' is set
        if ('name' not in params) or (params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
        # verify the required parameter 'namespace' is set
        if ('namespace' not in params) or (params['namespace'] is None):
            raise ValueError("Missing the required parameter `namespace` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
        # verify the required parameter 'body' is set
        if ('body' not in params) or (params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}'.replace('{format}', 'json')
        path_params = {}
        if 'name' in params:
            path_params['name'] = params['name']
        if 'namespace' in params:
            path_params['namespace'] = params['namespace']

        query_params = {}
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json-patch+json', 'application/merge-patch+json', 'application/strategic-merge-patch+json'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'PATCH',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscaler',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(self, name, namespace, body, **kwargs):
        """
        partially update status of the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(name, namespace, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param UnversionedPatch body: (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, **kwargs)
        else:
            (data) = self.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, **kwargs)
            return data

    def patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(self, name, namespace, body, **kwargs):
        """
        partially update status of the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param UnversionedPatch body: (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['name', 'namespace', 'body', 'pretty']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'name' is set
        if ('name' not in params) or (params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
        # verify the required parameter 'namespace' is set
        if ('namespace' not in params) or (params['namespace'] is None):
            raise ValueError("Missing the required parameter `namespace` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
        # verify the required parameter 'body' is set
        if ('body' not in params) or (params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `patch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}/status'.replace('{format}', 'json')
        path_params = {}
        if 'name' in params:
            path_params['name'] = params['name']
        if 'namespace' in params:
            path_params['namespace'] = params['namespace']

        query_params = {}
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json-patch+json', 'application/merge-patch+json', 'application/strategic-merge-patch+json'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'PATCH',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscaler',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def read_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, name, namespace, **kwargs):
        """
        read the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler(name, namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :param bool exact: Should the export be exact. Exact export maintains cluster-specific fields like 'Namespace'.
        :param bool export: Should this value be exported. Export strips fields that a user can not specify.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, **kwargs)
        else:
            (data) = self.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, **kwargs)
            return data

    def read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, name, namespace, **kwargs):
        """
        read the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :param bool exact: Should the export be exact. Exact export maintains cluster-specific fields like 'Namespace'.
        :param bool export: Should this value be exported. Export strips fields that a user can not specify.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['name', 'namespace', 'pretty', 'exact', 'export']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method read_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'name' is set
        if ('name' not in params) or (params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `read_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
        # verify the required parameter 'namespace' is set
        if ('namespace' not in params) or (params['namespace'] is None):
            raise ValueError("Missing the required parameter `namespace` when calling `read_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}'.replace('{format}', 'json')
        path_params = {}
        if 'name' in params:
            path_params['name'] = params['name']
        if 'namespace' in params:
            path_params['namespace'] = params['namespace']

        query_params = {}
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']
        if 'exact' in params:
            query_params['exact'] = params['exact']
        if 'export' in params:
            query_params['export'] = params['export']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['*/*'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscaler',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(self, name, namespace, **kwargs):
        """
        read status of the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(name, namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, **kwargs)
        else:
            (data) = self.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, **kwargs)
            return data

    def read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(self, name, namespace, **kwargs):
        """
        read status of the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['name', 'namespace', 'pretty']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'name' is set
        if ('name' not in params) or (params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
        # verify the required parameter 'namespace' is set
        if ('namespace' not in params) or (params['namespace'] is None):
            raise ValueError("Missing the required parameter `namespace` when calling `read_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")

        collection_formats = {}

        resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}/status'.replace('{format}', 'json')
        path_params = {}
        if 'name' in params:
            path_params['name'] = params['name']
        if 'namespace' in params:
            path_params['namespace'] = params['namespace']

        query_params = {}
        if 'pretty' in params:
            query_params['pretty'] = params['pretty']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['*/*'])

        # Authentication setting
        auth_settings = ['BearerToken']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='V1HorizontalPodAutoscaler',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, name, namespace, body, **kwargs):
        """
        replace the specified HorizontalPodAutoscaler
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler(name, namespace, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str name: name of the HorizontalPodAutoscaler (required)
        :param str namespace: object name and auth scope, such as for teams and projects (required)
        :param V1HorizontalPodAutoscaler body: (required)
        :param str pretty: If 'true', then the output is pretty printed.
        :return: V1HorizontalPodAutoscaler
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
        else:
            (data) = self.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, **kwargs)
            return data
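
    # Example (illustrative, not part of the generated client): replace is a
    # full PUT of the object, so the usual pattern is read-modify-write, which
    # keeps `metadata.resource_version` intact and lets the server reject stale
    # writes with a conflict. `api`, 'web', and 'default' are assumptions:
    #
    #     hpa = api.read_autoscaling_v1_namespaced_horizontal_pod_autoscaler('web', 'default')
    #     hpa.spec.min_replicas = 2
    #     api.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler('web', 'default', hpa)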
def replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, name, namespace, body, **kwargs):
"""
replace the specified HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1HorizontalPodAutoscaler body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1HorizontalPodAutoscaler
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'namespace', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1HorizontalPodAutoscaler',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
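# The guard these generated methods all share -- reject unknown keyword
# arguments, then require every mandatory parameter to be present and
# non-None -- can be sketched as a standalone helper. The name
# `validate_params` is illustrative only, not part of the generated client:

```python
def validate_params(required, allowed, **kwargs):
    """Mirror the generated kwargs guard: unknown keys raise TypeError,
    missing or None required parameters raise ValueError."""
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s`" % name)
    return kwargs
```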
def replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(self, name, namespace, body, **kwargs):
"""
replace status of the specified HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status(name, namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1HorizontalPodAutoscaler body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1HorizontalPodAutoscaler
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, **kwargs)
else:
data = self.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, **kwargs)
return data
def replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(self, name, namespace, body, **kwargs):
"""
replace status of the specified HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status_with_http_info(name, namespace, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param V1HorizontalPodAutoscaler body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1HorizontalPodAutoscaler
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'namespace', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `replace_autoscaling_v1_namespaced_horizontal_pod_autoscaler_status`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/namespaces/{namespace}/horizontalpodautoscalers/{name}/status'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1HorizontalPodAutoscaler',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces(self, **kwargs):
"""
watch individual changes to a list of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces_with_http_info(**kwargs)
else:
data = self.watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces_with_http_info(**kwargs)
return data
def watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces_with_http_info(self, **kwargs):
"""
watch individual changes to a list of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method watch_autoscaling_v1_horizontal_pod_autoscaler_list_for_all_namespaces" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/autoscaling/v1/watch/horizontalpodautoscalers'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VersionedEvent',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
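# The snake_case-to-camelCase translation applied to the watch query
# parameters above can be sketched as a small table-driven helper.
# `WATCH_QUERY_KEYS` and `build_query_params` are hypothetical names used
# here for illustration, not part of the generated client:

```python
# Map python_style option names to the camelCase query keys the API expects.
WATCH_QUERY_KEYS = {
    'field_selector': 'fieldSelector',
    'label_selector': 'labelSelector',
    'pretty': 'pretty',
    'resource_version': 'resourceVersion',
    'timeout_seconds': 'timeoutSeconds',
    'watch': 'watch',
}

def build_query_params(params):
    """Keep only recognized options, renamed to their API query keys."""
    return {api_key: params[py_key]
            for py_key, api_key in WATCH_QUERY_KEYS.items()
            if py_key in params}
```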
def watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler(self, name, namespace, **kwargs):
"""
watch changes to an object of kind HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler(name, namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, **kwargs)
else:
data = self.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, **kwargs)
return data
def watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(self, name, namespace, **kwargs):
"""
watch changes to an object of kind HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_with_http_info(name, namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the HorizontalPodAutoscaler (required)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'namespace', 'field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/watch/namespaces/{namespace}/horizontalpodautoscalers/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VersionedEvent',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
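# The resource-path handling above -- strip the '{format}' placeholder, then
# substitute each path parameter into the template -- can be sketched as
# follows. `render_path` is an illustrative helper; the real substitution
# happens inside `api_client.call_api`:

```python
def render_path(template, path_params):
    """Substitute {name}-style placeholders in a resource path template."""
    path = template.replace('{format}', 'json')
    for key, val in path_params.items():
        path = path.replace('{%s}' % key, str(val))
    return path
```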
def watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list(self, namespace, **kwargs):
"""
watch individual changes to a list of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list(namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list_with_http_info(namespace, **kwargs)
else:
data = self.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list_with_http_info(namespace, **kwargs)
return data
def watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list_with_http_info(self, namespace, **kwargs):
"""
watch individual changes to a list of HorizontalPodAutoscaler
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list_with_http_info(namespace, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str namespace: object name and auth scope, such as for teams and projects (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['namespace', 'field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'namespace' is set
if ('namespace' not in params) or (params['namespace'] is None):
raise ValueError("Missing the required parameter `namespace` when calling `watch_autoscaling_v1_namespaced_horizontal_pod_autoscaler_list`")
collection_formats = {}
resource_path = '/apis/autoscaling/v1/watch/namespaces/{namespace}/horizontalpodautoscalers'.replace('{format}', 'json')
path_params = {}
if 'namespace' in params:
path_params['namespace'] = params['namespace']
query_params = {}
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VersionedEvent',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
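# Every method above negotiates the `Accept` header via
# `api_client.select_header_accept`. A minimal sketch, assuming the common
# swagger-codegen behavior of preferring `application/json` when it is among
# the acceptable types (illustrative only):

```python
def select_header_accept(accepts):
    """Pick application/json if offered; otherwise join all types."""
    if not accepts:
        return None
    lowered = [a.lower() for a in accepts]
    if 'application/json' in lowered:
        return 'application/json'
    return ', '.join(lowered)
```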
# Source: sdk/python/pulumi_vault/database/_inputs.py (pulumi/pulumi-vault, ECL-2.0/Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = [
'SecretBackendConnectionCassandraArgs',
'SecretBackendConnectionElasticsearchArgs',
'SecretBackendConnectionHanaArgs',
'SecretBackendConnectionMongodbArgs',
'SecretBackendConnectionMongodbatlasArgs',
'SecretBackendConnectionMssqlArgs',
'SecretBackendConnectionMysqlArgs',
'SecretBackendConnectionMysqlAuroraArgs',
'SecretBackendConnectionMysqlLegacyArgs',
'SecretBackendConnectionMysqlRdsArgs',
'SecretBackendConnectionOracleArgs',
'SecretBackendConnectionPostgresqlArgs',
'SecretBackendConnectionSnowflakeArgs',
]
@pulumi.input_type
class SecretBackendConnectionCassandraArgs:
def __init__(__self__, *,
connect_timeout: Optional[pulumi.Input[int]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
insecure_tls: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
pem_bundle: Optional[pulumi.Input[str]] = None,
pem_json: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
protocol_version: Optional[pulumi.Input[int]] = None,
tls: Optional[pulumi.Input[bool]] = None,
username: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[int] connect_timeout: The number of seconds to use as a connection
timeout.
:param pulumi.Input[Sequence[pulumi.Input[str]]] hosts: The hosts to connect to.
:param pulumi.Input[bool] insecure_tls: Whether to skip verification of the server
certificate when using TLS.
:param pulumi.Input[str] password: The password to be used in the connection.
:param pulumi.Input[str] pem_bundle: Concatenated PEM blocks configuring the certificate
chain.
:param pulumi.Input[str] pem_json: A JSON structure configuring the certificate chain.
:param pulumi.Input[int] port: The default port to connect to if no port is specified as
part of the host.
:param pulumi.Input[int] protocol_version: The CQL protocol version to use.
:param pulumi.Input[bool] tls: Whether to use TLS when connecting to Cassandra.
:param pulumi.Input[str] username: The username to be used in the connection (the account admin level).
"""
if connect_timeout is not None:
pulumi.set(__self__, "connect_timeout", connect_timeout)
if hosts is not None:
pulumi.set(__self__, "hosts", hosts)
if insecure_tls is not None:
pulumi.set(__self__, "insecure_tls", insecure_tls)
if password is not None:
pulumi.set(__self__, "password", password)
if pem_bundle is not None:
pulumi.set(__self__, "pem_bundle", pem_bundle)
if pem_json is not None:
pulumi.set(__self__, "pem_json", pem_json)
if port is not None:
pulumi.set(__self__, "port", port)
if protocol_version is not None:
pulumi.set(__self__, "protocol_version", protocol_version)
if tls is not None:
pulumi.set(__self__, "tls", tls)
if username is not None:
pulumi.set(__self__, "username", username)
@property
@pulumi.getter(name="connectTimeout")
def connect_timeout(self) -> Optional[pulumi.Input[int]]:
"""
The number of seconds to use as a connection
timeout.
"""
return pulumi.get(self, "connect_timeout")
@connect_timeout.setter
def connect_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "connect_timeout", value)
@property
@pulumi.getter
def hosts(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The hosts to connect to.
"""
return pulumi.get(self, "hosts")
@hosts.setter
def hosts(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "hosts", value)
@property
@pulumi.getter(name="insecureTls")
def insecure_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to skip verification of the server
certificate when using TLS.
"""
return pulumi.get(self, "insecure_tls")
@insecure_tls.setter
def insecure_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "insecure_tls", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password to be used in the connection.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="pemBundle")
def pem_bundle(self) -> Optional[pulumi.Input[str]]:
"""
Concatenated PEM blocks configuring the certificate
chain.
"""
return pulumi.get(self, "pem_bundle")
@pem_bundle.setter
def pem_bundle(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pem_bundle", value)
@property
@pulumi.getter(name="pemJson")
def pem_json(self) -> Optional[pulumi.Input[str]]:
"""
A JSON structure configuring the certificate chain.
"""
return pulumi.get(self, "pem_json")
@pem_json.setter
def pem_json(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pem_json", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The default port to connect to if no port is specified as
part of the host.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="protocolVersion")
def protocol_version(self) -> Optional[pulumi.Input[int]]:
"""
The CQL protocol version to use.
"""
return pulumi.get(self, "protocol_version")
@protocol_version.setter
def protocol_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "protocol_version", value)
@property
@pulumi.getter
def tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS when connecting to Cassandra.
"""
return pulumi.get(self, "tls")
@tls.setter
def tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "tls", value)
@property
@pulumi.getter
def username(self) -> Optional[pulumi.Input[str]]:
"""
The username to be used in the connection (the account admin level).
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username", value)
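# The `__init__` of the Args classes above records only the fields that were
# actually supplied, leaving unset ones absent. The same only-set-when-not-None
# pattern can be sketched with a plain dict, independent of pulumi
# (`set_provided_fields` is an illustrative name):

```python
def set_provided_fields(target, **fields):
    """Store only the keyword arguments whose value is not None."""
    for key, val in fields.items():
        if val is not None:
            target[key] = val
    return target
```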
@pulumi.input_type
class SecretBackendConnectionElasticsearchArgs:
def __init__(__self__, *,
password: pulumi.Input[str],
url: pulumi.Input[str],
username: pulumi.Input[str]):
"""
:param pulumi.Input[str] password: The password to be used in the connection.
:param pulumi.Input[str] url: The URL for Elasticsearch's API. If HTTPS is used,
the certificate must be signed by a trusted CA.
:param pulumi.Input[str] username: The username to be used in the connection (the account admin level).
"""
pulumi.set(__self__, "password", password)
pulumi.set(__self__, "url", url)
pulumi.set(__self__, "username", username)
@property
@pulumi.getter
def password(self) -> pulumi.Input[str]:
"""
The password to be used in the connection.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: pulumi.Input[str]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The URL for Elasticsearch's API. If HTTPS is used,
the certificate must be signed by a trusted CA.
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def username(self) -> pulumi.Input[str]:
"""
The username to be used in the connection (the account admin level).
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: pulumi.Input[str]):
pulumi.set(self, "username", value)


@pulumi.input_type
class SecretBackendConnectionHanaArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
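# Illustrative sketch (placeholder values): every Hana argument is optional;
# the pool limits below bound the connections Vault holds open to the database.
#
#     hana_args = SecretBackendConnectionHanaArgs(
#         connection_url="hdb://{{username}}:{{password}}@host:443",
#         max_open_connections=5,       # cap concurrent connections
#         max_idle_connections=2,       # keep a small warm pool
#         max_connection_lifetime=300,  # recycle connections after 5 minutes
#     )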


@pulumi.input_type
class SecretBackendConnectionMongodbArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionMongodbatlasArgs:
def __init__(__self__, *,
private_key: pulumi.Input[str],
project_id: pulumi.Input[str],
public_key: pulumi.Input[str]):
"""
:param pulumi.Input[str] private_key: The Private Programmatic API Key used to connect with MongoDB Atlas API.
:param pulumi.Input[str] project_id: The Project ID the Database User should be created within.
:param pulumi.Input[str] public_key: The Public Programmatic API Key used to authenticate with the MongoDB Atlas API.
"""
pulumi.set(__self__, "private_key", private_key)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "public_key", public_key)
@property
@pulumi.getter(name="privateKey")
def private_key(self) -> pulumi.Input[str]:
"""
The Private Programmatic API Key used to connect with MongoDB Atlas API.
"""
return pulumi.get(self, "private_key")
@private_key.setter
def private_key(self, value: pulumi.Input[str]):
pulumi.set(self, "private_key", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The Project ID the Database User should be created within.
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> pulumi.Input[str]:
"""
The Public Programmatic API Key used to authenticate with the MongoDB Atlas API.
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: pulumi.Input[str]):
pulumi.set(self, "public_key", value)
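# Illustrative sketch (placeholder values): MongoDB Atlas authenticates with a
# programmatic API key pair plus a project ID instead of a connection URL.
#
#     atlas_args = SecretBackendConnectionMongodbatlasArgs(
#         public_key="atlas-public-key",
#         private_key="atlas-private-key",
#         project_id="atlas-project-id",
#     )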


@pulumi.input_type
class SecretBackendConnectionMssqlArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionMysqlArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
tls_ca: Optional[pulumi.Input[str]] = None,
tls_certificate_key: Optional[pulumi.Input[str]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] tls_ca: x509 CA file for validating the certificate presented by the MySQL server. Must be PEM encoded.
:param pulumi.Input[str] tls_certificate_key: x509 certificate for connecting to the database. This must be a PEM encoded version of the private key and the certificate combined.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if tls_ca is not None:
pulumi.set(__self__, "tls_ca", tls_ca)
if tls_certificate_key is not None:
pulumi.set(__self__, "tls_certificate_key", tls_certificate_key)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="tlsCa")
def tls_ca(self) -> Optional[pulumi.Input[str]]:
"""
x509 CA file for validating the certificate presented by the MySQL server. Must be PEM encoded.
"""
return pulumi.get(self, "tls_ca")
@tls_ca.setter
def tls_ca(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca", value)
@property
@pulumi.getter(name="tlsCertificateKey")
def tls_certificate_key(self) -> Optional[pulumi.Input[str]]:
"""
x509 certificate for connecting to the database. This must be a PEM encoded version of the private key and the certificate combined.
"""
return pulumi.get(self, "tls_certificate_key")
@tls_certificate_key.setter
def tls_certificate_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_certificate_key", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)
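# Illustrative sketch (placeholder values; `ca_pem` and `cert_and_key_pem` are
# assumed variables): the MySQL variant adds TLS material, and
# tls_certificate_key must combine the PEM private key and certificate in one
# string.
#
#     mysql_args = SecretBackendConnectionMysqlArgs(
#         connection_url="{{username}}:{{password}}@tcp(db.example.com:3306)/",
#         tls_ca=ca_pem,                          # PEM-encoded CA bundle
#         tls_certificate_key=cert_and_key_pem,   # PEM private key + cert
#         username_template="v-{{.RoleName}}-{{random 8}}",
#     )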


@pulumi.input_type
class SecretBackendConnectionMysqlAuroraArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionMysqlLegacyArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionMysqlRdsArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionOracleArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
[Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)


@pulumi.input_type
class SecretBackendConnectionPostgresqlArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] username_template: [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
- [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)
@pulumi.input_type
class SecretBackendConnectionSnowflakeArgs:
def __init__(__self__, *,
connection_url: Optional[pulumi.Input[str]] = None,
max_connection_lifetime: Optional[pulumi.Input[int]] = None,
max_idle_connections: Optional[pulumi.Input[int]] = None,
max_open_connections: Optional[pulumi.Input[int]] = None,
password: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None,
username_template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] connection_url: A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
:param pulumi.Input[int] max_connection_lifetime: The maximum number of seconds to keep
a connection alive for.
:param pulumi.Input[int] max_idle_connections: The maximum number of idle connections to
maintain.
:param pulumi.Input[int] max_open_connections: The maximum number of open connections to
use.
:param pulumi.Input[str] password: The password to be used in the connection.
:param pulumi.Input[str] username: The username to be used in the connection (the account admin level).
:param pulumi.Input[str] username_template: - [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
if connection_url is not None:
pulumi.set(__self__, "connection_url", connection_url)
if max_connection_lifetime is not None:
pulumi.set(__self__, "max_connection_lifetime", max_connection_lifetime)
if max_idle_connections is not None:
pulumi.set(__self__, "max_idle_connections", max_idle_connections)
if max_open_connections is not None:
pulumi.set(__self__, "max_open_connections", max_open_connections)
if password is not None:
pulumi.set(__self__, "password", password)
if username is not None:
pulumi.set(__self__, "username", username)
if username_template is not None:
pulumi.set(__self__, "username_template", username_template)
@property
@pulumi.getter(name="connectionUrl")
def connection_url(self) -> Optional[pulumi.Input[str]]:
"""
A URL containing connection information. See
the [Vault
docs](https://www.vaultproject.io/api-docs/secret/databases/snowflake#sample-payload)
for an example.
"""
return pulumi.get(self, "connection_url")
@connection_url.setter
def connection_url(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_url", value)
@property
@pulumi.getter(name="maxConnectionLifetime")
def max_connection_lifetime(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of seconds to keep
a connection alive for.
"""
return pulumi.get(self, "max_connection_lifetime")
@max_connection_lifetime.setter
def max_connection_lifetime(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_connection_lifetime", value)
@property
@pulumi.getter(name="maxIdleConnections")
def max_idle_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of idle connections to
maintain.
"""
return pulumi.get(self, "max_idle_connections")
@max_idle_connections.setter
def max_idle_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_idle_connections", value)
@property
@pulumi.getter(name="maxOpenConnections")
def max_open_connections(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of open connections to
use.
"""
return pulumi.get(self, "max_open_connections")
@max_open_connections.setter
def max_open_connections(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_open_connections", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password to be used in the connection.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def username(self) -> Optional[pulumi.Input[str]]:
"""
The username to be used in the connection (the account admin level).
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username", value)
@property
@pulumi.getter(name="usernameTemplate")
def username_template(self) -> Optional[pulumi.Input[str]]:
"""
- [Template](https://www.vaultproject.io/docs/concepts/username-templating) describing how dynamic usernames are generated.
"""
return pulumi.get(self, "username_template")
@username_template.setter
def username_template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username_template", value)
03151fe7831d9b78e1353782f91a3464e0cbaad7 | 6,549 | py | Python | jupyter/ebucore_schema.py | claudio-walser/srgssr-publication-data-api | d4c671ca969da686db2efc0753fe388df3bd96fd | ["MIT"] | 1 star (2022-01-12) | 3 issues (2021-09-07) | 1 fork (2021-04-13)
import sgqlc.types
ebucore_schema = sgqlc.types.Schema()
########################################################################
# Scalars and Enumerations
########################################################################
Boolean = sgqlc.types.Boolean
String = sgqlc.types.String
########################################################################
# Input Objects
########################################################################
########################################################################
# Output Objects and Interfaces
########################################################################
class Agent(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('agent_name', 'has_role')
agent_name = sgqlc.types.Field(String, graphql_name='agentName')
has_role = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(String))), graphql_name='hasRole')
class Asset(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('asset_id', 'title', 'abstract', 'date', 'has_contributor')
asset_id = sgqlc.types.Field(sgqlc.types.non_null(String), graphql_name='assetId')
title = sgqlc.types.Field(String, graphql_name='title')
abstract = sgqlc.types.Field(String, graphql_name='abstract')
date = sgqlc.types.Field(String, graphql_name='date')
has_contributor = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(Agent))), graphql_name='hasContributor')
class BusinessObject(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('orientation', 'asset_id', 'title', 'abstract', 'date', 'has_contributor')
orientation = sgqlc.types.Field(String, graphql_name='orientation')
asset_id = sgqlc.types.Field(sgqlc.types.non_null(String), graphql_name='assetId')
title = sgqlc.types.Field(String, graphql_name='title')
abstract = sgqlc.types.Field(String, graphql_name='abstract')
date = sgqlc.types.Field(String, graphql_name='date')
has_contributor = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(Agent))), graphql_name='hasContributor')
class EditorialObject(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('approved_by', 'is_distributed_on', 'orientation', 'asset_id', 'title', 'abstract', 'date', 'has_contributor')
approved_by = sgqlc.types.Field(Agent, graphql_name='approvedBy')
is_distributed_on = sgqlc.types.Field(String, graphql_name='isDistributedOn')
orientation = sgqlc.types.Field(String, graphql_name='orientation')
asset_id = sgqlc.types.Field(sgqlc.types.non_null(String), graphql_name='assetId')
title = sgqlc.types.Field(String, graphql_name='title')
abstract = sgqlc.types.Field(String, graphql_name='abstract')
date = sgqlc.types.Field(String, graphql_name='date')
has_contributor = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(Agent))), graphql_name='hasContributor')
class Group(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('total_number_of_episodes', 'asset_id', 'title', 'abstract', 'date', 'has_contributor')
total_number_of_episodes = sgqlc.types.Field(String, graphql_name='totalNumberOfEpisodes')
asset_id = sgqlc.types.Field(sgqlc.types.non_null(String), graphql_name='assetId')
title = sgqlc.types.Field(String, graphql_name='title')
abstract = sgqlc.types.Field(String, graphql_name='abstract')
date = sgqlc.types.Field(String, graphql_name='date')
has_contributor = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(Agent))), graphql_name='hasContributor')
class Person(sgqlc.types.Interface):
__schema__ = ebucore_schema
__field_names__ = ('agent_name', 'date_of_birth', 'has_role')
agent_name = sgqlc.types.Field(String, graphql_name='agentName')
date_of_birth = sgqlc.types.Field(String, graphql_name='dateOfBirth')
has_role = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(String))), graphql_name='hasRole')
class Query(sgqlc.types.Type):
__schema__ = ebucore_schema
__field_names__ = ('assets',)
assets = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(Asset)), graphql_name='assets', args=sgqlc.types.ArgDict((
('ids', sgqlc.types.Arg(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(String))), graphql_name='ids', default=None)),
))
)
class Character(sgqlc.types.Type, Person, Agent):
__schema__ = ebucore_schema
__field_names__ = ('given_name', 'family_name', 'character_name')
given_name = sgqlc.types.Field(String, graphql_name='givenName')
family_name = sgqlc.types.Field(String, graphql_name='familyName')
character_name = sgqlc.types.Field(String, graphql_name='characterName')
class Episode(sgqlc.types.Type, EditorialObject, BusinessObject, Asset):
__schema__ = ebucore_schema
__field_names__ = ('has_manifestation',)
has_manifestation = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null('MediaResource'))), graphql_name='hasManifestation')
class MediaResource(sgqlc.types.Type, Asset):
__schema__ = ebucore_schema
__field_names__ = ('has_format', 'has_manifestation')
has_format = sgqlc.types.Field(String, graphql_name='hasFormat')
has_manifestation = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null('MediaResource'))), graphql_name='hasManifestation')
class Series(sgqlc.types.Type, Group, Asset):
__schema__ = ebucore_schema
__field_names__ = ('has_manifestation',)
has_manifestation = sgqlc.types.Field(sgqlc.types.non_null(sgqlc.types.list_of(sgqlc.types.non_null(MediaResource))), graphql_name='hasManifestation')
class Staff(sgqlc.types.Type, Person, Agent):
__schema__ = ebucore_schema
__field_names__ = ('given_name', 'family_name')
given_name = sgqlc.types.Field(String, graphql_name='givenName')
family_name = sgqlc.types.Field(String, graphql_name='familyName')
########################################################################
# Unions
########################################################################
########################################################################
# Schema Entry Points
########################################################################
ebucore_schema.query_type = Query
ebucore_schema.mutation_type = None
ebucore_schema.subscription_type = None
0330189515983ac5d7b75493a1a3b33135772c0a | 49,328 | py | Python | eureka/lib/demc.py | iancrossfield/Eureka | 88b178d1b830c16915045b6387cf91955e0071e2 | ["MIT"] | 15 stars (2020-08-07 to 2022-03-29) | 159 issues (2020-08-05 to 2022-03-31) | 17 forks (2021-06-16 to 2022-03-22)
import numpy as np
import numpy.random as npr
import time
from . import timer   # local progress-timer module; provides timer.Timer used below
from . import gelmanrubin as gr
#reload(gr)
#import python_models as mc
#import models_c as mc
import multiprocessing as mp
def calcModel(nchains, functype, myfuncs, pedit, nextp, iortholist, funcx, cummodels, numparams, j, iblock=None, chains=None):
'''
Compute model light curve by combining model components. Also returns correlated noise parameters.
'''
#Build final model from model components
ymodels = np.ones((nchains, fit[j].nobj))
noisepars = [[] for i in range(nchains)]
k = 0
    if chains is None:
        chains = range(nchains)
    if iblock is None:
        iblock = range(cummodels[j], cummodels[j+1])
    for i in range(cummodels[j], cummodels[j+1]):
        if i in iblock:
for n in chains:
if functype[i] == 'ortho':
#MODIFY COPY OF nextp ONLY
pedit[n,iortholist] = myfuncs[i](pedit[n,iortholist], funcx[i], fit[j].etc[k])
elif (functype[i] == 'ipmap') or (functype[i] == 'spline'):
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], ymodels[n])
elif functype[i] == 'posoffset':
# Record change in Position 0 => cannot orthogonalize position parameters
ymodels[n] *= myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
elif hasattr(fit[j], 'timebins') and (functype[i] == 'ecl/tr'
or functype[i] == 'ramp'
or functype[i] == 'sinusoidal'):
# Average over high-resolution model
hiresmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
if len(fit[j].timebins) == fit[j].nobj:
for tb in range(len(fit[j].timebins)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebins[tb]])
else:
for tb in range(len(fit[j].timebinsuc)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebinsuc[tb]])
elif functype[i] == 'noise':
noisepars[n] = pedit[n,numparams[i]:numparams[i+1]]
else:
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
k += 1
return ymodels, noisepars
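calcModel builds the final light curve by initializing each chain's model to unity and multiplying in one component at a time. A minimal sketch of that multiplicative composition, using two made-up components (a box-shaped eclipse and a linear ramp; the names and parameter layouts are illustrative, not Eureka's actual model functions):

```python
import numpy as np

# Hypothetical model components, each returning a unit-normalized curve.
def box_eclipse(params, t):
    depth, t1, t2 = params
    model = np.ones_like(t)
    model[(t >= t1) & (t <= t2)] -= depth   # flat-bottomed dip
    return model

def linear_ramp(params, t):
    slope, t0 = params
    return 1.0 + slope*(t - t0)

t = np.linspace(0., 1., 101)
ymodel = np.ones_like(t)                    # start from unity, as calcModel does
ymodel *= box_eclipse([0.01, 0.4, 0.6], t)  # multiply in each component
ymodel *= linear_ramp([0.02, 0.5], t)
```

Because every component is normalized to unity out of eclipse, the product keeps the flux scale of the data.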
# Calculate chi^2
def calcChisq(y, sigma, ymodels, nchains, nextp, j, noisepars, isrednoise, wavelet, noisefunc, chains=None):
'''
Compute chi-squared with priors.
'''
    if chains is None:
        chains = range(nchains)
chi2 = np.zeros(nchains)
for n in chains:
        if not isrednoise:
#chi2[n] = mc.chisq(ymodels[n], y, sigma)
chi2[n] += np.sum((ymodels[n] - y)**2 / sigma**2)
else:
chi2[n] = noisefunc(noisepars[n], ymodels[n]-y, wavelet)
# Apply prior, if one exists
if len(fit[j].ipriors) > 0:
pbar = fit[j].priorvals[:,0] #prior mean
psigma = np.zeros(len(pbar)) #prior standard deviation
# Determine psigma based on which side of asymmetric Gaussian nextp is on
for i in range(len(fit[j].ipriors)):
if nextp[n,fit[j].ipriors[i]] < pbar[i]:
psigma[i] = fit[j].priorvals[i,1]
else:
psigma[i] = fit[j].priorvals[i,2]
#chi2[n] += fit[j].nobj*((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
chi2[n] += ((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
return chi2
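calcChisq adds an asymmetric-Gaussian prior penalty on top of the data chi-squared: the prior width depends on which side of the prior mean the proposed value falls. A toy sketch of that term (all numbers below are made up for illustration):

```python
import numpy as np

def prior_chi2(value, pbar, sigma_lo, sigma_hi):
    # Asymmetric Gaussian prior: different widths below/above the mean.
    psigma = sigma_lo if value < pbar else sigma_hi
    return ((value - pbar) / psigma)**2

resid = np.array([0.1, -0.2, 0.05])         # data - model
unc   = np.array([0.1,  0.1, 0.1])
chi2  = np.sum((resid / unc)**2)            # data term
chi2 += prior_chi2(1.2, pbar=1.0,
                   sigma_lo=0.05, sigma_hi=0.1)   # prior term
```

Here the proposal (1.2) sits above the prior mean, so the wider upper-side sigma applies, contributing (0.2/0.1)² = 4 to the total.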
def demc_block(y, pars, pmin, pmax, stepsize, numit, sigma, numparams, cummodels, functype, myfuncs, funcx, iortholist, fits, gamma=None, isGR=True, ncpu=1):
"""
This function uses a differential evolution Markov chain with block updating to assess uncertainties.
PARAMETERS
----------
y: Array containing dependent data
Params: Array of initial guess for parameters
#Pmin: Array of parameter minimum values
#Pmax: Array of parameter maximum values
stepsize: Array of 1-sigma change in parameter per iteration
Numit: Number of iterations to perform
Sigma: Standard deviation of data noise in y
Numparams: Number of parameters for each model
Cummodels: Cumulative number of models used
Functype: Define function type (eclipse, ramp, ip, etc), see models.py
Myfuncs: Pointers to model functions
Funcx: Array of x-axis values for myfuncs
fit: List of fit objects
    gamma: Multiplication factor in parameter differential, establishes acceptance rate
OUTPUTS
-------
This function returns an array of the best fitting parameters,
an array of all parameters over all iterations, and numaccept.
REFERENCES
----------
Cajo J. F. Ter Braak, "Genetic algorithms and Markov Chain Monte Carlo: Differential Evolution Markov Chain makes Bayesian computing easy," Biometrics, 2006.
HISTORY
-------
Adapted from mcmc.py
Kevin Stevenson, UChicago August 2012
"""
global fit
fit = fits
params = np.copy(pars)
nchains, nump = params.shape
nextp = np.copy(params) #Proposed parameters
bestp = np.copy(params[0]) #Best-fit parameters
pedit = np.copy(params) #Editable parameters
numaccept = 0
allparams = np.zeros((nump, nchains, numit))
inotfixed = np.where(stepsize != 0)[0]
ishare = np.where(stepsize < 0)[0]
#ifree = np.where(stepsize > 0)[0]
outside = np.zeros((nchains, nump))
numevents = len(fit)
    intsteps = max(1, int(np.min((numit//5, 1e5))))
isrednoise = False
wavelet = None
noisefunc = None
    #UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
for s in range(ishare.size):
params[:,ishare[s]] = params[:,int(abs(stepsize[ishare[s]])-1)]
#Define blocks
blocks = []
for j in range(numevents):
#Build list of blocks
blocks = np.concatenate((blocks, fit[j].blocks))
for i in range(cummodels[j],cummodels[j+1]):
if functype[i] == 'noise':
# Set up for modified chi-squared calculation using correlated noise
isrednoise = True
                wavelet = fit[j].etc[i-cummodels[j]]  # etc entry for this model component of event j
noisefunc = myfuncs[i]
blocks = blocks.astype(int)
iblocks = []
eps = []
numblocks = blocks.max() + 1
    numbp = np.zeros(numblocks, dtype=int)  # integer counts: used as an array shape below
ifree = [[] for i in range(numblocks)]
for b in range(numblocks):
#Map block indices
whereb = np.where(blocks == b)[0]
iblocks.append(whereb)
#Locate indices of free parameters in each block
for w in whereb:
ifree[b] = np.concatenate((ifree[b],numparams[w]+np.where(stepsize[numparams[w]:numparams[w+1]] > 0)[0])).astype(int)
#Calculate number of free parameters per block
numbp[b] += len(ifree[b])
eps.append(npr.normal(0, stepsize[ifree[b]]/100., [numit,numbp[b]]))
print("Number of free parameters per block:")
print(numbp)
numa = np.zeros(numblocks)
    if gamma is None:
gamma = 2.38/np.sqrt(2.*numbp)
print("gamma:")
print(gamma)
#Calc chi-squared for model type using current params
currchisq = np.zeros(nchains)
currmodel = [[] for i in range(numevents)]
for j in range(numevents):
currmodel[j], noisepars = calcModel(nchains, functype, myfuncs, pedit, params, iortholist[j],
funcx, cummodels, numparams, j)
currchisq += calcChisq(y[j], sigma[j], currmodel[j], nchains, params, j, noisepars, isrednoise, wavelet, noisefunc)
bestchisq = currchisq[0]
#GENERATE RANDOM NUMBERS FOR MCMC
numnotfixed = len(inotfixed)
unif = npr.rand(numit,nchains)
randchains = npr.randint(0,nchains,[numit,nchains,2])
#START TIMER
clock = timer.Timer(numit,progress = np.arange(0.05,1.01,0.05))
#Run Differential Evolution Monte Carlo algorithm 'numit' times
for m in range(numit):
#Select next event (block) to update
b = m % numblocks
#Remove model component(s) that are taking a step
pedit = np.copy(params)
nextmodel = currmodel[:]
for j in range(numevents):
ymodels, noisepars = calcModel(nchains, functype, myfuncs, pedit, params, iortholist[j],
funcx, cummodels, numparams, j, iblocks[b])
nextmodel[j] = np.divide(currmodel[j],ymodels)
#Generate next step using differential evolution
for n in range(nchains):
rand1, rand2 = randchains[m,n]
while rand1 == n or rand2 == n or rand1 == rand2:
rand1, rand2 = npr.randint(0,nchains,2)
nextp[n,ifree[b]] = params[n,ifree[b]] + gamma[b]*(params[rand1,ifree[b]]-params[rand2,ifree[b]]) + eps[b][m]
#CHECK FOR NEW STEPS OUTSIDE BOUNDARIES
ioutside = np.where(np.bitwise_or(nextp[n] < pmin, nextp[n] > pmax))[0]
if (len(ioutside) > 0):
nextp[n,ioutside] = np.copy(params[n,ioutside])
outside[n,ioutside] += 1
        #UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
for s in range(ishare.size):
nextp[:,ishare[s]] = nextp[:,int(abs(stepsize[ishare[s]])-1)]
#COMPUTE NEXT CHI SQUARED AND ACCEPTANCE VALUES
pedit = np.copy(nextp)
nextchisq = np.zeros(nchains)
for j in range(numevents):
ymodels, noisepars = calcModel(nchains, functype, myfuncs, pedit, params, iortholist[j], funcx, cummodels, numparams, j, iblocks[b])
nextmodel[j] = np.multiply(nextmodel[j],ymodels)
nextchisq += calcChisq(y[j], sigma[j], nextmodel[j], nchains, params, j, noisepars, isrednoise, wavelet, noisefunc)
#CALCULATE ACCEPTANCE PROBABILITY
accept = np.exp(0.5 * (currchisq - nextchisq))
#print(b,currchisq[0], nextchisq[0], accept[0])
for n in range(nchains):
if accept[n] >= 1:
#ACCEPT BETTER STEP
numaccept += 1
numa[b] += 1
params[n] = np.copy(nextp[n])
currchisq[n] = np.copy(nextchisq[n])
if (currchisq[n] < bestchisq):
bestp = np.copy(params[n])
bestchisq = np.copy(currchisq[n])
elif unif[m,n] <= accept[n]:
#ACCEPT WORSE STEP
numaccept += 1
numa[b] += 1
params[n] = np.copy(nextp[n])
currchisq[n] = np.copy(nextchisq[n])
allparams[:,:,m] = params.T
#PRINT INTERMEDIATE INFO
if ((m+1) % intsteps == 0) and (m > 0):
print("\n" + time.ctime())
#print("Number of times parameter tries to step outside its prior:")
#print(outside)
print("Current Best Parameters: ")
print(bestp)
#Apply Gelman-Rubin statistic
if isGR:
#Check for no accepted steps in each chain
#stdev = np.std(allparams[inotfixed],axis=1)
#ichain = np.where(stdev > 0.)[0]
#Call test
#psrf, meanpsrf = gr.convergetest(allparams[inotfixed,ichain,:m+1], len(ichain))
psrf, meanpsrf = gr.convergetest(allparams[inotfixed,:,:m+1], nchains)
numconv = np.sum(np.bitwise_and(psrf < 1.01, psrf >= 1.00))
print("Gelman-Rubin statistic for free parameters:")
print(psrf)
if numconv == numnotfixed: #and m >= 1e4:
print("All parameters have converged to within 1% of unity. Halting MCMC.")
allparams = allparams[:,:,:m+1]
break
clock.check(m+1)
print("Acceptance rate per block (%):")
print(100.*numa*numblocks/numit/nchains)
allparams = np.reshape(allparams,(nump, (m+1)*nchains))
return allparams, bestp, numaccept, (m+1)*nchains
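The loop above implements the Ter Braak (2006) differential-evolution proposal per block. Stripped of the light-curve machinery, the core update is only a few lines; the quadratic chi-squared target below is a stand-in for the real likelihood, and the chain count and dimensionality are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
nchains, nump = 8, 2
gamma  = 2.38/np.sqrt(2.*nump)              # standard DE-MC scaling
params = rng.normal(0., 1., (nchains, nump))
chi2   = np.sum(params**2, axis=1)          # toy target: chi2 = |x|^2

for m in range(200):
    for n in range(nchains):
        # Difference of two other chains drives the proposal.
        r1, r2 = rng.choice([i for i in range(nchains) if i != n],
                            2, replace=False)
        eps  = rng.normal(0., 1e-4, nump)   # small jitter, like the eps term
        prop = params[n] + gamma*(params[r1] - params[r2]) + eps
        newchi2 = np.sum(prop**2)
        # Metropolis acceptance, exp(0.5 * (chi2_cur - chi2_new))
        if rng.random() <= np.exp(0.5*(chi2[n] - newchi2)):
            params[n], chi2[n] = prop, newchi2
```

The attraction of DE-MC is visible here: the proposal scale adapts automatically to the posterior because it is built from the current spread of the chains.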
#****************************************************************
def calcChi2(nchains, functype, myfuncs, pedit, nextp, iortholist, funcx, cummodels, numparams, j, isrednoise, wavelet, noisefunc, systematics, chains=None):
    '''
    Compute the model light curve by combining model components,
    then return the chi-squared (including priors) for each chain.
    '''
#Build final model from model components
ymodels = np.ones((nchains, fit[j].nobj))
noisepars = [[] for i in range(nchains)]
k = 0
    if chains is None:
chains = range(nchains)
for i in range(cummodels[j],cummodels[j+1]):
for n in chains:
if functype[i] == 'ortho':
#MODIFY COPY OF nextp ONLY
pedit[n,iortholist] = myfuncs[i](pedit[n,iortholist], funcx[i], fit[j].etc[k])
elif (functype[i] == 'ipmap') or (functype[i] == 'spline'):
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], ymodels[n])
elif functype[i] == 'posoffset':
# Record change in Position 0 => cannot orthogonalize position parameters
ymodels[n] *= myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
elif hasattr(fit[j], 'timebins') and (functype[i] == 'ecl/tr'
or functype[i] == 'ramp'
or functype[i] == 'sinusoidal'):
# Average over high-resolution model
hiresmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
if len(fit[j].timebins) == fit[j].nobj:
for tb in range(len(fit[j].timebins)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebins[tb]])
else:
for tb in range(len(fit[j].timebinsuc)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebinsuc[tb]])
elif functype[i] == 'noise':
noisepars[n] = pedit[n,numparams[i]:numparams[i+1]]
else:
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
k += 1
# Calculate chi^2
chi2 = np.zeros(nchains)
for n in chains:
        if not isrednoise:
#chi2[n] = mc.chisq(ymodels[n]*systematics[n][j], data[j], unc[j])
chi2[n] = np.sum((ymodels[n]*systematics[n][j] - data[j])**2 / unc[j]**2)
else:
chi2[n] = noisefunc(noisepars[n], ymodels[n]*systematics[n][j]-data[j], wavelet)
# Apply prior, if one exists
if len(fit[j].ipriors) > 0:
pbar = fit[j].priorvals[:,0] #prior mean
psigma = np.zeros(len(pbar)) #prior standard deviation
# Determine psigma based on which side of asymmetric Gaussian nextp is on
for i in range(len(fit[j].ipriors)):
if nextp[n,fit[j].ipriors[i]] < pbar[i]:
psigma[i] = fit[j].priorvals[i,1]
else:
psigma[i] = fit[j].priorvals[i,2]
#chi2[n] += fit[j].nobj*((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
chi2[n] += ((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
return chi2
#
def writeChi2(chi2):
    '''
    Callback that accumulates chi-squared values returned by
    worker processes after multiprocessing.
    '''
global nextchisq
nextchisq += chi2
return
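writeChi2 is the callback half of a dispatch/accumulate pattern: each worker evaluates a partial chi-squared and the callback sums the results in the parent process. A self-contained sketch of the same pattern (using a ThreadPool so the example runs anywhere; the toy _partial_chi2 worker is made up):

```python
from multiprocessing.pool import ThreadPool
import numpy as np

total = np.zeros(1)                 # plays the role of the global nextchisq

def _partial_chi2(resid):
    # Worker: chi-squared contribution of one chunk of residuals.
    return np.sum(np.asarray(resid)**2)

def _accumulate(chi2):
    # Callback: runs in the parent, like writeChi2.
    total[0] += chi2

pool = ThreadPool(2)
for resid in ([1., 2.], [3.], [0.5, 0.5]):
    pool.apply_async(_partial_chi2, (resid,), callback=_accumulate)
pool.close()
pool.join()                         # all callbacks have fired after join()
```

The callback always executes in the parent process, which is why accumulating into a shared global is safe here without locks.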
def demc(y, pars, pmin, pmax, stepsize, numit, sigma, numparams, cummodels, functype, myfuncs, funcx, iortholist, nights, fits, gamma=None, isGR=True, ncpu=1):
"""
This function uses a differential evolution Markov chain to assess uncertainties.
PARAMETERS
----------
y: Array containing dependent data
Params: Array of initial guess for parameters
#Pmin: Array of parameter minimum values
#Pmax: Array of parameter maximum values
stepsize: Array of 1-sigma change in parameter per iteration
Numit: Number of iterations to perform
Sigma: Standard deviation of data noise in y
Numparams: Number of parameters for each model
Cummodels: Cumulative number of models used
Functype: Define function type (eclipse, ramp, ip, etc), see models.py
Myfuncs: Pointers to model functions
Funcx: Array of x-axis values for myfuncs
fit: List of fit objects
    gamma: Multiplication factor in parameter differential, establishes acceptance rate
OUTPUTS
-------
This function returns an array of the best fitting parameters,
an array of all parameters over all iterations, and numaccept.
REFERENCES
----------
Cajo J. F. Ter Braak, "Genetic algorithms and Markov Chain Monte Carlo: Differential Evolution Markov Chain makes Bayesian computing easy," Biometrics, 2006.
HISTORY
-------
Adapted from mcmc.py
Kevin Stevenson, UChicago August 2012
Multiplied prior by number of points in fit
January 2014
"""
global nextchisq, fit, data, unc
fit = fits
data = y
unc = sigma
params = np.copy(pars)
nchains, nump = params.shape
nextp = np.copy(params) #Proposed parameters
bestp = np.copy(params[0]) #Best-fit parameters
pedit = np.copy(params) #Editable parameters
numaccept = 0
#allparams must be 64-bit!
allparams = np.zeros((nump, nchains, numit))
inotfixed = np.where(stepsize != 0)[0]
ishare = np.where(stepsize < 0)[0]
ifree = np.where(stepsize > 0)[0]
outside = np.zeros((nchains, nump))
numevents = len(fit)
    intsteps = max(1, int(np.min((numit//5, 1e5))))
isrednoise = False
wavelet = None
noisefunc = None
numfree = len(ifree)
print("Number of free parameters:")
print(len(ifree))
    if gamma is None:
gamma = 2.38/np.sqrt(2*numfree)
print('Gamma = ' + str(gamma))
    #UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
for s in range(ishare.size):
params[:,ishare[s]] = params[:,int(abs(stepsize[ishare[s]])-1)]
# Construct non-analytic systematic model
for nn in np.unique(nights):
tonight = np.where(nights == nn)[0]
        if hasattr(fit[tonight[0]], 'whiteparams') and fit[tonight[0]].whiteparams is not None:
            if isinstance(fit[tonight[0]].whiteparams, np.ndarray):
#Only 1 model in model for white LC, grandfathered code
#print("WARNING: You are using grandfathered code. Update whiteparams to handle multiple models.")
#whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
i = int(fit[tonight[0]].whiteparams[0])
whitemodel = myfuncs[i](fit[tonight[0]].whiteparams[1:], fit[tonight[0]].tuall, None)
else:
#Any number of models can be used to build white LC
whitemodel = np.ones((nchains,len(fit[tonight[0]].good)))
for k in range(len(fit[tonight[0]].whiteparams)):
i = int(fit[tonight[0]].whiteparams[k][0])
whitemodel *= myfuncs[i](fit[tonight[0]].whiteparams[k][1:], fit[tonight[0]].tuall, None)
#whitemodel *= myfuncs[i](fit[tonight[0]].whiteparams[k][1:], funcxuc[i], None)
for j in tonight:
fit[j].whitemodel = whitemodel
elif hasattr(fit[tonight[0]], 'iswhitelc') and fit[tonight[0]].iswhitelc != False:
whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
weight = np.zeros((nchains,len(fit[tonight[0]].good)))
for n in range(nchains):
for j in tonight:
k = 0
for i in range(cummodels[j],cummodels[j+1]):
if functype[i] == 'ecl/tr':
specmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
specmodeluc = np.zeros(len(fit[j].clipmask))
specmodeluc[fit[j].isclipmask] = specmodel
whitemodel[n,fit[j].isgood] += specmodeluc
weight [n,fit[j].isgood] += specmodel[0]
k += 1
whitemodel[n] /= weight[n]
#FINDME: Need to determine exact anchor point
#slope = fit[0].iswhitelc / (1-whitemodel[n].min())
#offset = 1 - slope
#whitemodel[n] = slope*whitemodel[n] + offset
for j in tonight:
fit[j].whitemodel = whitemodel
else:
for j in tonight:
fit[j].whitemodel = np.ones((nchains,len(fit[j].good)))
#Calc chi-squared for model type using current params
currchisq = np.zeros(nchains)
noisepars = [[] for i in range(nchains)]
for j in range(numevents):
#Build final model from model components
ymodels = np.ones((nchains, fit[j].nobj))
k = 0
for i in range(cummodels[j],cummodels[j+1]):
for n in range(nchains):
if functype[i] == 'ortho':
#MODIFY COPY OF nextp ONLY
pedit[n,iortholist[j]] = myfuncs[i](pedit[n,iortholist[j]], funcx[i], fit[j].etc[k])
elif (functype[i] == 'ipmap') or (functype[i] == 'spline'):
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], ymodels[n])
elif functype[i] == 'posoffset':
# Record change in Position 0 => cannot orthogonalize position parameters
ymodels[n] *= myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
elif hasattr(fit[j], 'timebins') and (functype[i] == 'ecl/tr'
or functype[i] == 'ramp'
or functype[i] == 'sinusoidal'):
# Average over high-resolution model
hiresmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
if len(fit[j].timebins) == fit[j].nobj:
for tb in range(len(fit[j].timebins)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebins[tb]])
else:
for tb in range(len(fit[j].timebinsuc)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebinsuc[tb]])
elif functype[i] == 'noise':
# Set up for modified chi-squared calculation using correlated noise
isrednoise = True
wavelet = fit[j].etc[k]
noisefunc = myfuncs[i]
noisepars[n] = pedit[n,numparams[i]:numparams[i+1]]
else:
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
k += 1
#Multiply analytic model by non-analytic systematics model
systematics = fit[j].whitelc*fit[j].refspeclc/(fit[j].whitemodel[n][fit[j].isgood].flatten())[fit[j].isclipmask]
ymodels[n] *= systematics
# Calculate chi^2
for n in range(nchains):
if not isrednoise:
currchisq[n] += np.sum((ymodels[n] - y[j])**2 / sigma[j]**2)
#currchisq[n] += mc.chisq(ymodels[n], y[j], sigma[j])
else:
currchisq[n] += noisefunc(noisepars[n], ymodels[n]-y[j], wavelet)
# Apply prior, if one exists
if len(fit[j].ipriors) > 0:
pbar = fit[j].priorvals[:,0] #prior mean
psigma = np.zeros(len(pbar)) #prior standard deviation
# Determine psigma based on which side of asymmetric Gaussian nextp is on
for i in range(len(fit[j].ipriors)):
if nextp[n,fit[j].ipriors[i]] < pbar[i]:
psigma[i] = fit[j].priorvals[i,1]
else:
psigma[i] = fit[j].priorvals[i,2]
#currchisq[n] += fit[j].nobj*((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
currchisq[n] += ((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
bestchisq = currchisq[0]
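The prior block above applies an asymmetric Gaussian: the penalty uses the lower-side width when the proposed value sits below the prior mean and the upper-side width otherwise. A minimal standalone sketch of that penalty (all values hypothetical):

```python
def prior_chisq(p, pbar, sig_lo, sig_hi):
    """Chi-squared penalty from an asymmetric Gaussian prior:
    use the lower width below the mean, the upper width above it."""
    psigma = sig_lo if p < pbar else sig_hi
    return ((p - pbar) / psigma)**2

# Hypothetical prior: mean 1.0, widths 0.1 (below) and 0.2 (above)
below = prior_chisq(0.9, 1.0, 0.1, 0.2)  # penalized with sig_lo
above = prior_chisq(1.2, 1.0, 0.1, 0.2)  # penalized with sig_hi
```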
#GENERATE RANDOM NUMBERS FOR MCMC
numnotfixed = len(inotfixed)
unif = npr.rand(numit,nchains)
#START TIMER
clock = timer.Timer(numit,progress = np.arange(0.05,1.01,0.05))
#Run Differential Evolution Monte Carlo algorithm 'numit' times
b = gamma*stepsize[ifree]/100.
for m in range(numit):
for n in range(nchains):
#Generate next step using differential evolution
rand1, rand2 = npr.randint(0,nchains,2)
while rand1 == n or rand2 == n or rand1 == rand2:
rand1, rand2 = npr.randint(0,nchains,2)
nextp[n,ifree] = params[n,ifree] + gamma*(params[rand1,ifree]-params[rand2,ifree]) \
+ npr.normal(0, b, numfree)
#CHECK FOR NEW STEPS OUTSIDE BOUNDARIES
ioutside = np.where(np.bitwise_or(nextp[n] < pmin, nextp[n] > pmax))[0]
if (len(ioutside) > 0):
nextp[n,ioutside] = np.copy(params[n,ioutside])
outside[n,ioutside] += 1
#UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
for s in range(ishare.size):
nextp[:,ishare[s]] = nextp[:,int(abs(stepsize[ishare[s]])-1)]
# Construct non-analytic systematic model
for nn in np.unique(nights):
tonight = np.where(nights == nn)[0]
if hasattr(fit[tonight[0]], 'whiteparams') and fit[tonight[0]].whiteparams is not None:
pass
elif hasattr(fit[tonight[0]], 'iswhitelc') and fit[tonight[0]].iswhitelc != False:
whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
weight = np.zeros((nchains,len(fit[tonight[0]].good)))
for n in range(nchains):
for j in tonight:
k = 0
for i in range(cummodels[j],cummodels[j+1]):
if functype[i] == 'ecl/tr':
specmodel = myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
specmodeluc = np.zeros(len(fit[j].clipmask))
specmodeluc[fit[j].isclipmask] = specmodel
whitemodel[n,fit[j].isgood] += specmodeluc
weight[n,fit[j].isgood] += specmodel[0]
k += 1
whitemodel[n] /= weight[n]
#FINDME: Need to determine exact anchor point
#Also modify statement in w6model.py
#slope = fit[0].iswhitelc / (1-whitemodel[n].min())
#offset = 1 - slope
#whitemodel[n] = slope*whitemodel[n] + offset
for j in tonight:
fit[j].whitemodel = whitemodel
else:
for j in tonight:
fit[j].whitemodel = np.ones((nchains,len(fit[j].good)))
# Assemble systematics models
systematics = [[] for n in range(nchains)]
for n in range(nchains):
for j in range(numevents):
systematics[n].append(fit[j].whitelc*fit[j].refspeclc/(fit[j].whitemodel[n][fit[j].isgood].flatten())[fit[j].isclipmask])
#systematics[n].append(((fit[j].whitelc/whitemodel[n])[fit[j].isgood].flatten())[fit[j].isclipmask])
#COMPUTE NEXT CHI SQUARED AND ACCEPTANCE VALUES
pedit = np.copy(nextp)
nextchisq = np.zeros(nchains)
if ncpu == 1:
# Only 1 CPU
for j in range(numevents):
nextchisq += calcChi2(nchains, functype, myfuncs, pedit, nextp, iortholist[j], funcx, cummodels, numparams, j, isrednoise=isrednoise, wavelet=wavelet, noisefunc=noisefunc, systematics=systematics)
else:
# Multiple CPUs
# Code works but is less efficient
pool = mp.Pool(ncpu)
for j in range(numevents):
res = pool.apply_async(calcChi2, args=(nchains, functype, myfuncs, pedit, nextp, iortholist[j], funcx, cummodels, numparams, j, isrednoise, wavelet, noisefunc, systematics), callback=writeChi2)
pool.close()
pool.join()
res.wait()
#CALCULATE ACCEPTANCE PROBABILITY
accept = np.exp(0.5 * (currchisq - nextchisq))
for n in range(nchains):
if (accept[n] >= 1) or (unif[m,n] <= accept[n]):
#ACCEPT STEP
numaccept += 1
params[n] = np.copy(nextp[n])
currchisq[n] = nextchisq[n]
if (currchisq[n] < bestchisq):
bestp = np.copy(params[n])
bestchisq = np.copy(currchisq[n])
allparams[:,:,m] = params.T
#PRINT INTERMEDIATE INFO
if ((m+1) % intsteps == 0) and (m > 0):
print("\n" + time.ctime())
#print("Number of times parameter tries to step outside its prior:")
#print(outside)
print("Current Best Parameters: ")
print(bestp)
#Apply Gelman-Rubin statistic
if isGR:
#Check for no accepted steps in each chain
stdev = np.std(allparams[inotfixed[0],:,:m+1],axis=1)
ichain = np.where(stdev > 1e-8)[0]
#Call test
freeparams = allparams[inotfixed]
psrf, meanpsrf = gr.convergetest(freeparams[:,ichain,:m+1], len(ichain))
#psrf, meanpsrf = gr.convergetest(allparams[inotfixed,:,:m+1], nchains)
numconv = np.sum(np.bitwise_and(psrf < 1.01, psrf >= 1.00))
print("Gelman-Rubin statistic for free parameters:")
print(psrf)
if numconv == numnotfixed: #and j >= 1e4:
print("All parameters have converged to within 1% of unity. Halting MCMC.")
allparams = allparams[:,:,:m+1]
break
clock.check(m+1)
#Check for no accepted steps in each chain
stdev = np.std(allparams[inotfixed[0]],axis=1)
ichain = np.where(stdev > 1e-8)[0]
print("Number of good chains: " + str(len(ichain)))
#print(len(ichain), ichain)
#print(stdev)
allparams = allparams[:,ichain]
allparams = np.reshape(allparams,(nump, (m+1)*len(ichain)))
return allparams, bestp, numaccept, (m+1)*len(ichain)
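demc() above implements ter Braak's differential-evolution proposal, nextp = p_n + gamma*(p_r1 - p_r2) + eps, accepted via the Metropolis rule exp(0.5*(chi2_cur - chi2_next)). A self-contained toy version on a 1-D Gaussian target (all names and values hypothetical):

```python
import numpy as np

def demc_sweep(params, chisq_fn, gamma, b, rng):
    """One DE-MC sweep: propose from chain differences, Metropolis-accept."""
    nchains, nump = params.shape
    chisq = np.array([chisq_fn(p) for p in params])
    for n in range(nchains):
        # Pick two distinct companion chains, both different from n
        r1, r2 = rng.integers(0, nchains, 2)
        while r1 == n or r2 == n or r1 == r2:
            r1, r2 = rng.integers(0, nchains, 2)
        nextp = params[n] + gamma*(params[r1] - params[r2]) + rng.normal(0, b, nump)
        nextchisq = chisq_fn(nextp)
        # Metropolis rule: accept with probability exp(-delta_chi2 / 2)
        if rng.random() <= np.exp(0.5*(chisq[n] - nextchisq)):
            params[n], chisq[n] = nextp, nextchisq
    return params, chisq

# Toy target: chi^2 = (x - 3)^2, i.e. a unit Gaussian centered on 3
rng = np.random.default_rng(0)
params = rng.normal(0.0, 1.0, (10, 1))
for _ in range(2000):
    params, chisq = demc_sweep(params, lambda p: float((p[0] - 3.0)**2),
                               2.38/np.sqrt(2.0), 0.01, rng)
```

After a few thousand sweeps the chains should cluster around the target mean of 3.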
def demcz(y, pars, stdpburnin, pmin, pmax, stepsize, numit, sigma, numparams, cummodels, functype, myfuncs, funcx, iortholist, nights, fits, gamma=None, isGR=True, ncpu=1):
"""
This function uses a differential evolution Markov chain with fewer chains to assess uncertainties.
PARAMETERS
----------
y: Array containing dependent data
Params: Array of initial guess for parameters
stdpburnin:Standard deviation of allparams from burn-in
Pmin: Array of parameter minimum values
Pmax: Array of parameter maximum values
stepsize: Array of 1-sigma change in parameter per iteration
Numit: Number of iterations to perform
Sigma: Standard deviation of data noise in y
Numparams: Number of parameters for each model
Cummodels: Cumulative number of models used
Functype: Define function type (eclipse, ramp, ip, etc), see models.py
Myfuncs: Pointers to model functions
Funcx: Array of x-axis values for myfuncs
fit: List of fit objects
gamma: Multiplication factor in parameter differential, establishes acceptance rate
OUTPUTS
-------
This function returns an array of the best fitting parameters,
an array of all parameters over all iterations, and numaccept.
REFERENCES
----------
Cajo J. F. Ter Braak, "Differential Evolution Markov Chain with snooker updater and fewer chains" Stat Comput, 2008.
HISTORY
-------
Adapted from mcmc.py August 2012
Kevin Stevenson, UChicago
Multiplied prior by number of points in fit January 2014
Adapted from demc() August 2014
"""
global nextchisq, fit, data, unc
fit = fits
data = y
unc = sigma
params = np.copy(pars)
nchains, nump = params.shape
nextp = np.copy(params) #Proposed parameters
bestp = np.copy(params[0]) #Best-fit parameters
pedit = np.copy(params) #Editable parameters
numaccept = 0
ifixed = np.where(stepsize == 0)[0] #Indices of fixed parameters
inotfixed = np.where(stepsize != 0)[0] #Indices of non-fixed parameters
ishare = np.where(stepsize < 0)[0] #Indices of shared parameters
ifree = np.where(stepsize > 0)[0] #Indices of free parameters
#outside = np.zeros((nchains, nump))
numevents = len(fit)
intsteps = np.min((numit/5,1e5)) #Number of steps before checking G-R statistic
isrednoise = False
wavelet = None
noisefunc = None
numfree = len(ifree)
print("Number of free parameters: " + str(len(ifree)))
if gamma is None:
gamma = 2.38/np.sqrt(2*numfree)
print('Gamma = ' + str(gamma))
#UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
ishareptr = []
for s in range(ishare.size):
ishareptr.append(int(abs(stepsize[ishare[s]])-1)) #Pointer to where parameter is shared from
params[:,ishare[s]] = params[:,ishareptr[s]]
#INITIALIZE FIRST 10*numfree PARAMETERS IN allparams
ninit = 10*numfree
numit += ninit
#if numit < (ninit + 10):
#    numit = ninit + 10
allparams = np.zeros((nump, nchains, numit)) #allparams must be 64-bit!
#Populate fixed parameters
allparams[ifixed,:,:ninit] = params[:,ifixed].T[:,:,np.newaxis]
#Populate free parameters
for p in ifree:
allparams[p,:,:ninit] = np.random.normal(params[0,p],stdpburnin[p],[nchains,ninit])
#Update shared parameters
if (ishare.size > 0):
allparams[ishare,:,:ninit] = allparams[ishareptr,:,:ninit]
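Shared parameters are encoded in stepsize: a negative entry means "copy the value of parameter |stepsize|-1" (a 1-based pointer). The decoding above can be sketched in isolation (hypothetical arrays):

```python
import numpy as np

# stepsize encoding: >0 free, ==0 fixed, <0 shared with parameter |s|-1
stepsize = np.array([0.1, 0.0, -1.0, 0.05])  # param 2 shares param 0's value
params = np.array([[1.0, 2.0, 9.9, 4.0]])    # one chain of four parameters

ishare = np.where(stepsize < 0)[0]
ishareptr = [int(abs(stepsize[s]) - 1) for s in ishare]
params[:, ishare] = params[:, ishareptr]     # param 2 now mirrors param 0
```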
# Construct non-analytic systematic model
for nn in np.unique(nights):
tonight = np.where(nights == nn)[0]
if hasattr(fit[tonight[0]], 'whiteparams') and fit[tonight[0]].whiteparams is not None:
if isinstance(fit[tonight[0]].whiteparams, np.ndarray):
#Only one model used to build the white LC; grandfathered code
#print("WARNING: You are using grandfathered code. Update whiteparams to handle multiple models.")
#whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
i = int(fit[tonight[0]].whiteparams[0])
whitemodel = myfuncs[i](fit[tonight[0]].whiteparams[1:], fit[tonight[0]].tuall, None)
else:
#Any number of models can be used to build white LC
whitemodel = np.ones((nchains,len(fit[tonight[0]].good)))
for k in range(len(fit[tonight[0]].whiteparams)):
i = int(fit[tonight[0]].whiteparams[k][0])
whitemodel *= myfuncs[i](fit[tonight[0]].whiteparams[k][1:], fit[tonight[0]].tuall, None)
#whitemodel *= myfuncs[i](fit[tonight[0]].whiteparams[k][1:], funcxuc[i], None)
for j in tonight:
fit[j].whitemodel = whitemodel
elif hasattr(fit[tonight[0]], 'iswhitelc') and fit[tonight[0]].iswhitelc != False:
whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
weight = np.zeros((nchains,len(fit[tonight[0]].good)))
for n in range(nchains):
for j in tonight:
k = 0
for i in range(cummodels[j],cummodels[j+1]):
if functype[i] == 'ecl/tr':
specmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
specmodeluc = np.zeros(len(fit[j].clipmask))
specmodeluc[fit[j].isclipmask] = specmodel
whitemodel[n,fit[j].isgood] += specmodeluc
weight[n,fit[j].isgood] += specmodel[0]
k += 1
whitemodel[n] /= weight[n]
#FINDME: Need to determine exact anchor point
#slope = fit[0].iswhitelc / (1-whitemodel[n].min())
#offset = 1 - slope
#whitemodel[n] = slope*whitemodel[n] + offset
for j in tonight:
fit[j].whitemodel = whitemodel
else:
for j in tonight:
fit[j].whitemodel = np.ones((nchains,len(fit[j].good)))
#Calc chi-squared for model type using current params
currchisq = np.zeros(nchains)
noisepars = [[] for i in range(nchains)]
for j in range(numevents):
#Build final model from model components
ymodels = np.ones((nchains, fit[j].nobj))
k = 0
for i in range(cummodels[j],cummodels[j+1]):
for n in range(nchains):
if functype[i] == 'ortho':
#MODIFY COPY OF nextp ONLY
pedit[n,iortholist[j]] = myfuncs[i](pedit[n,iortholist[j]], funcx[i], fit[j].etc[k])
elif (functype[i] == 'ipmap') or (functype[i] == 'spline'):
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], ymodels[n])
elif functype[i] == 'posoffset':
# Record change in Position 0 => cannot orthogonalize position parameters
ymodels[n] *= myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
elif hasattr(fit[j], 'timebins') and (functype[i] == 'ecl/tr'
or functype[i] == 'ramp'
or functype[i] == 'sinusoidal'):
# Average over high-resolution model
hiresmodel = myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
if len(fit[j].timebins) == fit[j].nobj:
for tb in range(len(fit[j].timebins)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebins[tb]])
else:
for tb in range(len(fit[j].timebinsuc)):
ymodels[n,tb] *= np.mean(hiresmodel[fit[j].timebinsuc[tb]])
elif functype[i] == 'noise':
# Set up for modified chi-squared calculation using correlated noise
isrednoise = True
wavelet = fit[j].etc[k]
noisefunc = myfuncs[i]
noisepars[n] = pedit[n,numparams[i]:numparams[i+1]]
else:
ymodels[n] *= myfuncs[i](pedit[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
k += 1
#Multiply analytic model by non-analytic systematics model
systematics = fit[j].whitelc*fit[j].refspeclc/(fit[j].whitemodel[n][fit[j].isgood].flatten())[fit[j].isclipmask]
ymodels[n] *= systematics
# Calculate chi^2
for n in range(nchains):
if not isrednoise:
#currchisq[n] += mc.chisq(ymodels[n], y[j], sigma[j])
currchisq[n] += np.sum((ymodels[n] - y[j])**2 / sigma[j]**2)
else:
currchisq[n] += noisefunc(noisepars[n], ymodels[n]-y[j], wavelet)
# Apply prior, if one exists
if len(fit[j].ipriors) > 0:
pbar = fit[j].priorvals[:,0] #prior mean
psigma = np.zeros(len(pbar)) #prior standard deviation
# Determine psigma based on which side of asymmetric Gaussian nextp is on
for i in range(len(fit[j].ipriors)):
if nextp[n,fit[j].ipriors[i]] < pbar[i]:
psigma[i] = fit[j].priorvals[i,1]
else:
psigma[i] = fit[j].priorvals[i,2]
#currchisq[n] += fit[j].nobj*((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
currchisq[n] += ((nextp[n,fit[j].ipriors[i]] - pbar[i])/psigma[i])**2
bestchisq = currchisq[0]
#GENERATE RANDOM NUMBERS FOR MCMC
unif = npr.rand(numit,nchains) #Acceptance
snooker = npr.rand(numit,nchains) #If <0.1, set gamma = 1 during step
randchain = npr.randint(0,nchains,[2,numit,nchains]) #Pairs of chains
b = gamma*stepsize[ifree]/100.
epsilon = npr.normal(0, b, [numit,nchains,numfree])
#START TIMER
clock = timer.Timer(numit-ninit,progress = np.arange(0.05,1.01,0.05))
#Run Differential Evolution Monte Carlo algorithm 'numit' times
numnotfixed = len(inotfixed)
for m in range(ninit,numit):
'''
#Code below is slower, possibly because array copies are made???
#Generate next step using differential evolution
randstep = npr.randint(0,m,[2,nchains])
nextp[:,ifree] = params[:,ifree] + gamma*(allparams[ifree][:,randchain[0,m],randstep[0]] \
- allparams[ifree][:,randchain[1,m],randstep[1]]).T \
+ epsilon[m]
'''
randstep = npr.randint(0,m,[2,nchains])
for n in range(nchains):
#Generate next step using differential evolution
if snooker[m,n] < 0.1:
#Set gamma = 1 to jump between modes (bimodal distribution)
nextp[n,ifree] = params[n,ifree] + (allparams[ifree,randchain[0,m,n],randstep[0,n]] \
- allparams[ifree,randchain[1,m,n],randstep[1,n]]) \
+ epsilon[m,n]
else:
nextp[n,ifree] = params[n,ifree] + gamma*(allparams[ifree,randchain[0,m,n],randstep[0,n]] \
- allparams[ifree,randchain[1,m,n],randstep[1,n]]) \
+ epsilon[m,n]
#CHECK FOR NEW STEPS OUTSIDE BOUNDARIES
ioutside = np.where(np.bitwise_or(nextp < pmin, nextp > pmax))
if (len(ioutside[0]) > 0):
nextp[ioutside] = np.copy(params[ioutside])
#UPDATE PARAMETER(S) EQUAL TO OTHER PARAMETER(S)
if (ishare.size > 0):
nextp[:,ishare] = nextp[:,ishareptr]
# Construct non-analytic systematic model
for nn in np.unique(nights):
tonight = np.where(nights == nn)[0]
if hasattr(fit[tonight[0]], 'whiteparams') and fit[tonight[0]].whiteparams is not None:
pass
elif hasattr(fit[tonight[0]], 'iswhitelc') and fit[tonight[0]].iswhitelc != False:
print("***WARNING: whiteparams not defined.***")
whitemodel = np.zeros((nchains,len(fit[tonight[0]].good)))
weight = np.zeros((nchains,len(fit[tonight[0]].good)))
for n in range(nchains):
for j in tonight:
k = 0
for i in range(cummodels[j],cummodels[j+1]):
if functype[i] == 'ecl/tr':
specmodel = myfuncs[i](nextp[n,numparams[i]:numparams[i+1]], funcx[i], fit[j].etc[k])
specmodeluc = np.zeros(len(fit[j].clipmask))
specmodeluc[fit[j].isclipmask] = specmodel
whitemodel[n,fit[j].isgood] += specmodeluc
weight[n,fit[j].isgood] += specmodel[0]
k += 1
whitemodel[n] /= weight[n]
#FINDME: Need to determine exact anchor point
#Also modify statement in w6model.py
#slope = fit[0].iswhitelc / (1-whitemodel[n].min())
#offset = 1 - slope
#whitemodel[n] = slope*whitemodel[n] + offset
for j in tonight:
fit[j].whitemodel = whitemodel
else:
for j in tonight:
fit[j].whitemodel = np.ones((nchains,len(fit[j].good)))
# Assemble systematics models
systematics = [[] for n in range(nchains)]
for n in range(nchains):
for j in range(numevents):
systematics[n].append(fit[j].whitelc*fit[j].refspeclc/(fit[j].whitemodel[n][fit[j].isgood].flatten())[fit[j].isclipmask])
#systematics[n].append(((fit[j].whitelc/whitemodel[n])[fit[j].isgood].flatten())[fit[j].isclipmask])
#COMPUTE NEXT CHI SQUARED AND ACCEPTANCE VALUES
pedit = np.copy(nextp)
nextchisq = np.zeros(nchains)
if ncpu == 1:
# Only 1 CPU
for j in range(numevents):
nextchisq += calcChi2(nchains, functype, myfuncs, pedit, nextp, iortholist[j], funcx, cummodels, numparams, j, isrednoise=isrednoise, wavelet=wavelet, noisefunc=noisefunc, systematics=systematics)
else:
# Multiple CPUs
# Code works but is less efficient
pool = mp.Pool(ncpu)
for j in range(numevents):
res = pool.apply_async(calcChi2, args=(nchains, functype, myfuncs, pedit, nextp, iortholist[j], funcx, cummodels, numparams, j, isrednoise, wavelet, noisefunc, systematics), callback=writeChi2)
pool.close()
pool.join()
res.wait()
#CALCULATE ACCEPTANCE PROBABILITY
accept = np.exp(0.5 * (currchisq - nextchisq))
for n in range(nchains):
if (accept[n] >= 1) or (unif[m,n] <= accept[n]):
#ACCEPT STEP
numaccept += 1
params[n] = np.copy(nextp[n])
currchisq[n] = nextchisq[n]
if (currchisq[n] < bestchisq):
bestp = np.copy(params[n])
bestchisq = np.copy(currchisq[n])
allparams[:,:,m] = params.T
#PRINT INTERMEDIATE INFO
if ((m+1-ninit) % intsteps == 0) and (m > ninit):
print("\n" + time.ctime())
#print("Number of times parameter tries to step outside its prior:")
#print(outside)
print("Current Best Parameters: ")
print(bestp)
#Apply Gelman-Rubin statistic
if isGR:
#Check for no accepted steps in each chain
stdev = np.std(allparams[inotfixed[0],:,ninit:m+1],axis=1)
ichain = np.where(stdev > 1e-8)[0]
if len(ichain) > 1:
#Call test
freeparams = allparams[inotfixed]
psrf, meanpsrf = gr.convergetest(freeparams[:,ichain,ninit:m+1], len(ichain))
#psrf, meanpsrf = gr.convergetest(allparams[inotfixed,:,:m+1], nchains)
numconv = np.sum(np.bitwise_and(psrf < 1.01, psrf >= 1.00))
print("Gelman-Rubin statistic for free parameters:")
print(psrf)
if numconv == numnotfixed: #and j >= 1e4:
print("All parameters have converged to within 1% of unity. Halting MCMC.")
allparams = allparams[:,:,:m+1]
break
clock.check(m+1-ninit)
#Check for no accepted steps in each chain
stdev = np.std(allparams[inotfixed[0]],axis=1)
ichain = np.where(stdev > 1e-8)[0]
print("Number of good chains: " + str(len(ichain)))
#FINDME
allparams = allparams[:,ichain,ninit:]
allparams = np.reshape(allparams,(nump, (m+1-ninit)*len(ichain)))
#allparams = allparams[:,ichain]
#allparams = np.reshape(allparams,(nump, (m+1)*len(ichain)))
return allparams, bestp, numaccept, (m+1-ninit)*len(ichain)
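Both samplers halt early when the Gelman-Rubin potential scale reduction factor (PSRF) from gr.convergetest falls below 1.01 for every free parameter. gr is external to this chunk, but the standard PSRF it presumably computes can be sketched as:

```python
import numpy as np

def psrf(chains):
    """Gelman-Rubin potential scale reduction factor for one parameter.
    chains: array of shape (nchains, nsteps)."""
    n = chains.shape[1]
    means = chains.mean(axis=1)
    B = n * means.var(ddof=1)               # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()   # mean within-chain variance
    var_hat = (n - 1)/n * W + B/n           # pooled posterior variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(1)
mixed = rng.normal(0.0, 1.0, (4, 5000))     # four well-mixed chains
stuck = mixed + np.arange(4)[:, None]       # chains offset from one another
```

Well-mixed chains give a PSRF near 1; chains stuck at different values give a PSRF well above 1, which is why the code only halts when all values drop below 1.01.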
# src/rez/data/tests/builds/packages/floob/floob/__init__.py
def hello():
return "yes this is floob"
# tests/api_testing/test_token.py
import datetime
import uuid
from eva.conf import settings
from codebase.models import (
User,
App,
UserSession,
AppSession
)
from codebase.utils.swaggerui import api
from codebase.utils.token import decode_token
from .base import (
BaseTestCase,
validate_default_error,
get_body_json
)
MAX_SESSION_PER_USER = int(settings.MAX_SESSION_PER_USER)
class _Base(BaseTestCase):
rs = api.spec.resources["token"]
class UserCreateToken(_Base):
"""POST /token - 用户获取 access_token
"""
def test_username_invalid(self):
"""用户名错误
"""
resp = self.api_post("/token", body={
"username": "notexist",
"password": "password",
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "username-or-password-incorrect")
def test_password_invalid(self):
"""密码错误
"""
resp = self.api_post("/token", body={
"username": self.current_user.username,
"password": "wrong",
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "username-or-password-incorrect")
def test_user_inactive(self):
"""用户被禁用
"""
self.current_user.is_active = False
self.db.commit()
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "user-inactive")
def test_get_token_success(self):
"""获取 token 成功
"""
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
spec = self.rs.post_token.op_spec["responses"]["200"]["schema"]
api.validate_object(spec, body)
data = body["data"]
payload = decode_token(data["access_token"])
self.assertEqual(payload["uid"], str(self.current_user.uuid))
def test_most_session_clean(self):
"""用户会话数量限制
"""
self.assertEqual(0, self.db.query(UserSession).count())
for _ in range(MAX_SESSION_PER_USER + 10):
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
self.assertEqual(resp.code, 200)
self.assertEqual(
MAX_SESSION_PER_USER, self.db.query(UserSession).count())
def test_most_sessions_expired(self):
"""用户会话过期
"""
self.assertEqual(0, self.db.query(UserSession).count())
for _ in range(MAX_SESSION_PER_USER + 10):
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
self.assertEqual(resp.code, 200)
for session in self.db.query(UserSession):
session.expires_in = datetime.datetime.utcnow()
self.db.commit()
self.assertEqual(6, self.db.query(UserSession).count())
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
self.assertEqual(resp.code, 200)
self.assertEqual(1, self.db.query(UserSession).count())
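test_most_session_clean above exercises the server-side cap: once a user holds MAX_SESSION_PER_USER live sessions, creating another evicts the oldest. A minimal sketch of that eviction policy (the cap value of 6 is assumed from settings; the data structure is hypothetical):

```python
from collections import deque

MAX_SESSION_PER_USER = 6  # mirrors settings.MAX_SESSION_PER_USER (value assumed)

def create_session(sessions, token):
    """Record a new session, evicting the oldest once over the cap."""
    sessions.append(token)
    while len(sessions) > MAX_SESSION_PER_USER:
        sessions.popleft()
    return sessions

sessions = deque()
for i in range(MAX_SESSION_PER_USER + 10):
    create_session(sessions, "token-%d" % i)
# Only the most recent MAX_SESSION_PER_USER tokens survive
```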
class UserTokenRefresh(_Base):
"""POST /token/refresh - 用户刷新 access_token
"""
def test_refresh_token_invalid(self):
"""无效的 refresh token
"""
for token in [None, "", "notexist"]:
resp = self.api_post("/token/refresh", body={
"refresh_token": token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "invalid-refresh-token")
def test_session_is_expired(self):
"""会话过期
"""
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
session = self.db.query(UserSession).filter_by(
refresh_token=refresh_token).first()
session.expires_in = datetime.datetime.utcnow()
self.db.commit()
resp = self.api_post("/token/refresh", body={
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "is-expired")
def test_user_inactive(self):
"""用户被禁用
"""
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
# TODO: why must we delete the cached user and query it again?
del self.current_user
self.current_user = self.db.query(User).filter_by(
username=self.current_username).first()
self.current_user.is_active = False
self.db.commit()
resp = self.api_post("/token/refresh", body={
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "user-inactive")
def test_refresh_token_success(self):
"""刷新成功
"""
resp = self.api_post("/token", body={
"username": self.current_username,
"password": self.current_password,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
resp = self.api_post("/token/refresh", body={
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
spec = self.rs.post_token_refresh.op_spec["responses"]["200"]["schema"]
api.validate_object(spec, body)
data = body["data"]
payload = decode_token(data["access_token"])
self.assertEqual(payload["uid"], str(self.current_user.uuid))
def test_refresh_token_session_expires(self):
"""用户刷新 access_token 时遇到会话过期
"""
password = "password"
user = User(username="username", password=password)
self.db.add(user)
self.db.commit()
resp = self.api_post("/token", body={
"username": user.username,
"password": password,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
for session in self.db.query(UserSession).filter_by(
user_id=user.id).all():
session.expires_in = (
datetime.datetime.utcnow() + datetime.timedelta(seconds=60))
resp = self.api_post("/token/refresh", body={
"refresh_token": refresh_token,
})
self.assertEqual(resp.code, 200)
sessions = self.db.query(UserSession).filter_by(user_id=user.id)
self.assertEqual(sessions.count(), 1)
session = sessions.first()
self.assertNotEqual(session.refresh_token, refresh_token)
class AppCreateToken(_Base):
"""POST /app/{id}/token - App 获取 access_token
"""
def test_app_id_invalid(self):
"""App ID 无效
"""
app_id = str(uuid.uuid4())
resp = self.api_post("/app_token", body={"app_id": app_id})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "incorrect-app-id")
def test_app_secret_invalid(self):
"""app_secret 错误
"""
app = App(user=self.current_user, name="fortest", app_secret="secret")
self.db.add(app)
self.db.commit()
resp = self.api_post("/app_token", body={
"app_id": str(app.app_id),
"app_secret": "wrong",
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "incorrect-app-id-or-secret")
def test_user_inactive(self):
"""用户被禁用
"""
self.current_user.is_active = False
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
resp = self.api_post("/app_token", body={
"app_id": str(app.app_id),
"app_secret": app_secret,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "user-inactive")
def test_app_inactive(self):
"""App 被禁用
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
app.is_active = False
self.db.add(app)
self.db.commit()
resp = self.api_post("/app_token", body={
"app_id": str(app.app_id),
"app_secret": app_secret,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "app-inactive")
def test_get_token_success(self):
"""获取 token 成功
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
resp = self.api_post("/app_token", body={
"app_id": str(app.app_id),
"app_secret": app_secret,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
spec = self.rs.post_app_token.op_spec["responses"]["200"]["schema"]
api.validate_object(spec, body)
data = body["data"]
payload = decode_token(data["access_token"])
self.assertEqual(payload["uid"], str(self.current_user.uuid))
class AppTokenRefresh(_Base):
"""POST /app/{id}/token/refresh - App 刷新 access_token
"""
def test_refresh_token_invalid(self):
"""无效的 refresh token
"""
app = App(user=self.current_user, name="fortest", app_secret="secret")
self.db.add(app)
self.db.commit()
for token in [None, "", "notexist"]:
resp = self.api_post("/app_token/refresh", body={
"app_id": str(app.app_id),
"refresh_token": token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "invalid-refresh-token")
def test_session_is_expired(self):
"""会话过期
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
app_id = str(app.app_id)
resp = self.api_post("/app_token", body={
"app_id": app_id,
"app_secret": app_secret,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
session = self.db.query(AppSession).filter_by(
refresh_token=refresh_token).first()
session.expires_in = datetime.datetime.utcnow()
self.db.commit()
resp = self.api_post("/app_token/refresh", body={
"app_id": app_id,
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "is-expired")
def test_user_inactive(self):
"""用户被禁用
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
app_id = str(app.app_id)
resp = self.api_post("/app_token", body={
"app_id": app_id,
"app_secret": app_secret,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
# TODO: why does the user need to be deleted and re-queried here?
del self.current_user
self.current_user = self.db.query(User).filter_by(
username=self.current_username).first()
self.current_user.is_active = False
self.db.commit()
resp = self.api_post("/app_token/refresh", body={
"app_id": app_id,
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "user-inactive")
def test_app_inactive(self):
"""App is inactive
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
app_id = str(app.app_id)
resp = self.api_post("/app_token", body={
"app_id": app_id,
"app_secret": app_secret,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
del app
app = self.db.query(App).filter_by(app_id=app_id).first()
app.is_active = False
self.db.commit()
resp = self.api_post("/app_token/refresh", body={
"app_id": app_id,
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "app-inactive")
def test_refresh_token_success(self):
"""Refresh succeeds
"""
app_secret = "secret"
app = App(user=self.current_user,
name="fortest", app_secret=app_secret)
self.db.add(app)
self.db.commit()
app_id = str(app.app_id)
resp = self.api_post("/app_token", body={
"app_id": app_id,
"app_secret": app_secret,
})
body = get_body_json(resp)
refresh_token = body["data"]["refresh_token"]
resp = self.api_post("/app_token/refresh", body={
"app_id": app_id,
"refresh_token": refresh_token,
})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
spec = self.rs.post_token_refresh.op_spec["responses"]["200"]["schema"]
api.validate_object(spec, body)
data = body["data"]
payload = decode_token(data["access_token"])
self.assertEqual(payload["uid"], str(self.current_user.uuid))
class CreateOpenToken(_Base):
"""POST /open_token - Obtain an access_token directly
"""
def test_username_invalid(self):
"""Incorrect username
"""
resp = self.api_post("/open_token", body={
"username": "notexist",
})
body = get_body_json(resp)
self.assertEqual(resp.code, 400)
validate_default_error(body)
self.assertEqual(body["status"], "username-incorrect")
def test_get_token_success(self):
"""Token obtained successfully
"""
username = "wechat"
resp = self.api_post("/user", body={"username": username})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
resp = self.api_post("/open_token", body={"username": username})
body = get_body_json(resp)
self.assertEqual(resp.code, 200)
self.validate_default_success(body)
spec = self.rs.post_open_token.op_spec["responses"]["200"]["schema"]
api.validate_object(spec, body)
user = self.db.query(User).filter_by(username=username).first()
data = body["data"]
payload = decode_token(data["access_token"])
self.assertEqual(payload["uid"], str(user.uuid))
# examples/grids/grid_bpu/k12p6/k12p6_pss.py (pydae/pydae)
import numpy as np
import numba
import scipy.optimize as sopt
import json
sin = np.sin
cos = np.cos
atan2 = np.arctan2
sqrt = np.sqrt
sign = np.sign
exp = np.exp
class k12p6_pss_class:
def __init__(self):
self.t_end = 10.000000
self.Dt = 0.0010000
self.decimation = 10.000000
self.itol = 1e-6
self.Dt_max = 0.001000
self.Dt_min = 0.001000
self.solvern = 5
self.imax = 100
self.N_x = 45
self.N_y = 60
self.N_z = 15
self.N_store = 10000
self.params_list = ['S_base', 'g_1_5', 'b_1_5', 'bs_1_5', 'g_2_6', 'b_2_6', 'bs_2_6', 'g_3_11', 'b_3_11', 'bs_3_11', 'g_4_10', 'b_4_10', 'bs_4_10', 'g_5_6', 'b_5_6', 'bs_5_6', 'g_6_7', 'b_6_7', 'bs_6_7', 'g_7_8', 'b_7_8', 'bs_7_8', 'g_8_9', 'b_8_9', 'bs_8_9', 'g_9_10', 'b_9_10', 'bs_9_10', 'g_10_11', 'b_10_11', 'bs_10_11', 'U_1_n', 'U_2_n', 'U_3_n', 'U_4_n', 'U_5_n', 'U_6_n', 'U_7_n', 'U_8_n', 'U_9_n', 'U_10_n', 'U_11_n', 'S_n_1', 'Omega_b_1', 'H_1', 'T1d0_1', 'T1q0_1', 'X_d_1', 'X_q_1', 'X1d_1', 'X1q_1', 'D_1', 'R_a_1', 'K_delta_1', 'K_sec_1', 'K_a_1', 'K_ai_1', 'T_r_1', 'V_min_1', 'V_max_1', 'K_aw_1', 'Droop_1', 'T_gov_1_1', 'T_gov_2_1', 'T_gov_3_1', 'K_imw_1', 'omega_ref_1', 'T_wo_1', 'T_1_1', 'T_2_1', 'K_stab_1', 'V_lim_1', 'S_n_2', 'Omega_b_2', 'H_2', 'T1d0_2', 'T1q0_2', 'X_d_2', 'X_q_2', 'X1d_2', 'X1q_2', 'D_2', 'R_a_2', 'K_delta_2', 'K_sec_2', 'K_a_2', 'K_ai_2', 'T_r_2', 'V_min_2', 'V_max_2', 'K_aw_2', 'Droop_2', 'T_gov_1_2', 'T_gov_2_2', 'T_gov_3_2', 'K_imw_2', 'omega_ref_2', 'T_wo_2', 'T_1_2', 'T_2_2', 'K_stab_2', 'V_lim_2', 'S_n_3', 'Omega_b_3', 'H_3', 'T1d0_3', 'T1q0_3', 'X_d_3', 'X_q_3', 'X1d_3', 'X1q_3', 'D_3', 'R_a_3', 'K_delta_3', 'K_sec_3', 'K_a_3', 'K_ai_3', 'T_r_3', 'V_min_3', 'V_max_3', 'K_aw_3', 'Droop_3', 'T_gov_1_3', 'T_gov_2_3', 'T_gov_3_3', 'K_imw_3', 'omega_ref_3', 'T_wo_3', 'T_1_3', 'T_2_3', 'K_stab_3', 'V_lim_3', 'S_n_4', 'Omega_b_4', 'H_4', 'T1d0_4', 'T1q0_4', 'X_d_4', 'X_q_4', 'X1d_4', 'X1q_4', 'D_4', 'R_a_4', 'K_delta_4', 'K_sec_4', 'K_a_4', 'K_ai_4', 'T_r_4', 'V_min_4', 'V_max_4', 'K_aw_4', 'Droop_4', 'T_gov_1_4', 'T_gov_2_4', 'T_gov_3_4', 'K_imw_4', 'omega_ref_4', 'T_wo_4', 'T_1_4', 'T_2_4', 'K_stab_4', 'V_lim_4', 'K_p_agc', 'K_i_agc']
self.params_values_list = [100000000.0, 0.0, -60.0, 0.0, 0.0, -60.0, 0.0, 0.0, -60.0, 0.0, 0.0, -60.0, 0.0, 3.96039603960396, -39.603960396039604, 0.027772499999999995, 9.900990099009901, -99.00990099009901, 0.011108999999999999, 0.9000900090008999, -9.000900090009, 0.12219899999999999, 0.9000900090008999, -9.000900090009, 0.12219899999999999, 9.900990099009901, -99.00990099009901, 0.011108999999999999, 3.96039603960396, -39.603960396039604, 0.027772499999999995, 20000.0, 20000.0, 20000.0, 20000.0, 230000.0, 230000.0, 230000.0, 230000.0, 230000.0, 230000.0, 230000.0, 900000000.0, 314.1592653589793, 6.5, 8.0, 0.4, 1.8, 1.7, 0.3, 0.55, 1.0, 0.0025, 0.001, 0.0, 300, 1e-06, 0.02, -10000.0, 5.0, 10, 0.05, 1.0, 2.0, 10.0, 0.01, 1.0, 10.0, 0.1, 0.1, 1.0, 0.1, 900000000.0, 314.1592653589793, 6.5, 8.0, 0.4, 1.8, 1.7, 0.3, 0.55, 1.0, 0.0025, 0.0, 0.0, 300, 1e-06, 0.02, -10000.0, 5.0, 10, 0.05, 1.0, 2.0, 10.0, 0.01, 1.0, 10.0, 0.1, 0.1, 1.0, 0.1, 900000000.0, 314.1592653589793, 6.175, 8.0, 0.4, 1.8, 1.7, 0.3, 0.55, 1.0, 0.0025, 0.0, 0.01, 300, 1e-06, 0.02, -10000.0, 5.0, 10, 0.05, 1.0, 2.0, 10.0, 0.0, 1.0, 10.0, 0.1, 0.1, 1.0, 0.1, 900000000.0, 314.1592653589793, 6.175, 8.0, 0.4, 1.8, 1.7, 0.3, 0.55, 1.0, 0.0025, 0.0, 0.0, 300, 1e-06, 0.02, -10000.0, 5.0, 10, 0.05, 1.0, 2.0, 10.0, 0.01, 1.0, 10.0, 0.1, 0.1, 1.0, 0.1, 0.01, 0.01]
self.inputs_ini_list = ['P_1', 'Q_1', 'P_2', 'Q_2', 'P_3', 'Q_3', 'P_4', 'Q_4', 'P_5', 'Q_5', 'P_6', 'Q_6', 'P_7', 'Q_7', 'P_8', 'Q_8', 'P_9', 'Q_9', 'P_10', 'Q_10', 'P_11', 'Q_11', 'v_ref_1', 'v_pss_1', 'p_c_1', 'p_r_1', 'v_ref_2', 'v_pss_2', 'p_c_2', 'p_r_2', 'v_ref_3', 'v_pss_3', 'p_c_3', 'p_r_3', 'v_ref_4', 'v_pss_4', 'p_c_4', 'p_r_4']
self.inputs_ini_values_list = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -967000000.0, 100000000.0, 0.0, 0.0, -1767000000.0, 250000000.0, 0.0, 0.0, 0.0, 0.0, 1.03, 0.0, 0.778, 0.0, 1.01, 0.0, 0.778, 0.0, 1.03, 0.0, 0.778, 0.0, 1.01, 0.0, 0.778, 0.0]
self.inputs_run_list = ['P_1', 'Q_1', 'P_2', 'Q_2', 'P_3', 'Q_3', 'P_4', 'Q_4', 'P_5', 'Q_5', 'P_6', 'Q_6', 'P_7', 'Q_7', 'P_8', 'Q_8', 'P_9', 'Q_9', 'P_10', 'Q_10', 'P_11', 'Q_11', 'v_ref_1', 'v_pss_1', 'p_c_1', 'p_r_1', 'v_ref_2', 'v_pss_2', 'p_c_2', 'p_r_2', 'v_ref_3', 'v_pss_3', 'p_c_3', 'p_r_3', 'v_ref_4', 'v_pss_4', 'p_c_4', 'p_r_4']
self.inputs_run_values_list = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -967000000.0, 100000000.0, 0.0, 0.0, -1767000000.0, 250000000.0, 0.0, 0.0, 0.0, 0.0, 1.03, 0.0, 0.778, 0.0, 1.01, 0.0, 0.778, 0.0, 1.03, 0.0, 0.778, 0.0, 1.01, 0.0, 0.778, 0.0]
self.outputs_list = ['V_1', 'V_2', 'V_3', 'V_4', 'V_5', 'V_6', 'V_7', 'V_8', 'V_9', 'V_10', 'V_11', 'p_e_1', 'p_e_2', 'p_e_3', 'p_e_4']
self.x_list = ['delta_1', 'omega_1', 'e1q_1', 'e1d_1', 'v_c_1', 'xi_v_1', 'x_gov_1_1', 'x_gov_2_1', 'xi_imw_1', 'x_wo_1', 'x_lead_1', 'delta_2', 'omega_2', 'e1q_2', 'e1d_2', 'v_c_2', 'xi_v_2', 'x_gov_1_2', 'x_gov_2_2', 'xi_imw_2', 'x_wo_2', 'x_lead_2', 'delta_3', 'omega_3', 'e1q_3', 'e1d_3', 'v_c_3', 'xi_v_3', 'x_gov_1_3', 'x_gov_2_3', 'xi_imw_3', 'x_wo_3', 'x_lead_3', 'delta_4', 'omega_4', 'e1q_4', 'e1d_4', 'v_c_4', 'xi_v_4', 'x_gov_1_4', 'x_gov_2_4', 'xi_imw_4', 'x_wo_4', 'x_lead_4', 'xi_freq']
self.y_run_list = ['V_1', 'theta_1', 'V_2', 'theta_2', 'V_3', 'theta_3', 'V_4', 'theta_4', 'V_5', 'theta_5', 'V_6', 'theta_6', 'V_7', 'theta_7', 'V_8', 'theta_8', 'V_9', 'theta_9', 'V_10', 'theta_10', 'V_11', 'theta_11', 'i_d_1', 'i_q_1', 'p_g_1', 'q_g_1', 'v_f_1', 'p_m_ref_1', 'p_m_1', 'z_wo_1', 'v_pss_1', 'i_d_2', 'i_q_2', 'p_g_2', 'q_g_2', 'v_f_2', 'p_m_ref_2', 'p_m_2', 'z_wo_2', 'v_pss_2', 'i_d_3', 'i_q_3', 'p_g_3', 'q_g_3', 'v_f_3', 'p_m_ref_3', 'p_m_3', 'z_wo_3', 'v_pss_3', 'i_d_4', 'i_q_4', 'p_g_4', 'q_g_4', 'v_f_4', 'p_m_ref_4', 'p_m_4', 'z_wo_4', 'v_pss_4', 'omega_coi', 'p_agc']
self.xy_list = self.x_list + self.y_run_list
self.y_ini_list = ['V_1', 'theta_1', 'V_2', 'theta_2', 'V_3', 'theta_3', 'V_4', 'theta_4', 'V_5', 'theta_5', 'V_6', 'theta_6', 'V_7', 'theta_7', 'V_8', 'theta_8', 'V_9', 'theta_9', 'V_10', 'theta_10', 'V_11', 'theta_11', 'i_d_1', 'i_q_1', 'p_g_1', 'q_g_1', 'v_f_1', 'p_m_ref_1', 'p_m_1', 'z_wo_1', 'v_pss_1', 'i_d_2', 'i_q_2', 'p_g_2', 'q_g_2', 'v_f_2', 'p_m_ref_2', 'p_m_2', 'z_wo_2', 'v_pss_2', 'i_d_3', 'i_q_3', 'p_g_3', 'q_g_3', 'v_f_3', 'p_m_ref_3', 'p_m_3', 'z_wo_3', 'v_pss_3', 'i_d_4', 'i_q_4', 'p_g_4', 'q_g_4', 'v_f_4', 'p_m_ref_4', 'p_m_4', 'z_wo_4', 'v_pss_4', 'omega_coi', 'p_agc']
self.xy_ini_list = self.x_list + self.y_ini_list
self.t = 0.0
self.it = 0
self.it_store = 0
self.xy_prev = np.zeros((self.N_x+self.N_y,1))
self.initialization_tol = 1e-6
self.N_u = len(self.inputs_run_list)
self.sopt_root_method='hybr'
self.sopt_root_jac=True
self.u_ini_list = self.inputs_ini_list
self.u_ini_values_list = self.inputs_ini_values_list
self.u_run_list = self.inputs_run_list
self.u_run_values_list = self.inputs_run_values_list
self.N_u = len(self.u_run_list)
self.u_ini = np.array(self.inputs_ini_values_list)
self.p = np.array(self.params_values_list)
self.xy_0 = np.zeros((self.N_x+self.N_y,))
self.xy = np.zeros((self.N_x+self.N_y,))
self.z = np.zeros((self.N_z,))
self.jac_run = np.zeros((self.N_x+self.N_y,self.N_x+self.N_y))
self.yini2urun = list(set(self.u_run_list).intersection(set(self.y_ini_list)))
self.uini2yrun = list(set(self.y_run_list).intersection(set(self.u_ini_list)))
self.Time = np.zeros(self.N_store)
self.X = np.zeros((self.N_store,self.N_x))
self.Y = np.zeros((self.N_store,self.N_y))
self.Z = np.zeros((self.N_store,self.N_z))
self.iters = np.zeros(self.N_store)
self.u_run = np.array(self.u_run_values_list)
def ss_ini(self):
xy_ini,it = sstate(self.xy_0,self.u_ini,self.p,self.N_x,self.N_y)
self.xy_ini = xy_ini
return xy_ini
def run(self,t_end,up_dict):
for item in up_dict:
self.set_value(item,up_dict[item])
t = self.t
p = self.p
it = self.it
it_store = self.it_store
xy = self.xy
u = self.u_run
t,it,it_store,xy = daesolver(t,t_end,it,it_store,xy,u,p,
self.Time,
self.X,
self.Y,
self.Z,
self.iters,
self.Dt,
self.N_x,
self.N_y,
self.N_z,
self.decimation,
max_it=50,itol=1e-8,store=1)
self.t = t
self.it = it
self.it_store = it_store
self.xy = xy
def post(self):
self.Time = self.Time[:self.it_store]
self.X = self.X[:self.it_store]
self.Y = self.Y[:self.it_store]
self.Z = self.Z[:self.it_store]
def ini2run(self):
## y_ini to y_run
self.y_ini = self.xy_ini[self.N_x:]
self.y_run = np.copy(self.y_ini)
self.u_run = np.copy(self.u_ini)
## y_ini to u_run
for item in self.yini2urun:
self.u_run[self.u_run_list.index(item)] = self.y_ini[self.y_ini_list.index(item)]
## u_ini to y_run
for item in self.uini2yrun:
self.y_run[self.y_run_list.index(item)] = self.u_ini[self.u_ini_list.index(item)]
self.x = self.xy_ini[:self.N_x]
self.xy[:self.N_x] = self.x
self.xy[self.N_x:] = self.y_run
h_eval(self.z,self.x,self.y_run,self.u_ini,self.p)
def get_value(self,name):
if name in self.inputs_run_list:
value = self.u_run[self.inputs_run_list.index(name)]
return value
if name in self.x_list:
idx = self.x_list.index(name)
value = self.x[idx]
return value
if name in self.y_run_list:
idy = self.y_run_list.index(name)
value = self.y_run[idy]
return value
if name in self.params_list:
idp = self.params_list.index(name)
value = self.p[idp]
return value
if name in self.outputs_list:
idz = self.outputs_list.index(name)
value = self.z[idz]
return value
def set_value(self,name_,value):
if name_ in self.inputs_run_list:
self.u_run[self.inputs_run_list.index(name_)] = value
return
elif name_ in self.params_list:
self.p[self.params_list.index(name_)] = value
return
elif name_ in self.inputs_ini_list:
self.u_ini[self.inputs_ini_list.index(name_)] = value
return
else:
print(f'Input or parameter {name_} not found.')
def report_x(self,value_format='5.2f'):
for item in self.x_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_y(self,value_format='5.2f'):
for item in self.y_run_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_u(self,value_format='5.2f'):
for item in self.inputs_run_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_z(self,value_format='5.2f'):
for item in self.outputs_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_params(self,value_format='5.2f'):
for item in self.params_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def ini(self,up_dict,xy_0={}):
for item in up_dict:
self.set_value(item,up_dict[item])
if type(xy_0) == dict:
xy_0_dict = xy_0
self.dict2xy0(xy_0_dict)
if type(xy_0) == str:
if xy_0 == 'eval':
xy0_eval(self.xy_0[:self.N_x],self.xy_0[self.N_x:],self.u_ini,self.p)
else:
self.load_xy_0(file_name = xy_0)
self.xy_ini = self.ss_ini()
self.ini2run()
jac_run_eval(self.jac_run,self.x,self.y_run,self.u_run,self.p)
def dict2xy0(self,xy_0_dict):
for item in xy_0_dict:
if item in self.x_list:
self.xy_0[self.x_list.index(item)] = xy_0_dict[item]
if item in self.y_ini_list:
self.xy_0[self.y_ini_list.index(item) + self.N_x] = xy_0_dict[item]
def save_xy_0(self,file_name = 'xy_0.json'):
xy_0_dict = {}
for item in self.x_list:
xy_0_dict.update({item:self.get_value(item)})
for item in self.y_ini_list:
xy_0_dict.update({item:self.get_value(item)})
xy_0_str = json.dumps(xy_0_dict, indent=4)
with open(file_name,'w') as fobj:
fobj.write(xy_0_str)
def load_xy_0(self,file_name = 'xy_0.json'):
with open(file_name) as fobj:
xy_0_str = fobj.read()
xy_0_dict = json.loads(xy_0_str)
for item in xy_0_dict:
if item in self.x_list:
self.xy_0[self.x_list.index(item)] = xy_0_dict[item]
if item in self.y_ini_list:
self.xy_0[self.y_ini_list.index(item)+self.N_x] = xy_0_dict[item]
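# The get_value/set_value helpers above resolve names through a plain
# name -> index lookup over parallel lists and arrays. A minimal
# standalone sketch of that pattern (the names and values below are
# illustrative, not taken from the k12p6 model):
import numpy as np

_params_list = ['S_base', 'H_1', 'K_a_1']
_p = np.array([100.0e6, 6.5, 300.0])

def _get_param(name):
    # list.index raises ValueError for unknown names.
    return _p[_params_list.index(name)]

def _set_param(name, value):
    _p[_params_list.index(name)] = value

_set_param('H_1', 7.0)  # same mechanism as set_value('H_1', 7.0)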
@numba.njit(cache=True)
def sstate(xy,u,p,N_x,N_y,max_it=50,tol=1e-8):
jac_ini_ss = np.zeros((N_x+N_y,N_x+N_y),dtype=np.float64)
fg = np.zeros((N_x+N_y,),dtype=np.float64)
x = xy[:N_x]
y = xy[N_x:]
f = fg[:N_x]
g = fg[N_x:]
for it in range(max_it):
jac_ini_ss_eval(jac_ini_ss,x,y,u,p)
f_ini_eval(f,x,y,u,p)
g_ini_eval(g,x,y,u,p)
fg[:N_x] = f
fg[N_x:] = g
xy += np.linalg.solve(jac_ini_ss,-fg)
if np.max(np.abs(fg))<tol: break
return xy,it
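# sstate above is a plain Newton iteration on the stacked residual
# fg(xy) = 0 with the full Jacobian: xy += solve(jac, -fg). A minimal
# standalone sketch of the same update on the scalar equation
# f(x) = x**2 - 2 = 0 (illustrative only):
def _newton_scalar(x0, max_it=50, tol=1e-8):
    x = x0
    it = 0
    for it in range(max_it):
        f = x**2 - 2.0        # residual
        jac = 2.0*x           # df/dx
        x += -f/jac           # Newton step, as in xy += solve(jac_ini_ss,-fg)
        if abs(f) < tol:      # same stopping test as sstate
            break
    return x, it

_x_ss, _its = _newton_scalar(1.0)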
@numba.njit(cache=True)
def daesolver(t,t_end,it,it_store,xy,u,p,T,X,Y,Z,iters,Dt,N_x,N_y,N_z,decimation,max_it=50,itol=1e-8,store=1):
jac_trap = np.zeros((N_x+N_y,N_x+N_y),dtype=np.float64)
fg = np.zeros((N_x+N_y,),dtype=np.float64)
fg_i = np.zeros((N_x+N_y),dtype=np.float64)
x = xy[:N_x]
y = xy[N_x:]
f = fg[:N_x]
g = fg[N_x:]
h = np.zeros((N_z),dtype=np.float64)
if it == 0:
f_eval(f,x,y,u,p)
h_eval(h,x,y,u,p)
it_store = 0
T[0] = t
X[0,:] = x
Y[0,:] = y
Z[0,:] = h
while t<t_end:
it += 1
t += Dt
f_eval(f,x,y,u,p)
g_eval(g,x,y,u,p)
x_0 = np.copy(x)
y_0 = np.copy(y)
f_0 = np.copy(f)
g_0 = np.copy(g)
for iti in range(max_it):
f_eval(f,x,y,u,p)
g_eval(g,x,y,u,p)
jac_trap_eval(jac_trap,x,y,u,p,Dt)
f_n_i = x - x_0 - 0.5*Dt*(f+f_0)
fg_i[:N_x] = f_n_i
fg_i[N_x:] = g
Dxy_i = np.linalg.solve(-jac_trap,fg_i)
x = x + Dxy_i[:N_x]
y = y + Dxy_i[N_x:]
# iteration stop
max_relative = 0.0
for it_var in range(N_x+N_y):
abs_value = np.abs(xy[it_var])
if abs_value < 0.001:
abs_value = 0.001
relative_error = np.abs(Dxy_i[it_var])/abs_value
if relative_error > max_relative: max_relative = relative_error
if max_relative<itol:
break
h_eval(h,x,y,u,p)
xy[:N_x] = x
xy[N_x:] = y
# store in channels
if store == 1:
if it >= it_store*decimation:
T[it_store+1] = t
X[it_store+1,:] = x
Y[it_store+1,:] = y
Z[it_store+1,:] = h
iters[it_store+1] = iti
it_store += 1
return t,it,it_store,xy
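# daesolver above advances the states with the implicit trapezoidal
# rule, solving x_n1 - x_n - Dt/2*(f(x_n1) + f(x_n)) = 0 by Newton
# iterations at every time step. A minimal standalone sketch of one
# such step on the scalar test ODE dx/dt = -a*x (illustrative only):
def _trapezoidal_step(x0, a, Dt, max_it=50, itol=1e-10):
    f0 = -a*x0                            # f at the start of the step
    x = x0                                # initial guess
    for _ in range(max_it):
        res = x - x0 - 0.5*Dt*(-a*x + f0) # trapezoidal residual
        jac = 1.0 + 0.5*Dt*a              # d(res)/dx
        Dx = -res/jac                     # Newton update
        x += Dx
        if abs(Dx) < itol:
            break
    return x

# For dx/dt = -x this reproduces the A-stable trapezoidal update
# (1 - Dt/2)/(1 + Dt/2):
_x1 = _trapezoidal_step(1.0, 1.0, 0.1)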
@numba.njit(cache=True)
def f_ini_eval(f_ini,x,y,u,p,xyup = 0):
f_ini[0] = -p[53]*x[0] + p[43]*(x[1] - y[58])
f_ini[1] = (-p[51]*(x[1] - y[58]) - y[22]*(p[52]*y[22] + y[0]*sin(x[0] - y[1])) - y[23]*(p[52]*y[23] + y[0]*cos(x[0] - y[1])) + y[28])/(2*p[44])
f_ini[2] = (-x[2] - y[22]*(-p[49] + p[47]) + y[26])/p[45]
f_ini[3] = (-x[3] + y[23]*(-p[50] + p[48]))/p[46]
f_ini[4] = (y[0] - x[4])/p[57]
f_ini[5] = -p[60]*(p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5] - y[26]) - x[4] + y[30] + u[22]
f_ini[6] = (y[27] - x[6])/p[62]
f_ini[7] = (x[6] - x[7])/p[64]
f_ini[8] = p[65]*(u[24] - y[24]) - 1.0e-6*x[8]
f_ini[9] = (x[1] - x[9] - 1.0)/p[67]
f_ini[10] = (-x[10] + y[29])/p[69]
f_ini[11] = -p[83]*x[11] + p[73]*(x[12] - y[58])
f_ini[12] = (-p[81]*(x[12] - y[58]) - y[31]*(p[82]*y[31] + y[2]*sin(x[11] - y[3])) - y[32]*(p[82]*y[32] + y[2]*cos(x[11] - y[3])) + y[37])/(2*p[74])
f_ini[13] = (-x[13] - y[31]*(-p[79] + p[77]) + y[35])/p[75]
f_ini[14] = (-x[14] + y[32]*(-p[80] + p[78]))/p[76]
f_ini[15] = (y[2] - x[15])/p[87]
f_ini[16] = -p[90]*(p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16] - y[35]) - x[15] + y[39] + u[26]
f_ini[17] = (y[36] - x[17])/p[92]
f_ini[18] = (x[17] - x[18])/p[94]
f_ini[19] = p[95]*(u[28] - y[33]) - 1.0e-6*x[19]
f_ini[20] = (x[12] - x[20] - 1.0)/p[97]
f_ini[21] = (-x[21] + y[38])/p[99]
f_ini[22] = -p[113]*x[22] + p[103]*(x[23] - y[58])
f_ini[23] = (-p[111]*(x[23] - y[58]) - y[40]*(p[112]*y[40] + y[4]*sin(x[22] - y[5])) - y[41]*(p[112]*y[41] + y[4]*cos(x[22] - y[5])) + y[46])/(2*p[104])
f_ini[24] = (-x[24] - y[40]*(-p[109] + p[107]) + y[44])/p[105]
f_ini[25] = (-x[25] + y[41]*(-p[110] + p[108]))/p[106]
f_ini[26] = (y[4] - x[26])/p[117]
f_ini[27] = -p[120]*(p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27] - y[44]) - x[26] + y[48] + u[30]
f_ini[28] = (y[45] - x[28])/p[122]
f_ini[29] = (x[28] - x[29])/p[124]
f_ini[30] = p[125]*(u[32] - y[42]) - 1.0e-6*x[30]
f_ini[31] = (x[23] - x[31] - 1.0)/p[127]
f_ini[32] = (-x[32] + y[47])/p[129]
f_ini[33] = -p[143]*x[33] + p[133]*(x[34] - y[58])
f_ini[34] = (-p[141]*(x[34] - y[58]) - y[49]*(p[142]*y[49] + y[6]*sin(x[33] - y[7])) - y[50]*(p[142]*y[50] + y[6]*cos(x[33] - y[7])) + y[55])/(2*p[134])
f_ini[35] = (-x[35] - y[49]*(-p[139] + p[137]) + y[53])/p[135]
f_ini[36] = (-x[36] + y[50]*(-p[140] + p[138]))/p[136]
f_ini[37] = (y[6] - x[37])/p[147]
f_ini[38] = -p[150]*(p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38] - y[53]) - x[37] + y[57] + u[34]
f_ini[39] = (y[54] - x[39])/p[152]
f_ini[40] = (x[39] - x[40])/p[154]
f_ini[41] = p[155]*(u[36] - y[51]) - 1.0e-6*x[41]
f_ini[42] = (x[34] - x[42] - 1.0)/p[157]
f_ini[43] = (-x[43] + y[56])/p[159]
f_ini[44] = 1 - y[58]
@numba.njit(cache=True)
def f_run_eval(f_run,x,y,u,p,xyup = 0):
f_run[0] = -p[53]*x[0] + p[43]*(x[1] - y[58])
f_run[1] = (-p[51]*(x[1] - y[58]) - y[22]*(p[52]*y[22] + y[0]*sin(x[0] - y[1])) - y[23]*(p[52]*y[23] + y[0]*cos(x[0] - y[1])) + y[28])/(2*p[44])
f_run[2] = (-x[2] - y[22]*(-p[49] + p[47]) + y[26])/p[45]
f_run[3] = (-x[3] + y[23]*(-p[50] + p[48]))/p[46]
f_run[4] = (y[0] - x[4])/p[57]
f_run[5] = -p[60]*(p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5] - y[26]) - x[4] + y[30] + u[22]
f_run[6] = (y[27] - x[6])/p[62]
f_run[7] = (x[6] - x[7])/p[64]
f_run[8] = p[65]*(u[24] - y[24]) - 1.0e-6*x[8]
f_run[9] = (x[1] - x[9] - 1.0)/p[67]
f_run[10] = (-x[10] + y[29])/p[69]
f_run[11] = -p[83]*x[11] + p[73]*(x[12] - y[58])
f_run[12] = (-p[81]*(x[12] - y[58]) - y[31]*(p[82]*y[31] + y[2]*sin(x[11] - y[3])) - y[32]*(p[82]*y[32] + y[2]*cos(x[11] - y[3])) + y[37])/(2*p[74])
f_run[13] = (-x[13] - y[31]*(-p[79] + p[77]) + y[35])/p[75]
f_run[14] = (-x[14] + y[32]*(-p[80] + p[78]))/p[76]
f_run[15] = (y[2] - x[15])/p[87]
f_run[16] = -p[90]*(p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16] - y[35]) - x[15] + y[39] + u[26]
f_run[17] = (y[36] - x[17])/p[92]
f_run[18] = (x[17] - x[18])/p[94]
f_run[19] = p[95]*(u[28] - y[33]) - 1.0e-6*x[19]
f_run[20] = (x[12] - x[20] - 1.0)/p[97]
f_run[21] = (-x[21] + y[38])/p[99]
f_run[22] = -p[113]*x[22] + p[103]*(x[23] - y[58])
f_run[23] = (-p[111]*(x[23] - y[58]) - y[40]*(p[112]*y[40] + y[4]*sin(x[22] - y[5])) - y[41]*(p[112]*y[41] + y[4]*cos(x[22] - y[5])) + y[46])/(2*p[104])
f_run[24] = (-x[24] - y[40]*(-p[109] + p[107]) + y[44])/p[105]
f_run[25] = (-x[25] + y[41]*(-p[110] + p[108]))/p[106]
f_run[26] = (y[4] - x[26])/p[117]
f_run[27] = -p[120]*(p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27] - y[44]) - x[26] + y[48] + u[30]
f_run[28] = (y[45] - x[28])/p[122]
f_run[29] = (x[28] - x[29])/p[124]
f_run[30] = p[125]*(u[32] - y[42]) - 1.0e-6*x[30]
f_run[31] = (x[23] - x[31] - 1.0)/p[127]
f_run[32] = (-x[32] + y[47])/p[129]
f_run[33] = -p[143]*x[33] + p[133]*(x[34] - y[58])
f_run[34] = (-p[141]*(x[34] - y[58]) - y[49]*(p[142]*y[49] + y[6]*sin(x[33] - y[7])) - y[50]*(p[142]*y[50] + y[6]*cos(x[33] - y[7])) + y[55])/(2*p[134])
f_run[35] = (-x[35] - y[49]*(-p[139] + p[137]) + y[53])/p[135]
f_run[36] = (-x[36] + y[50]*(-p[140] + p[138]))/p[136]
f_run[37] = (y[6] - x[37])/p[147]
f_run[38] = -p[150]*(p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38] - y[53]) - x[37] + y[57] + u[34]
f_run[39] = (y[54] - x[39])/p[152]
f_run[40] = (x[39] - x[40])/p[154]
f_run[41] = p[155]*(u[36] - y[51]) - 1.0e-6*x[41]
f_run[42] = (x[34] - x[42] - 1.0)/p[157]
f_run[43] = (-x[43] + y[56])/p[159]
f_run[44] = 1 - y[58]
@numba.njit(cache=True)
def g_ini_eval(g_ini,x,y,u,p,xyup = 0):
g_ini[0] = -u[0]/p[0] + y[0]**2*p[1] + y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) - p[42]*y[24]/p[0]
g_ini[1] = -u[1]/p[0] + y[0]**2*(-p[2] - p[3]/2) + y[0]*y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9])) - p[42]*y[25]/p[0]
g_ini[2] = -u[2]/p[0] + y[2]**2*p[4] + y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) - p[72]*y[33]/p[0]
g_ini[3] = -u[3]/p[0] + y[2]**2*(-p[5] - p[6]/2) + y[2]*y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11])) - p[72]*y[34]/p[0]
g_ini[4] = -u[4]/p[0] + y[20]*y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5])) + y[4]**2*p[7] - p[102]*y[42]/p[0]
g_ini[5] = -u[5]/p[0] + y[20]*y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5])) + y[4]**2*(-p[8] - p[9]/2) - p[102]*y[43]/p[0]
g_ini[6] = -u[6]/p[0] + y[18]*y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[6]**2*p[10] - p[132]*y[51]/p[0]
g_ini[7] = -u[7]/p[0] + y[18]*y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + y[6]**2*(-p[11] - p[12]/2) - p[132]*y[52]/p[0]
g_ini[8] = -u[8]/p[0] + y[0]*y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + y[8]**2*(p[1] + p[13]) + y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
g_ini[9] = -u[9]/p[0] + y[0]*y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9])) + y[8]**2*(-p[2] - p[14] - p[3]/2 - p[15]/2) + y[8]*y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
g_ini[10] = -u[10]/p[0] + y[2]*y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + y[10]**2*(p[4] + p[13] + p[16]) + y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
g_ini[11] = -u[11]/p[0] + y[2]*y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11])) + y[8]*y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11])) + y[10]**2*(-p[5] - p[14] - p[17] - p[6]/2 - p[15]/2 - p[18]/2) + y[10]*y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
g_ini[12] = -u[12]/p[0] + y[10]*y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + y[12]**2*(p[16] + 2*p[19]) + y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
g_ini[13] = -u[13]/p[0] + y[10]*y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13])) + y[12]**2*(-p[17] - 2*p[20] - p[18]/2 - p[21]) + y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
g_ini[14] = -u[14]/p[0] + y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + y[14]**2*(2*p[19] + 2*p[22]) + y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
g_ini[15] = -u[15]/p[0] + y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15])) + y[14]**2*(-2*p[20] - 2*p[23] - p[21] - p[24]) + y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
g_ini[16] = -u[16]/p[0] + y[18]*y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17])) + y[16]**2*(2*p[22] + p[25])
g_ini[17] = -u[17]/p[0] + y[18]*y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17])) + y[16]**2*(-2*p[23] - p[26] - p[24] - p[27]/2)
g_ini[18] = -u[18]/p[0] + y[18]**2*(p[28] + p[10] + p[25]) + y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
g_ini[19] = -u[19]/p[0] + y[18]**2*(-p[29] - p[11] - p[26] - p[30]/2 - p[12]/2 - p[27]/2) + y[18]*y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[18]*y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7])) + y[18]*y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
g_ini[20] = -u[20]/p[0] + y[18]*y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[20]**2*(p[28] + p[7]) + y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
g_ini[21] = -u[21]/p[0] + y[18]*y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + y[20]**2*(-p[29] - p[8] - p[30]/2 - p[9]/2) + y[20]*y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
g_ini[22] = p[52]*y[23] + y[0]*cos(x[0] - y[1]) + p[49]*y[22] - x[2]
g_ini[23] = p[52]*y[22] + y[0]*sin(x[0] - y[1]) - p[50]*y[23] - x[3]
g_ini[24] = y[0]*y[22]*sin(x[0] - y[1]) + y[0]*y[23]*cos(x[0] - y[1]) - y[24]
g_ini[25] = y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1]) - y[25]
g_ini[26] = -y[26] + Piecewise((p[58], p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]), (p[59], p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]), (p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5], True))
g_ini[27] = p[54]*y[59] - y[27] + u[25] + x[8] - (x[1] - p[66])/p[61]
g_ini[28] = p[63]*(x[6] - x[7])/p[64] - y[28] + x[7]
g_ini[29] = x[1] - x[9] - y[29] - 1.0
g_ini[30] = -y[30] + Piecewise((-p[71], p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])), (p[71], p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])), (p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]), True))
g_ini[31] = p[82]*y[32] + y[2]*cos(x[11] - y[3]) + p[79]*y[31] - x[13]
g_ini[32] = p[82]*y[31] + y[2]*sin(x[11] - y[3]) - p[80]*y[32] - x[14]
g_ini[33] = y[2]*y[31]*sin(x[11] - y[3]) + y[2]*y[32]*cos(x[11] - y[3]) - y[33]
g_ini[34] = y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3]) - y[34]
g_ini[35] = -y[35] + Piecewise((p[88], p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]), (p[89], p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]), (p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16], True))
g_ini[36] = p[84]*y[59] - y[36] + u[29] + x[19] - (x[12] - p[96])/p[91]
g_ini[37] = p[93]*(x[17] - x[18])/p[94] - y[37] + x[18]
g_ini[38] = x[12] - x[20] - y[38] - 1.0
g_ini[39] = -y[39] + Piecewise((-p[101], p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])), (p[101], p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])), (p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]), True))
g_ini[40] = p[112]*y[41] + y[4]*cos(x[22] - y[5]) + p[109]*y[40] - x[24]
g_ini[41] = p[112]*y[40] + y[4]*sin(x[22] - y[5]) - p[110]*y[41] - x[25]
g_ini[42] = y[4]*y[40]*sin(x[22] - y[5]) + y[4]*y[41]*cos(x[22] - y[5]) - y[42]
g_ini[43] = y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5]) - y[43]
g_ini[44] = -y[44] + Piecewise((p[118], p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]), (p[119], p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]), (p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27], True))
g_ini[45] = p[114]*y[59] - y[45] + u[33] + x[30] - (x[23] - p[126])/p[121]
g_ini[46] = p[123]*(x[28] - x[29])/p[124] - y[46] + x[29]
g_ini[47] = x[23] - x[31] - y[47] - 1.0
g_ini[48] = -y[48] + Piecewise((-p[131], p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])), (p[131], p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])), (p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]), True))
g_ini[49] = p[142]*y[50] + y[6]*cos(x[33] - y[7]) + p[139]*y[49] - x[35]
g_ini[50] = p[142]*y[49] + y[6]*sin(x[33] - y[7]) - p[140]*y[50] - x[36]
g_ini[51] = y[6]*y[49]*sin(x[33] - y[7]) + y[6]*y[50]*cos(x[33] - y[7]) - y[51]
g_ini[52] = y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7]) - y[52]
g_ini[53] = -y[53] + Piecewise((p[148], p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]), (p[149], p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]), (p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38], True))
g_ini[54] = p[144]*y[59] - y[54] + u[37] + x[41] - (x[34] - p[156])/p[151]
g_ini[55] = p[153]*(x[39] - x[40])/p[154] - y[55] + x[40]
g_ini[56] = x[34] - x[42] - y[56] - 1.0
g_ini[57] = -y[57] + Piecewise((-p[161], p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])), (p[161], p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])), (p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]), True))
g_ini[58] = -y[58] + (p[44]*p[42]*x[1] + p[74]*p[72]*x[12] + p[104]*p[102]*x[23] + p[134]*p[132]*x[34])/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
g_ini[59] = p[163]*x[44] + p[162]*(1 - y[58]) - y[59]
@numba.njit(cache=True)
def g_run_eval(g_run,x,y,u,p,xyup = 0):
g_run[0] = -u[0]/p[0] + y[0]**2*p[1] + y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) - p[42]*y[24]/p[0]
g_run[1] = -u[1]/p[0] + y[0]**2*(-p[2] - p[3]/2) + y[0]*y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9])) - p[42]*y[25]/p[0]
g_run[2] = -u[2]/p[0] + y[2]**2*p[4] + y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) - p[72]*y[33]/p[0]
g_run[3] = -u[3]/p[0] + y[2]**2*(-p[5] - p[6]/2) + y[2]*y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11])) - p[72]*y[34]/p[0]
g_run[4] = -u[4]/p[0] + y[20]*y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5])) + y[4]**2*p[7] - p[102]*y[42]/p[0]
g_run[5] = -u[5]/p[0] + y[20]*y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5])) + y[4]**2*(-p[8] - p[9]/2) - p[102]*y[43]/p[0]
g_run[6] = -u[6]/p[0] + y[18]*y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[6]**2*p[10] - p[132]*y[51]/p[0]
g_run[7] = -u[7]/p[0] + y[18]*y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + y[6]**2*(-p[11] - p[12]/2) - p[132]*y[52]/p[0]
g_run[8] = -u[8]/p[0] + y[0]*y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + y[8]**2*(p[1] + p[13]) + y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
g_run[9] = -u[9]/p[0] + y[0]*y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9])) + y[8]**2*(-p[2] - p[14] - p[3]/2 - p[15]/2) + y[8]*y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
g_run[10] = -u[10]/p[0] + y[2]*y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + y[10]**2*(p[4] + p[13] + p[16]) + y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
g_run[11] = -u[11]/p[0] + y[2]*y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11])) + y[8]*y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11])) + y[10]**2*(-p[5] - p[14] - p[17] - p[6]/2 - p[15]/2 - p[18]/2) + y[10]*y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
g_run[12] = -u[12]/p[0] + y[10]*y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + y[12]**2*(p[16] + 2*p[19]) + y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
g_run[13] = -u[13]/p[0] + y[10]*y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13])) + y[12]**2*(-p[17] - 2*p[20] - p[18]/2 - p[21]) + y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
g_run[14] = -u[14]/p[0] + y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + y[14]**2*(2*p[19] + 2*p[22]) + y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
g_run[15] = -u[15]/p[0] + y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15])) + y[14]**2*(-2*p[20] - 2*p[23] - p[21] - p[24]) + y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
g_run[16] = -u[16]/p[0] + y[18]*y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17])) + y[16]**2*(2*p[22] + p[25])
g_run[17] = -u[17]/p[0] + y[18]*y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17])) + y[16]**2*(-2*p[23] - p[26] - p[24] - p[27]/2)
g_run[18] = -u[18]/p[0] + y[18]**2*(p[28] + p[10] + p[25]) + y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
g_run[19] = -u[19]/p[0] + y[18]**2*(-p[29] - p[11] - p[26] - p[30]/2 - p[12]/2 - p[27]/2) + y[18]*y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[18]*y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7])) + y[18]*y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
g_run[20] = -u[20]/p[0] + y[18]*y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[20]**2*(p[28] + p[7]) + y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
g_run[21] = -u[21]/p[0] + y[18]*y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + y[20]**2*(-p[29] - p[8] - p[30]/2 - p[9]/2) + y[20]*y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
g_run[22] = p[52]*y[23] + y[0]*cos(x[0] - y[1]) + p[49]*y[22] - x[2]
g_run[23] = p[52]*y[22] + y[0]*sin(x[0] - y[1]) - p[50]*y[23] - x[3]
g_run[24] = y[0]*y[22]*sin(x[0] - y[1]) + y[0]*y[23]*cos(x[0] - y[1]) - y[24]
g_run[25] = y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1]) - y[25]
g_run[26] = -y[26] + Piecewise((p[58], p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]), (p[59], p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]), (p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5], True))
g_run[27] = p[54]*y[59] - y[27] + u[25] + x[8] - (x[1] - p[66])/p[61]
g_run[28] = p[63]*(x[6] - x[7])/p[64] - y[28] + x[7]
g_run[29] = x[1] - x[9] - y[29] - 1.0
g_run[30] = -y[30] + Piecewise((-p[71], p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])), (p[71], p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])), (p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]), True))
g_run[31] = p[82]*y[32] + y[2]*cos(x[11] - y[3]) + p[79]*y[31] - x[13]
g_run[32] = p[82]*y[31] + y[2]*sin(x[11] - y[3]) - p[80]*y[32] - x[14]
g_run[33] = y[2]*y[31]*sin(x[11] - y[3]) + y[2]*y[32]*cos(x[11] - y[3]) - y[33]
g_run[34] = y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3]) - y[34]
g_run[35] = -y[35] + Piecewise((p[88], p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]), (p[89], p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]), (p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16], True))
g_run[36] = p[84]*y[59] - y[36] + u[29] + x[19] - (x[12] - p[96])/p[91]
g_run[37] = p[93]*(x[17] - x[18])/p[94] - y[37] + x[18]
g_run[38] = x[12] - x[20] - y[38] - 1.0
g_run[39] = -y[39] + Piecewise((-p[101], p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])), (p[101], p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])), (p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]), True))
g_run[40] = p[112]*y[41] + y[4]*cos(x[22] - y[5]) + p[109]*y[40] - x[24]
g_run[41] = p[112]*y[40] + y[4]*sin(x[22] - y[5]) - p[110]*y[41] - x[25]
g_run[42] = y[4]*y[40]*sin(x[22] - y[5]) + y[4]*y[41]*cos(x[22] - y[5]) - y[42]
g_run[43] = y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5]) - y[43]
g_run[44] = -y[44] + Piecewise((p[118], p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]), (p[119], p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]), (p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27], True))
g_run[45] = p[114]*y[59] - y[45] + u[33] + x[30] - (x[23] - p[126])/p[121]
g_run[46] = p[123]*(x[28] - x[29])/p[124] - y[46] + x[29]
g_run[47] = x[23] - x[31] - y[47] - 1.0
g_run[48] = -y[48] + Piecewise((-p[131], p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])), (p[131], p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])), (p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]), True))
g_run[49] = p[142]*y[50] + y[6]*cos(x[33] - y[7]) + p[139]*y[49] - x[35]
g_run[50] = p[142]*y[49] + y[6]*sin(x[33] - y[7]) - p[140]*y[50] - x[36]
g_run[51] = y[6]*y[49]*sin(x[33] - y[7]) + y[6]*y[50]*cos(x[33] - y[7]) - y[51]
g_run[52] = y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7]) - y[52]
g_run[53] = -y[53] + Piecewise((p[148], p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]), (p[149], p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]), (p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38], True))
g_run[54] = p[144]*y[59] - y[54] + u[37] + x[41] - (x[34] - p[156])/p[151]
g_run[55] = p[153]*(x[39] - x[40])/p[154] - y[55] + x[40]
g_run[56] = x[34] - x[42] - y[56] - 1.0
g_run[57] = -y[57] + Piecewise((-p[161], p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])), (p[161], p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])), (p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]), True))
g_run[58] = -y[58] + (p[44]*p[42]*x[1] + p[74]*p[72]*x[12] + p[104]*p[102]*x[23] + p[134]*p[132]*x[34])/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
g_run[59] = p[163]*x[44] + p[162]*(1 - y[58]) - y[59]
# Jacobian of the initialization/steady-state problem. State-dependent
# entries are always refreshed; constant entries are filled when xyup == 1.
@numba.njit(cache=True)
def jac_ini_ss_eval(jac_ini,x,y,u,p,xyup = 0):
jac_ini[1,0] = (-y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1]))/(2*p[44])
jac_ini[1,45] = (-y[22]*sin(x[0] - y[1]) - y[23]*cos(x[0] - y[1]))/(2*p[44])
jac_ini[1,46] = (y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1]))/(2*p[44])
jac_ini[1,67] = (-2*p[52]*y[22] - y[0]*sin(x[0] - y[1]))/(2*p[44])
jac_ini[1,68] = (-2*p[52]*y[23] - y[0]*cos(x[0] - y[1]))/(2*p[44])
jac_ini[12,11] = (-y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3]))/(2*p[74])
jac_ini[12,47] = (-y[31]*sin(x[11] - y[3]) - y[32]*cos(x[11] - y[3]))/(2*p[74])
jac_ini[12,48] = (y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3]))/(2*p[74])
jac_ini[12,76] = (-2*p[82]*y[31] - y[2]*sin(x[11] - y[3]))/(2*p[74])
jac_ini[12,77] = (-2*p[82]*y[32] - y[2]*cos(x[11] - y[3]))/(2*p[74])
jac_ini[23,22] = (-y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5]))/(2*p[104])
jac_ini[23,49] = (-y[40]*sin(x[22] - y[5]) - y[41]*cos(x[22] - y[5]))/(2*p[104])
jac_ini[23,50] = (y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5]))/(2*p[104])
jac_ini[23,85] = (-2*p[112]*y[40] - y[4]*sin(x[22] - y[5]))/(2*p[104])
jac_ini[23,86] = (-2*p[112]*y[41] - y[4]*cos(x[22] - y[5]))/(2*p[104])
jac_ini[34,33] = (-y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7]))/(2*p[134])
jac_ini[34,51] = (-y[49]*sin(x[33] - y[7]) - y[50]*cos(x[33] - y[7]))/(2*p[134])
jac_ini[34,52] = (y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7]))/(2*p[134])
jac_ini[34,94] = (-2*p[142]*y[49] - y[6]*sin(x[33] - y[7]))/(2*p[134])
jac_ini[34,95] = (-2*p[142]*y[50] - y[6]*cos(x[33] - y[7]))/(2*p[134])
jac_ini[45,45] = 2*y[0]*p[1] + y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_ini[45,46] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_ini[45,53] = y[0]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_ini[45,54] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_ini[46,45] = 2*y[0]*(-p[2] - p[3]/2) + y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_ini[46,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_ini[46,53] = y[0]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_ini[46,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_ini[47,47] = 2*y[2]*p[4] + y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_ini[47,48] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_ini[47,55] = y[2]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_ini[47,56] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_ini[48,47] = 2*y[2]*(-p[5] - p[6]/2) + y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_ini[48,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_ini[48,55] = y[2]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_ini[48,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_ini[49,49] = y[20]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5])) + 2*y[4]*p[7]
jac_ini[49,50] = y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_ini[49,65] = y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_ini[49,66] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_ini[50,49] = y[20]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5])) + 2*y[4]*(-p[8] - p[9]/2)
jac_ini[50,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_ini[50,65] = y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_ini[50,66] = y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_ini[51,51] = y[18]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + 2*y[6]*p[10]
jac_ini[51,52] = y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_ini[51,63] = y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_ini[51,64] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_ini[52,51] = y[18]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + 2*y[6]*(-p[11] - p[12]/2)
jac_ini[52,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_ini[52,63] = y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_ini[52,64] = y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_ini[53,45] = y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_ini[53,46] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_ini[53,53] = y[0]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + 2*y[8]*(p[1] + p[13]) + y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_ini[53,54] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_ini[53,55] = y[8]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_ini[53,56] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_ini[54,45] = y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_ini[54,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_ini[54,53] = y[0]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9])) + 2*y[8]*(-p[2] - p[14] - p[3]/2 - p[15]/2) + y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_ini[54,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_ini[54,55] = y[8]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_ini[54,56] = y[8]*y[10]*(p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_ini[55,47] = y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_ini[55,48] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_ini[55,53] = y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_ini[55,54] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_ini[55,55] = y[2]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + 2*y[10]*(p[4] + p[13] + p[16]) + y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_ini[55,56] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11])) + y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_ini[55,57] = y[10]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_ini[55,58] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_ini[56,47] = y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_ini[56,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_ini[56,53] = y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_ini[56,54] = y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_ini[56,55] = y[2]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11])) + y[8]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11])) + 2*y[10]*(-p[5] - p[14] - p[17] - p[6]/2 - p[15]/2 - p[18]/2) + y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_ini[56,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_ini[56,57] = y[10]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_ini[56,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_ini[57,55] = y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_ini[57,56] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_ini[57,57] = y[10]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + 2*y[12]*(p[16] + 2*p[19]) + y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_ini[57,58] = y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_ini[57,59] = y[12]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_ini[57,60] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_ini[58,55] = y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_ini[58,56] = y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_ini[58,57] = y[10]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13])) + 2*y[12]*(-p[17] - 2*p[20] - p[18]/2 - p[21]) + y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_ini[58,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_ini[58,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_ini[58,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_ini[59,57] = y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_ini[59,58] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_ini[59,59] = y[12]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + 2*y[14]*(2*p[19] + 2*p[22]) + y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_ini[59,60] = y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_ini[59,61] = y[14]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_ini[59,62] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_ini[60,57] = y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_ini[60,58] = y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_ini[60,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15])) + 2*y[14]*(-2*p[20] - 2*p[23] - p[21] - p[24]) + y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_ini[60,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_ini[60,61] = y[14]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_ini[60,62] = y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_ini[61,59] = y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_ini[61,60] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_ini[61,61] = y[18]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17])) + 2*y[16]*(2*p[22] + p[25])
jac_ini[61,62] = y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_ini[61,63] = y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_ini[61,64] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_ini[62,59] = y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_ini[62,60] = y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_ini[62,61] = y[18]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17])) + y[14]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17])) + 2*y[16]*(-2*p[23] - p[26] - p[24] - p[27]/2)
jac_ini[62,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_ini[62,63] = y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_ini[62,64] = y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_ini[63,51] = y[18]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_ini[63,52] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_ini[63,61] = y[18]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_ini[63,62] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_ini[63,63] = 2*y[18]*(p[28] + p[10] + p[25]) + y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_ini[63,64] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_ini[63,65] = y[18]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_ini[63,66] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_ini[64,51] = y[18]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_ini[64,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_ini[64,61] = y[18]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_ini[64,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_ini[64,63] = 2*y[18]*(-p[29] - p[11] - p[26] - p[30]/2 - p[12]/2 - p[27]/2) + y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7])) + y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_ini[64,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_ini[64,65] = y[18]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_ini[64,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_ini[65,49] = y[20]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_ini[65,50] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_ini[65,63] = y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_ini[65,64] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_ini[65,65] = y[18]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + 2*y[20]*(p[28] + p[7]) + y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_ini[65,66] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_ini[66,49] = y[20]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_ini[66,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_ini[66,63] = y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_ini[66,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_ini[66,65] = y[18]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + 2*y[20]*(-p[29] - p[8] - p[30]/2 - p[9]/2) + y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_ini[66,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_ini[67,0] = -y[0]*sin(x[0] - y[1])
jac_ini[67,45] = cos(x[0] - y[1])
jac_ini[67,46] = y[0]*sin(x[0] - y[1])
jac_ini[68,0] = y[0]*cos(x[0] - y[1])
jac_ini[68,45] = sin(x[0] - y[1])
jac_ini[68,46] = -y[0]*cos(x[0] - y[1])
jac_ini[69,0] = y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1])
jac_ini[69,45] = y[22]*sin(x[0] - y[1]) + y[23]*cos(x[0] - y[1])
jac_ini[69,46] = -y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1])
jac_ini[69,67] = y[0]*sin(x[0] - y[1])
jac_ini[69,68] = y[0]*cos(x[0] - y[1])
jac_ini[70,0] = -y[0]*y[22]*sin(x[0] - y[1]) - y[0]*y[23]*cos(x[0] - y[1])
jac_ini[70,45] = y[22]*cos(x[0] - y[1]) - y[23]*sin(x[0] - y[1])
jac_ini[70,46] = y[0]*y[22]*sin(x[0] - y[1]) + y[0]*y[23]*cos(x[0] - y[1])
jac_ini[70,67] = y[0]*cos(x[0] - y[1])
jac_ini[70,68] = -y[0]*sin(x[0] - y[1])
# Piecewise entries below are derivatives of the limited blocks: they are
# zero while an upper or lower limit is active, and the linear gain otherwise.
jac_ini[71,4] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (-p[55], True)]))
jac_ini[71,5] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[56], True)]))
jac_ini[71,75] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[55], True)]))
jac_ini[75,10] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*(-p[68]/p[69] + 1), True)]))
jac_ini[75,74] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*p[68]/p[69], True)]))
jac_ini[76,11] = -y[2]*sin(x[11] - y[3])
jac_ini[76,47] = cos(x[11] - y[3])
jac_ini[76,48] = y[2]*sin(x[11] - y[3])
jac_ini[77,11] = y[2]*cos(x[11] - y[3])
jac_ini[77,47] = sin(x[11] - y[3])
jac_ini[77,48] = -y[2]*cos(x[11] - y[3])
jac_ini[78,11] = y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3])
jac_ini[78,47] = y[31]*sin(x[11] - y[3]) + y[32]*cos(x[11] - y[3])
jac_ini[78,48] = -y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3])
jac_ini[78,76] = y[2]*sin(x[11] - y[3])
jac_ini[78,77] = y[2]*cos(x[11] - y[3])
jac_ini[79,11] = -y[2]*y[31]*sin(x[11] - y[3]) - y[2]*y[32]*cos(x[11] - y[3])
jac_ini[79,47] = y[31]*cos(x[11] - y[3]) - y[32]*sin(x[11] - y[3])
jac_ini[79,48] = y[2]*y[31]*sin(x[11] - y[3]) + y[2]*y[32]*cos(x[11] - y[3])
jac_ini[79,76] = y[2]*cos(x[11] - y[3])
jac_ini[79,77] = -y[2]*sin(x[11] - y[3])
jac_ini[80,15] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (-p[85], True)]))
jac_ini[80,16] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[86], True)]))
jac_ini[80,84] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[85], True)]))
jac_ini[84,21] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*(-p[98]/p[99] + 1), True)]))
jac_ini[84,83] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*p[98]/p[99], True)]))
jac_ini[85,22] = -y[4]*sin(x[22] - y[5])
jac_ini[85,49] = cos(x[22] - y[5])
jac_ini[85,50] = y[4]*sin(x[22] - y[5])
jac_ini[86,22] = y[4]*cos(x[22] - y[5])
jac_ini[86,49] = sin(x[22] - y[5])
jac_ini[86,50] = -y[4]*cos(x[22] - y[5])
jac_ini[87,22] = y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5])
jac_ini[87,49] = y[40]*sin(x[22] - y[5]) + y[41]*cos(x[22] - y[5])
jac_ini[87,50] = -y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5])
jac_ini[87,85] = y[4]*sin(x[22] - y[5])
jac_ini[87,86] = y[4]*cos(x[22] - y[5])
jac_ini[88,22] = -y[4]*y[40]*sin(x[22] - y[5]) - y[4]*y[41]*cos(x[22] - y[5])
jac_ini[88,49] = y[40]*cos(x[22] - y[5]) - y[41]*sin(x[22] - y[5])
jac_ini[88,50] = y[4]*y[40]*sin(x[22] - y[5]) + y[4]*y[41]*cos(x[22] - y[5])
jac_ini[88,85] = y[4]*cos(x[22] - y[5])
jac_ini[88,86] = -y[4]*sin(x[22] - y[5])
jac_ini[89,26] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (-p[115], True)]))
jac_ini[89,27] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[116], True)]))
jac_ini[89,93] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[115], True)]))
jac_ini[93,32] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*(-p[128]/p[129] + 1), True)]))
jac_ini[93,92] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*p[128]/p[129], True)]))
jac_ini[94,33] = -y[6]*sin(x[33] - y[7])
jac_ini[94,51] = cos(x[33] - y[7])
jac_ini[94,52] = y[6]*sin(x[33] - y[7])
jac_ini[95,33] = y[6]*cos(x[33] - y[7])
jac_ini[95,51] = sin(x[33] - y[7])
jac_ini[95,52] = -y[6]*cos(x[33] - y[7])
jac_ini[96,33] = y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7])
jac_ini[96,51] = y[49]*sin(x[33] - y[7]) + y[50]*cos(x[33] - y[7])
jac_ini[96,52] = -y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7])
jac_ini[96,94] = y[6]*sin(x[33] - y[7])
jac_ini[96,95] = y[6]*cos(x[33] - y[7])
jac_ini[97,33] = -y[6]*y[49]*sin(x[33] - y[7]) - y[6]*y[50]*cos(x[33] - y[7])
jac_ini[97,51] = y[49]*cos(x[33] - y[7]) - y[50]*sin(x[33] - y[7])
jac_ini[97,52] = y[6]*y[49]*sin(x[33] - y[7]) + y[6]*y[50]*cos(x[33] - y[7])
jac_ini[97,94] = y[6]*cos(x[33] - y[7])
jac_ini[97,95] = -y[6]*sin(x[33] - y[7])
jac_ini[98,37] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (-p[145], True)]))
jac_ini[98,38] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[146], True)]))
jac_ini[98,102] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[145], True)]))
jac_ini[102,43] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*(-p[158]/p[159] + 1), True)]))
jac_ini[102,101] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*p[158]/p[159], True)]))
# The entries below depend only on parameters and inputs, so they are
# recomputed only when the update flag xyup is set.
if xyup == 1:
jac_ini[0,0] = -p[53]
jac_ini[0,1] = p[43]
jac_ini[0,103] = -p[43]
jac_ini[1,1] = -p[51]/(2*p[44])
jac_ini[1,73] = 1/(2*p[44])
jac_ini[1,103] = p[51]/(2*p[44])
jac_ini[2,2] = -1/p[45]
jac_ini[2,67] = (p[49] - p[47])/p[45]
jac_ini[2,71] = 1/p[45]
jac_ini[3,3] = -1/p[46]
jac_ini[3,68] = (-p[50] + p[48])/p[46]
jac_ini[4,4] = -1/p[57]
jac_ini[4,45] = 1/p[57]
jac_ini[5,4] = p[55]*p[60] - 1
jac_ini[5,5] = -p[56]*p[60]
jac_ini[5,71] = p[60]
jac_ini[5,75] = -p[55]*p[60] + 1
jac_ini[6,6] = -1/p[62]
jac_ini[6,72] = 1/p[62]
jac_ini[7,6] = 1/p[64]
jac_ini[7,7] = -1/p[64]
jac_ini[8,8] = -1.0e-6
jac_ini[8,69] = -p[65]
jac_ini[9,1] = 1/p[67]
jac_ini[9,9] = -1/p[67]
jac_ini[10,10] = -1/p[69]
jac_ini[10,74] = 1/p[69]
jac_ini[11,11] = -p[83]
jac_ini[11,12] = p[73]
jac_ini[11,103] = -p[73]
jac_ini[12,12] = -p[81]/(2*p[74])
jac_ini[12,82] = 1/(2*p[74])
jac_ini[12,103] = p[81]/(2*p[74])
jac_ini[13,13] = -1/p[75]
jac_ini[13,76] = (p[79] - p[77])/p[75]
jac_ini[13,80] = 1/p[75]
jac_ini[14,14] = -1/p[76]
jac_ini[14,77] = (-p[80] + p[78])/p[76]
jac_ini[15,15] = -1/p[87]
jac_ini[15,47] = 1/p[87]
jac_ini[16,15] = p[85]*p[90] - 1
jac_ini[16,16] = -p[86]*p[90]
jac_ini[16,80] = p[90]
jac_ini[16,84] = -p[85]*p[90] + 1
jac_ini[17,17] = -1/p[92]
jac_ini[17,81] = 1/p[92]
jac_ini[18,17] = 1/p[94]
jac_ini[18,18] = -1/p[94]
jac_ini[19,19] = -1.0e-6
jac_ini[19,78] = -p[95]
jac_ini[20,12] = 1/p[97]
jac_ini[20,20] = -1/p[97]
jac_ini[21,21] = -1/p[99]
jac_ini[21,83] = 1/p[99]
jac_ini[22,22] = -p[113]
jac_ini[22,23] = p[103]
jac_ini[22,103] = -p[103]
jac_ini[23,23] = -p[111]/(2*p[104])
jac_ini[23,91] = 1/(2*p[104])
jac_ini[23,103] = p[111]/(2*p[104])
jac_ini[24,24] = -1/p[105]
jac_ini[24,85] = (p[109] - p[107])/p[105]
jac_ini[24,89] = 1/p[105]
jac_ini[25,25] = -1/p[106]
jac_ini[25,86] = (-p[110] + p[108])/p[106]
jac_ini[26,26] = -1/p[117]
jac_ini[26,49] = 1/p[117]
jac_ini[27,26] = p[115]*p[120] - 1
jac_ini[27,27] = -p[116]*p[120]
jac_ini[27,89] = p[120]
jac_ini[27,93] = -p[115]*p[120] + 1
jac_ini[28,28] = -1/p[122]
jac_ini[28,90] = 1/p[122]
jac_ini[29,28] = 1/p[124]
jac_ini[29,29] = -1/p[124]
jac_ini[30,30] = -1.0e-6
jac_ini[30,87] = -p[125]
jac_ini[31,23] = 1/p[127]
jac_ini[31,31] = -1/p[127]
jac_ini[32,32] = -1/p[129]
jac_ini[32,92] = 1/p[129]
jac_ini[33,33] = -p[143]
jac_ini[33,34] = p[133]
jac_ini[33,103] = -p[133]
jac_ini[34,34] = -p[141]/(2*p[134])
jac_ini[34,100] = 1/(2*p[134])
jac_ini[34,103] = p[141]/(2*p[134])
jac_ini[35,35] = -1/p[135]
jac_ini[35,94] = (p[139] - p[137])/p[135]
jac_ini[35,98] = 1/p[135]
jac_ini[36,36] = -1/p[136]
jac_ini[36,95] = (-p[140] + p[138])/p[136]
jac_ini[37,37] = -1/p[147]
jac_ini[37,51] = 1/p[147]
jac_ini[38,37] = p[145]*p[150] - 1
jac_ini[38,38] = -p[146]*p[150]
jac_ini[38,98] = p[150]
jac_ini[38,102] = -p[145]*p[150] + 1
jac_ini[39,39] = -1/p[152]
jac_ini[39,99] = 1/p[152]
jac_ini[40,39] = 1/p[154]
jac_ini[40,40] = -1/p[154]
jac_ini[41,41] = -1.0e-6
jac_ini[41,96] = -p[155]
jac_ini[42,34] = 1/p[157]
jac_ini[42,42] = -1/p[157]
jac_ini[43,43] = -1/p[159]
jac_ini[43,101] = 1/p[159]
jac_ini[44,103] = -1
jac_ini[45,69] = -p[42]/p[0]
jac_ini[46,70] = -p[42]/p[0]
jac_ini[47,78] = -p[72]/p[0]
jac_ini[48,79] = -p[72]/p[0]
jac_ini[49,87] = -p[102]/p[0]
jac_ini[50,88] = -p[102]/p[0]
jac_ini[51,96] = -p[132]/p[0]
jac_ini[52,97] = -p[132]/p[0]
jac_ini[67,2] = -1
jac_ini[67,67] = p[49]
jac_ini[67,68] = p[52]
jac_ini[68,3] = -1
jac_ini[68,67] = p[52]
jac_ini[68,68] = -p[50]
jac_ini[69,69] = -1
jac_ini[70,70] = -1
jac_ini[71,71] = -1
jac_ini[72,1] = -1/p[61]
jac_ini[72,8] = 1
jac_ini[72,72] = -1
jac_ini[72,104] = p[54]
jac_ini[73,6] = p[63]/p[64]
jac_ini[73,7] = -p[63]/p[64] + 1
jac_ini[73,73] = -1
jac_ini[74,1] = 1
jac_ini[74,9] = -1
jac_ini[74,74] = -1
jac_ini[75,75] = -1
jac_ini[76,13] = -1
jac_ini[76,76] = p[79]
jac_ini[76,77] = p[82]
jac_ini[77,14] = -1
jac_ini[77,76] = p[82]
jac_ini[77,77] = -p[80]
jac_ini[78,78] = -1
jac_ini[79,79] = -1
jac_ini[80,80] = -1
jac_ini[81,12] = -1/p[91]
jac_ini[81,19] = 1
jac_ini[81,81] = -1
jac_ini[81,104] = p[84]
jac_ini[82,17] = p[93]/p[94]
jac_ini[82,18] = -p[93]/p[94] + 1
jac_ini[82,82] = -1
jac_ini[83,12] = 1
jac_ini[83,20] = -1
jac_ini[83,83] = -1
jac_ini[84,84] = -1
jac_ini[85,24] = -1
jac_ini[85,85] = p[109]
jac_ini[85,86] = p[112]
jac_ini[86,25] = -1
jac_ini[86,85] = p[112]
jac_ini[86,86] = -p[110]
jac_ini[87,87] = -1
jac_ini[88,88] = -1
jac_ini[89,89] = -1
jac_ini[90,23] = -1/p[121]
jac_ini[90,30] = 1
jac_ini[90,90] = -1
jac_ini[90,104] = p[114]
jac_ini[91,28] = p[123]/p[124]
jac_ini[91,29] = -p[123]/p[124] + 1
jac_ini[91,91] = -1
jac_ini[92,23] = 1
jac_ini[92,31] = -1
jac_ini[92,92] = -1
jac_ini[93,93] = -1
jac_ini[94,35] = -1
jac_ini[94,94] = p[139]
jac_ini[94,95] = p[142]
jac_ini[95,36] = -1
jac_ini[95,94] = p[142]
jac_ini[95,95] = -p[140]
jac_ini[96,96] = -1
jac_ini[97,97] = -1
jac_ini[98,98] = -1
jac_ini[99,34] = -1/p[151]
jac_ini[99,41] = 1
jac_ini[99,99] = -1
jac_ini[99,104] = p[144]
jac_ini[100,39] = p[153]/p[154]
jac_ini[100,40] = -p[153]/p[154] + 1
jac_ini[100,100] = -1
jac_ini[101,34] = 1
jac_ini[101,42] = -1
jac_ini[101,101] = -1
jac_ini[102,102] = -1
jac_ini[103,1] = p[44]*p[42]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_ini[103,12] = p[74]*p[72]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_ini[103,23] = p[104]*p[102]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_ini[103,34] = p[134]*p[132]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_ini[103,103] = -1
jac_ini[104,44] = p[163]
jac_ini[104,103] = -p[162]
jac_ini[104,104] = -1
# Auto-generated: evaluates the non-constant entries of the run-mode Jacobian
# in place. jac_run is the dense Jacobian matrix, x the dynamic states, y the
# algebraic states, u the inputs and p the parameters. When xyup == 1 the
# constant, parameter-only entries are refreshed as well.
@numba.njit(cache=True)
def jac_run_ss_eval(jac_run, x, y, u, p, xyup=0):
jac_run[1,0] = (-y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1]))/(2*p[44])
jac_run[1,45] = (-y[22]*sin(x[0] - y[1]) - y[23]*cos(x[0] - y[1]))/(2*p[44])
jac_run[1,46] = (y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1]))/(2*p[44])
jac_run[1,67] = (-2*p[52]*y[22] - y[0]*sin(x[0] - y[1]))/(2*p[44])
jac_run[1,68] = (-2*p[52]*y[23] - y[0]*cos(x[0] - y[1]))/(2*p[44])
jac_run[12,11] = (-y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3]))/(2*p[74])
jac_run[12,47] = (-y[31]*sin(x[11] - y[3]) - y[32]*cos(x[11] - y[3]))/(2*p[74])
jac_run[12,48] = (y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3]))/(2*p[74])
jac_run[12,76] = (-2*p[82]*y[31] - y[2]*sin(x[11] - y[3]))/(2*p[74])
jac_run[12,77] = (-2*p[82]*y[32] - y[2]*cos(x[11] - y[3]))/(2*p[74])
jac_run[23,22] = (-y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5]))/(2*p[104])
jac_run[23,49] = (-y[40]*sin(x[22] - y[5]) - y[41]*cos(x[22] - y[5]))/(2*p[104])
jac_run[23,50] = (y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5]))/(2*p[104])
jac_run[23,85] = (-2*p[112]*y[40] - y[4]*sin(x[22] - y[5]))/(2*p[104])
jac_run[23,86] = (-2*p[112]*y[41] - y[4]*cos(x[22] - y[5]))/(2*p[104])
jac_run[34,33] = (-y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7]))/(2*p[134])
jac_run[34,51] = (-y[49]*sin(x[33] - y[7]) - y[50]*cos(x[33] - y[7]))/(2*p[134])
jac_run[34,52] = (y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7]))/(2*p[134])
jac_run[34,94] = (-2*p[142]*y[49] - y[6]*sin(x[33] - y[7]))/(2*p[134])
jac_run[34,95] = (-2*p[142]*y[50] - y[6]*cos(x[33] - y[7]))/(2*p[134])
jac_run[45,45] = 2*y[0]*p[1] + y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_run[45,46] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_run[45,53] = y[0]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_run[45,54] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_run[46,45] = 2*y[0]*(-p[2] - p[3]/2) + y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_run[46,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_run[46,53] = y[0]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_run[46,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_run[47,47] = 2*y[2]*p[4] + y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_run[47,48] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_run[47,55] = y[2]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_run[47,56] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_run[48,47] = 2*y[2]*(-p[5] - p[6]/2) + y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_run[48,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_run[48,55] = y[2]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_run[48,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_run[49,49] = y[20]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5])) + 2*y[4]*p[7]
jac_run[49,50] = y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_run[49,65] = y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_run[49,66] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_run[50,49] = y[20]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5])) + 2*y[4]*(-p[8] - p[9]/2)
jac_run[50,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_run[50,65] = y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_run[50,66] = y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_run[51,51] = y[18]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + 2*y[6]*p[10]
jac_run[51,52] = y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_run[51,63] = y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_run[51,64] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_run[52,51] = y[18]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + 2*y[6]*(-p[11] - p[12]/2)
jac_run[52,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_run[52,63] = y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_run[52,64] = y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_run[53,45] = y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_run[53,46] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_run[53,53] = y[0]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + 2*y[8]*(p[1] + p[13]) + y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_run[53,54] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_run[53,55] = y[8]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_run[53,56] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_run[54,45] = y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_run[54,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_run[54,53] = y[0]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9])) + 2*y[8]*(-p[2] - p[14] - p[3]/2 - p[15]/2) + y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_run[54,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_run[54,55] = y[8]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_run[54,56] = y[8]*y[10]*(p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_run[55,47] = y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_run[55,48] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_run[55,53] = y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_run[55,54] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_run[55,55] = y[2]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + 2*y[10]*(p[4] + p[13] + p[16]) + y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_run[55,56] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11])) + y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_run[55,57] = y[10]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_run[55,58] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_run[56,47] = y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_run[56,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_run[56,53] = y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_run[56,54] = y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_run[56,55] = y[2]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11])) + y[8]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11])) + 2*y[10]*(-p[5] - p[14] - p[17] - p[6]/2 - p[15]/2 - p[18]/2) + y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_run[56,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_run[56,57] = y[10]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_run[56,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_run[57,55] = y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_run[57,56] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_run[57,57] = y[10]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + 2*y[12]*(p[16] + 2*p[19]) + y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_run[57,58] = y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_run[57,59] = y[12]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_run[57,60] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_run[58,55] = y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_run[58,56] = y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_run[58,57] = y[10]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13])) + 2*y[12]*(-p[17] - 2*p[20] - p[18]/2 - p[21]) + y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_run[58,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_run[58,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_run[58,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_run[59,57] = y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_run[59,58] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_run[59,59] = y[12]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + 2*y[14]*(2*p[19] + 2*p[22]) + y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_run[59,60] = y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_run[59,61] = y[14]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_run[59,62] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_run[60,57] = y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_run[60,58] = y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_run[60,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15])) + 2*y[14]*(-2*p[20] - 2*p[23] - p[21] - p[24]) + y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_run[60,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_run[60,61] = y[14]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_run[60,62] = y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_run[61,59] = y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_run[61,60] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_run[61,61] = y[18]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17])) + 2*y[16]*(2*p[22] + p[25])
jac_run[61,62] = y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_run[61,63] = y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_run[61,64] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_run[62,59] = y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_run[62,60] = y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_run[62,61] = y[18]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17])) + y[14]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17])) + 2*y[16]*(-2*p[23] - p[26] - p[24] - p[27]/2)
jac_run[62,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_run[62,63] = y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_run[62,64] = y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_run[63,51] = y[18]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_run[63,52] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_run[63,61] = y[18]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_run[63,62] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_run[63,63] = 2*y[18]*(p[28] + p[10] + p[25]) + y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_run[63,64] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_run[63,65] = y[18]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_run[63,66] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_run[64,51] = y[18]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_run[64,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_run[64,61] = y[18]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_run[64,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_run[64,63] = 2*y[18]*(-p[29] - p[11] - p[26] - p[30]/2 - p[12]/2 - p[27]/2) + y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7])) + y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_run[64,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_run[64,65] = y[18]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_run[64,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_run[65,49] = y[20]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_run[65,50] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_run[65,63] = y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_run[65,64] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_run[65,65] = y[18]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + 2*y[20]*(p[28] + p[7]) + y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_run[65,66] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_run[66,49] = y[20]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_run[66,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_run[66,63] = y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_run[66,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_run[66,65] = y[18]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + 2*y[20]*(-p[29] - p[8] - p[30]/2 - p[9]/2) + y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_run[66,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_run[67,0] = -y[0]*sin(x[0] - y[1])
jac_run[67,45] = cos(x[0] - y[1])
jac_run[67,46] = y[0]*sin(x[0] - y[1])
jac_run[68,0] = y[0]*cos(x[0] - y[1])
jac_run[68,45] = sin(x[0] - y[1])
jac_run[68,46] = -y[0]*cos(x[0] - y[1])
jac_run[69,0] = y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1])
jac_run[69,45] = y[22]*sin(x[0] - y[1]) + y[23]*cos(x[0] - y[1])
jac_run[69,46] = -y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1])
jac_run[69,67] = y[0]*sin(x[0] - y[1])
jac_run[69,68] = y[0]*cos(x[0] - y[1])
jac_run[70,0] = -y[0]*y[22]*sin(x[0] - y[1]) - y[0]*y[23]*cos(x[0] - y[1])
jac_run[70,45] = y[22]*cos(x[0] - y[1]) - y[23]*sin(x[0] - y[1])
jac_run[70,46] = y[0]*y[22]*sin(x[0] - y[1]) + y[0]*y[23]*cos(x[0] - y[1])
jac_run[70,67] = y[0]*cos(x[0] - y[1])
jac_run[70,68] = -y[0]*sin(x[0] - y[1])
jac_run[71,4] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (-p[55], True)]))
jac_run[71,5] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[56], True)]))
jac_run[71,75] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[55], True)]))
jac_run[75,10] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*(-p[68]/p[69] + 1), True)]))
jac_run[75,74] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*p[68]/p[69], True)]))
jac_run[76,11] = -y[2]*sin(x[11] - y[3])
jac_run[76,47] = cos(x[11] - y[3])
jac_run[76,48] = y[2]*sin(x[11] - y[3])
jac_run[77,11] = y[2]*cos(x[11] - y[3])
jac_run[77,47] = sin(x[11] - y[3])
jac_run[77,48] = -y[2]*cos(x[11] - y[3])
jac_run[78,11] = y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3])
jac_run[78,47] = y[31]*sin(x[11] - y[3]) + y[32]*cos(x[11] - y[3])
jac_run[78,48] = -y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3])
jac_run[78,76] = y[2]*sin(x[11] - y[3])
jac_run[78,77] = y[2]*cos(x[11] - y[3])
jac_run[79,11] = -y[2]*y[31]*sin(x[11] - y[3]) - y[2]*y[32]*cos(x[11] - y[3])
jac_run[79,47] = y[31]*cos(x[11] - y[3]) - y[32]*sin(x[11] - y[3])
jac_run[79,48] = y[2]*y[31]*sin(x[11] - y[3]) + y[2]*y[32]*cos(x[11] - y[3])
jac_run[79,76] = y[2]*cos(x[11] - y[3])
jac_run[79,77] = -y[2]*sin(x[11] - y[3])
jac_run[80,15] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (-p[85], True)]))
jac_run[80,16] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[86], True)]))
jac_run[80,84] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[85], True)]))
jac_run[84,21] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*(-p[98]/p[99] + 1), True)]))
jac_run[84,83] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*p[98]/p[99], True)]))
jac_run[85,22] = -y[4]*sin(x[22] - y[5])
jac_run[85,49] = cos(x[22] - y[5])
jac_run[85,50] = y[4]*sin(x[22] - y[5])
jac_run[86,22] = y[4]*cos(x[22] - y[5])
jac_run[86,49] = sin(x[22] - y[5])
jac_run[86,50] = -y[4]*cos(x[22] - y[5])
jac_run[87,22] = y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5])
jac_run[87,49] = y[40]*sin(x[22] - y[5]) + y[41]*cos(x[22] - y[5])
jac_run[87,50] = -y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5])
jac_run[87,85] = y[4]*sin(x[22] - y[5])
jac_run[87,86] = y[4]*cos(x[22] - y[5])
jac_run[88,22] = -y[4]*y[40]*sin(x[22] - y[5]) - y[4]*y[41]*cos(x[22] - y[5])
jac_run[88,49] = y[40]*cos(x[22] - y[5]) - y[41]*sin(x[22] - y[5])
jac_run[88,50] = y[4]*y[40]*sin(x[22] - y[5]) + y[4]*y[41]*cos(x[22] - y[5])
jac_run[88,85] = y[4]*cos(x[22] - y[5])
jac_run[88,86] = -y[4]*sin(x[22] - y[5])
jac_run[89,26] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (-p[115], True)]))
jac_run[89,27] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[116], True)]))
jac_run[89,93] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[115], True)]))
jac_run[93,32] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*(-p[128]/p[129] + 1), True)]))
jac_run[93,92] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*p[128]/p[129], True)]))
jac_run[94,33] = -y[6]*sin(x[33] - y[7])
jac_run[94,51] = cos(x[33] - y[7])
jac_run[94,52] = y[6]*sin(x[33] - y[7])
jac_run[95,33] = y[6]*cos(x[33] - y[7])
jac_run[95,51] = sin(x[33] - y[7])
jac_run[95,52] = -y[6]*cos(x[33] - y[7])
jac_run[96,33] = y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7])
jac_run[96,51] = y[49]*sin(x[33] - y[7]) + y[50]*cos(x[33] - y[7])
jac_run[96,52] = -y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7])
jac_run[96,94] = y[6]*sin(x[33] - y[7])
jac_run[96,95] = y[6]*cos(x[33] - y[7])
jac_run[97,33] = -y[6]*y[49]*sin(x[33] - y[7]) - y[6]*y[50]*cos(x[33] - y[7])
jac_run[97,51] = y[49]*cos(x[33] - y[7]) - y[50]*sin(x[33] - y[7])
jac_run[97,52] = y[6]*y[49]*sin(x[33] - y[7]) + y[6]*y[50]*cos(x[33] - y[7])
jac_run[97,94] = y[6]*cos(x[33] - y[7])
jac_run[97,95] = -y[6]*sin(x[33] - y[7])
jac_run[98,37] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (-p[145], True)]))
jac_run[98,38] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[146], True)]))
jac_run[98,102] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[145], True)]))
jac_run[102,43] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*(-p[158]/p[159] + 1), True)]))
jac_run[102,101] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*p[158]/p[159], True)]))
if xyup == 1:
jac_run[0,0] = -p[53]
jac_run[0,1] = p[43]
jac_run[0,103] = -p[43]
jac_run[1,1] = -p[51]/(2*p[44])
jac_run[1,73] = 1/(2*p[44])
jac_run[1,103] = p[51]/(2*p[44])
jac_run[2,2] = -1/p[45]
jac_run[2,67] = (p[49] - p[47])/p[45]
jac_run[2,71] = 1/p[45]
jac_run[3,3] = -1/p[46]
jac_run[3,68] = (-p[50] + p[48])/p[46]
jac_run[4,4] = -1/p[57]
jac_run[4,45] = 1/p[57]
jac_run[5,4] = p[55]*p[60] - 1
jac_run[5,5] = -p[56]*p[60]
jac_run[5,71] = p[60]
jac_run[5,75] = -p[55]*p[60] + 1
jac_run[6,6] = -1/p[62]
jac_run[6,72] = 1/p[62]
jac_run[7,6] = 1/p[64]
jac_run[7,7] = -1/p[64]
jac_run[8,8] = -1e-6
jac_run[8,69] = -p[65]
jac_run[9,1] = 1/p[67]
jac_run[9,9] = -1/p[67]
jac_run[10,10] = -1/p[69]
jac_run[10,74] = 1/p[69]
jac_run[11,11] = -p[83]
jac_run[11,12] = p[73]
jac_run[11,103] = -p[73]
jac_run[12,12] = -p[81]/(2*p[74])
jac_run[12,82] = 1/(2*p[74])
jac_run[12,103] = p[81]/(2*p[74])
jac_run[13,13] = -1/p[75]
jac_run[13,76] = (p[79] - p[77])/p[75]
jac_run[13,80] = 1/p[75]
jac_run[14,14] = -1/p[76]
jac_run[14,77] = (-p[80] + p[78])/p[76]
jac_run[15,15] = -1/p[87]
jac_run[15,47] = 1/p[87]
jac_run[16,15] = p[85]*p[90] - 1
jac_run[16,16] = -p[86]*p[90]
jac_run[16,80] = p[90]
jac_run[16,84] = -p[85]*p[90] + 1
jac_run[17,17] = -1/p[92]
jac_run[17,81] = 1/p[92]
jac_run[18,17] = 1/p[94]
jac_run[18,18] = -1/p[94]
jac_run[19,19] = -1e-6
jac_run[19,78] = -p[95]
jac_run[20,12] = 1/p[97]
jac_run[20,20] = -1/p[97]
jac_run[21,21] = -1/p[99]
jac_run[21,83] = 1/p[99]
jac_run[22,22] = -p[113]
jac_run[22,23] = p[103]
jac_run[22,103] = -p[103]
jac_run[23,23] = -p[111]/(2*p[104])
jac_run[23,91] = 1/(2*p[104])
jac_run[23,103] = p[111]/(2*p[104])
jac_run[24,24] = -1/p[105]
jac_run[24,85] = (p[109] - p[107])/p[105]
jac_run[24,89] = 1/p[105]
jac_run[25,25] = -1/p[106]
jac_run[25,86] = (-p[110] + p[108])/p[106]
jac_run[26,26] = -1/p[117]
jac_run[26,49] = 1/p[117]
jac_run[27,26] = p[115]*p[120] - 1
jac_run[27,27] = -p[116]*p[120]
jac_run[27,89] = p[120]
jac_run[27,93] = -p[115]*p[120] + 1
jac_run[28,28] = -1/p[122]
jac_run[28,90] = 1/p[122]
jac_run[29,28] = 1/p[124]
jac_run[29,29] = -1/p[124]
jac_run[30,30] = -1e-6
jac_run[30,87] = -p[125]
jac_run[31,23] = 1/p[127]
jac_run[31,31] = -1/p[127]
jac_run[32,32] = -1/p[129]
jac_run[32,92] = 1/p[129]
jac_run[33,33] = -p[143]
jac_run[33,34] = p[133]
jac_run[33,103] = -p[133]
jac_run[34,34] = -p[141]/(2*p[134])
jac_run[34,100] = 1/(2*p[134])
jac_run[34,103] = p[141]/(2*p[134])
jac_run[35,35] = -1/p[135]
jac_run[35,94] = (p[139] - p[137])/p[135]
jac_run[35,98] = 1/p[135]
jac_run[36,36] = -1/p[136]
jac_run[36,95] = (-p[140] + p[138])/p[136]
jac_run[37,37] = -1/p[147]
jac_run[37,51] = 1/p[147]
jac_run[38,37] = p[145]*p[150] - 1
jac_run[38,38] = -p[146]*p[150]
jac_run[38,98] = p[150]
jac_run[38,102] = -p[145]*p[150] + 1
jac_run[39,39] = -1/p[152]
jac_run[39,99] = 1/p[152]
jac_run[40,39] = 1/p[154]
jac_run[40,40] = -1/p[154]
jac_run[41,41] = -1e-6
jac_run[41,96] = -p[155]
jac_run[42,34] = 1/p[157]
jac_run[42,42] = -1/p[157]
jac_run[43,43] = -1/p[159]
jac_run[43,101] = 1/p[159]
jac_run[44,103] = -1
jac_run[45,69] = -p[42]/p[0]
jac_run[46,70] = -p[42]/p[0]
jac_run[47,78] = -p[72]/p[0]
jac_run[48,79] = -p[72]/p[0]
jac_run[49,87] = -p[102]/p[0]
jac_run[50,88] = -p[102]/p[0]
jac_run[51,96] = -p[132]/p[0]
jac_run[52,97] = -p[132]/p[0]
jac_run[67,2] = -1
jac_run[67,67] = p[49]
jac_run[67,68] = p[52]
jac_run[68,3] = -1
jac_run[68,67] = p[52]
jac_run[68,68] = -p[50]
jac_run[69,69] = -1
jac_run[70,70] = -1
jac_run[71,71] = -1
jac_run[72,1] = -1/p[61]
jac_run[72,8] = 1
jac_run[72,72] = -1
jac_run[72,104] = p[54]
jac_run[73,6] = p[63]/p[64]
jac_run[73,7] = -p[63]/p[64] + 1
jac_run[73,73] = -1
jac_run[74,1] = 1
jac_run[74,9] = -1
jac_run[74,74] = -1
jac_run[75,75] = -1
jac_run[76,13] = -1
jac_run[76,76] = p[79]
jac_run[76,77] = p[82]
jac_run[77,14] = -1
jac_run[77,76] = p[82]
jac_run[77,77] = -p[80]
jac_run[78,78] = -1
jac_run[79,79] = -1
jac_run[80,80] = -1
jac_run[81,12] = -1/p[91]
jac_run[81,19] = 1
jac_run[81,81] = -1
jac_run[81,104] = p[84]
jac_run[82,17] = p[93]/p[94]
jac_run[82,18] = -p[93]/p[94] + 1
jac_run[82,82] = -1
jac_run[83,12] = 1
jac_run[83,20] = -1
jac_run[83,83] = -1
jac_run[84,84] = -1
jac_run[85,24] = -1
jac_run[85,85] = p[109]
jac_run[85,86] = p[112]
jac_run[86,25] = -1
jac_run[86,85] = p[112]
jac_run[86,86] = -p[110]
jac_run[87,87] = -1
jac_run[88,88] = -1
jac_run[89,89] = -1
jac_run[90,23] = -1/p[121]
jac_run[90,30] = 1
jac_run[90,90] = -1
jac_run[90,104] = p[114]
jac_run[91,28] = p[123]/p[124]
jac_run[91,29] = -p[123]/p[124] + 1
jac_run[91,91] = -1
jac_run[92,23] = 1
jac_run[92,31] = -1
jac_run[92,92] = -1
jac_run[93,93] = -1
jac_run[94,35] = -1
jac_run[94,94] = p[139]
jac_run[94,95] = p[142]
jac_run[95,36] = -1
jac_run[95,94] = p[142]
jac_run[95,95] = -p[140]
jac_run[96,96] = -1
jac_run[97,97] = -1
jac_run[98,98] = -1
jac_run[99,34] = -1/p[151]
jac_run[99,41] = 1
jac_run[99,99] = -1
jac_run[99,104] = p[144]
jac_run[100,39] = p[153]/p[154]
jac_run[100,40] = -p[153]/p[154] + 1
jac_run[100,100] = -1
jac_run[101,34] = 1
jac_run[101,42] = -1
jac_run[101,101] = -1
jac_run[102,102] = -1
jac_run[103,1] = p[44]*p[42]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_run[103,12] = p[74]*p[72]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_run[103,23] = p[104]*p[102]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_run[103,34] = p[134]*p[132]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_run[103,103] = -1
jac_run[104,44] = p[163]
jac_run[104,103] = -p[162]
jac_run[104,104] = -1
# Auto-generated: evaluates the trapezoidal-integration Jacobian in place.
# Note: Dt (the integration time step) is not passed as an argument and is
# assumed to be defined at module scope; numba freezes its value at compile
# time, so recompilation is needed if Dt changes.
@numba.njit(cache=True)
def jac_trap_eval(jac_trap, x, y, u, p, xyup=0):
jac_trap[1,0] = -0.25*Dt*(-y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1]))/p[44]
jac_trap[1,45] = -0.25*Dt*(-y[22]*sin(x[0] - y[1]) - y[23]*cos(x[0] - y[1]))/p[44]
jac_trap[1,46] = -0.25*Dt*(y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1]))/p[44]
jac_trap[1,67] = -0.25*Dt*(-2*p[52]*y[22] - y[0]*sin(x[0] - y[1]))/p[44]
jac_trap[1,68] = -0.25*Dt*(-2*p[52]*y[23] - y[0]*cos(x[0] - y[1]))/p[44]
jac_trap[12,11] = -0.25*Dt*(-y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3]))/p[74]
jac_trap[12,47] = -0.25*Dt*(-y[31]*sin(x[11] - y[3]) - y[32]*cos(x[11] - y[3]))/p[74]
jac_trap[12,48] = -0.25*Dt*(y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3]))/p[74]
jac_trap[12,76] = -0.25*Dt*(-2*p[82]*y[31] - y[2]*sin(x[11] - y[3]))/p[74]
jac_trap[12,77] = -0.25*Dt*(-2*p[82]*y[32] - y[2]*cos(x[11] - y[3]))/p[74]
jac_trap[23,22] = -0.25*Dt*(-y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5]))/p[104]
jac_trap[23,49] = -0.25*Dt*(-y[40]*sin(x[22] - y[5]) - y[41]*cos(x[22] - y[5]))/p[104]
jac_trap[23,50] = -0.25*Dt*(y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5]))/p[104]
jac_trap[23,85] = -0.25*Dt*(-2*p[112]*y[40] - y[4]*sin(x[22] - y[5]))/p[104]
jac_trap[23,86] = -0.25*Dt*(-2*p[112]*y[41] - y[4]*cos(x[22] - y[5]))/p[104]
jac_trap[34,33] = -0.25*Dt*(-y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7]))/p[134]
jac_trap[34,51] = -0.25*Dt*(-y[49]*sin(x[33] - y[7]) - y[50]*cos(x[33] - y[7]))/p[134]
jac_trap[34,52] = -0.25*Dt*(y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7]))/p[134]
jac_trap[34,94] = -0.25*Dt*(-2*p[142]*y[49] - y[6]*sin(x[33] - y[7]))/p[134]
jac_trap[34,95] = -0.25*Dt*(-2*p[142]*y[50] - y[6]*cos(x[33] - y[7]))/p[134]
jac_trap[45,45] = 2*y[0]*p[1] + y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_trap[45,46] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_trap[45,53] = y[0]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_trap[45,54] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_trap[46,45] = 2*y[0]*(-p[2] - p[3]/2) + y[8]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_trap[46,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_trap[46,53] = y[0]*(p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9]))
jac_trap[46,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_trap[47,47] = 2*y[2]*p[4] + y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_trap[47,48] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_trap[47,55] = y[2]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_trap[47,56] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_trap[48,47] = 2*y[2]*(-p[5] - p[6]/2) + y[10]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_trap[48,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_trap[48,55] = y[2]*(p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11]))
jac_trap[48,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_trap[49,49] = y[20]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5])) + 2*y[4]*p[7]
jac_trap[49,50] = y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_trap[49,65] = y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_trap[49,66] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_trap[50,49] = y[20]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5])) + 2*y[4]*(-p[8] - p[9]/2)
jac_trap[50,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_trap[50,65] = y[4]*(p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_trap[50,66] = y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_trap[51,51] = y[18]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + 2*y[6]*p[10]
jac_trap[51,52] = y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_trap[51,63] = y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_trap[51,64] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_trap[52,51] = y[18]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + 2*y[6]*(-p[11] - p[12]/2)
jac_trap[52,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_trap[52,63] = y[6]*(p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7]))
jac_trap[52,64] = y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_trap[53,45] = y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9]))
jac_trap[53,46] = y[0]*y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_trap[53,53] = y[0]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + 2*y[8]*(p[1] + p[13]) + y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_trap[53,54] = y[0]*y[8]*(-p[2]*cos(y[1] - y[9]) - p[1]*sin(y[1] - y[9])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_trap[53,55] = y[8]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_trap[53,56] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_trap[54,45] = y[8]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9]))
jac_trap[54,46] = y[0]*y[8]*(-p[2]*sin(y[1] - y[9]) + p[1]*cos(y[1] - y[9]))
jac_trap[54,53] = y[0]*(p[2]*cos(y[1] - y[9]) + p[1]*sin(y[1] - y[9])) + 2*y[8]*(-p[2] - p[14] - p[3]/2 - p[15]/2) + y[10]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_trap[54,54] = y[0]*y[8]*(p[2]*sin(y[1] - y[9]) - p[1]*cos(y[1] - y[9])) + y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_trap[54,55] = y[8]*(p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11]))
jac_trap[54,56] = y[8]*y[10]*(p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_trap[55,47] = y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11]))
jac_trap[55,48] = y[2]*y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_trap[55,53] = y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11]))
jac_trap[55,54] = y[8]*y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_trap[55,55] = y[2]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + 2*y[10]*(p[4] + p[13] + p[16]) + y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_trap[55,56] = y[2]*y[10]*(-p[5]*cos(y[3] - y[11]) - p[4]*sin(y[3] - y[11])) + y[8]*y[10]*(-p[14]*cos(y[9] - y[11]) - p[13]*sin(y[9] - y[11])) + y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_trap[55,57] = y[10]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_trap[55,58] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_trap[56,47] = y[10]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11]))
jac_trap[56,48] = y[2]*y[10]*(-p[5]*sin(y[3] - y[11]) + p[4]*cos(y[3] - y[11]))
jac_trap[56,53] = y[10]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11]))
jac_trap[56,54] = y[8]*y[10]*(-p[14]*sin(y[9] - y[11]) + p[13]*cos(y[9] - y[11]))
jac_trap[56,55] = y[2]*(p[5]*cos(y[3] - y[11]) + p[4]*sin(y[3] - y[11])) + y[8]*(p[14]*cos(y[9] - y[11]) + p[13]*sin(y[9] - y[11])) + 2*y[10]*(-p[5] - p[14] - p[17] - p[6]/2 - p[15]/2 - p[18]/2) + y[12]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_trap[56,56] = y[2]*y[10]*(p[5]*sin(y[3] - y[11]) - p[4]*cos(y[3] - y[11])) + y[8]*y[10]*(p[14]*sin(y[9] - y[11]) - p[13]*cos(y[9] - y[11])) + y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_trap[56,57] = y[10]*(p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13]))
jac_trap[56,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_trap[57,55] = y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13]))
jac_trap[57,56] = y[10]*y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_trap[57,57] = y[10]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + 2*y[12]*(p[16] + 2*p[19]) + y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_trap[57,58] = y[10]*y[12]*(-p[17]*cos(y[11] - y[13]) - p[16]*sin(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_trap[57,59] = y[12]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_trap[57,60] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_trap[58,55] = y[12]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13]))
jac_trap[58,56] = y[10]*y[12]*(-p[17]*sin(y[11] - y[13]) + p[16]*cos(y[11] - y[13]))
jac_trap[58,57] = y[10]*(p[17]*cos(y[11] - y[13]) + p[16]*sin(y[11] - y[13])) + 2*y[12]*(-p[17] - 2*p[20] - p[18]/2 - p[21]) + y[14]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_trap[58,58] = y[10]*y[12]*(p[17]*sin(y[11] - y[13]) - p[16]*cos(y[11] - y[13])) + y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_trap[58,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15]))
jac_trap[58,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_trap[59,57] = y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15]))
jac_trap[59,58] = y[12]*y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_trap[59,59] = y[12]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + 2*y[14]*(2*p[19] + 2*p[22]) + y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_trap[59,60] = y[12]*y[14]*(-2*p[20]*cos(y[13] - y[15]) - 2*p[19]*sin(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_trap[59,61] = y[14]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_trap[59,62] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_trap[60,57] = y[14]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15]))
jac_trap[60,58] = y[12]*y[14]*(-2*p[20]*sin(y[13] - y[15]) + 2*p[19]*cos(y[13] - y[15]))
jac_trap[60,59] = y[12]*(2*p[20]*cos(y[13] - y[15]) + 2*p[19]*sin(y[13] - y[15])) + 2*y[14]*(-2*p[20] - 2*p[23] - p[21] - p[24]) + y[16]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_trap[60,60] = y[12]*y[14]*(2*p[20]*sin(y[13] - y[15]) - 2*p[19]*cos(y[13] - y[15])) + y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_trap[60,61] = y[14]*(2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_trap[60,62] = y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_trap[61,59] = y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_trap[61,60] = y[14]*y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_trap[61,61] = y[18]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17])) + 2*y[16]*(2*p[22] + p[25])
jac_trap[61,62] = y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17])) + y[14]*y[16]*(-2*p[23]*cos(y[15] - y[17]) - 2*p[22]*sin(y[15] - y[17]))
jac_trap[61,63] = y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_trap[61,64] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_trap[62,59] = y[16]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17]))
jac_trap[62,60] = y[14]*y[16]*(-2*p[23]*sin(y[15] - y[17]) + 2*p[22]*cos(y[15] - y[17]))
jac_trap[62,61] = y[18]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17])) + y[14]*(2*p[23]*cos(y[15] - y[17]) + 2*p[22]*sin(y[15] - y[17])) + 2*y[16]*(-2*p[23] - p[26] - p[24] - p[27]/2)
jac_trap[62,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17])) + y[14]*y[16]*(2*p[23]*sin(y[15] - y[17]) - 2*p[22]*cos(y[15] - y[17]))
jac_trap[62,63] = y[16]*(p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_trap[62,64] = y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_trap[63,51] = y[18]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7]))
jac_trap[63,52] = y[18]*y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_trap[63,61] = y[18]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_trap[63,62] = y[18]*y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_trap[63,63] = 2*y[18]*(p[28] + p[10] + p[25]) + y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_trap[63,64] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + y[18]*y[6]*(-p[11]*cos(y[19] - y[7]) + p[10]*sin(y[19] - y[7])) + y[18]*y[16]*(-p[26]*cos(y[19] - y[17]) + p[25]*sin(y[19] - y[17]))
jac_trap[63,65] = y[18]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_trap[63,66] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_trap[64,51] = y[18]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7]))
jac_trap[64,52] = y[18]*y[6]*(p[11]*sin(y[19] - y[7]) + p[10]*cos(y[19] - y[7]))
jac_trap[64,61] = y[18]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_trap[64,62] = y[18]*y[16]*(p[26]*sin(y[19] - y[17]) + p[25]*cos(y[19] - y[17]))
jac_trap[64,63] = 2*y[18]*(-p[29] - p[11] - p[26] - p[30]/2 - p[12]/2 - p[27]/2) + y[20]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[6]*(p[11]*cos(y[19] - y[7]) - p[10]*sin(y[19] - y[7])) + y[16]*(p[26]*cos(y[19] - y[17]) - p[25]*sin(y[19] - y[17]))
jac_trap[64,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[18]*y[6]*(-p[11]*sin(y[19] - y[7]) - p[10]*cos(y[19] - y[7])) + y[18]*y[16]*(-p[26]*sin(y[19] - y[17]) - p[25]*cos(y[19] - y[17]))
jac_trap[64,65] = y[18]*(p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21]))
jac_trap[64,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_trap[65,49] = y[20]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_trap[65,50] = y[20]*y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_trap[65,63] = y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21]))
jac_trap[65,64] = y[18]*y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_trap[65,65] = y[18]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + 2*y[20]*(p[28] + p[7]) + y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_trap[65,66] = y[18]*y[20]*(-p[29]*cos(y[19] - y[21]) - p[28]*sin(y[19] - y[21])) + y[20]*y[4]*(-p[8]*cos(y[21] - y[5]) + p[7]*sin(y[21] - y[5]))
jac_trap[66,49] = y[20]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_trap[66,50] = y[20]*y[4]*(p[8]*sin(y[21] - y[5]) + p[7]*cos(y[21] - y[5]))
jac_trap[66,63] = y[20]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21]))
jac_trap[66,64] = y[18]*y[20]*(-p[29]*sin(y[19] - y[21]) + p[28]*cos(y[19] - y[21]))
jac_trap[66,65] = y[18]*(p[29]*cos(y[19] - y[21]) + p[28]*sin(y[19] - y[21])) + 2*y[20]*(-p[29] - p[8] - p[30]/2 - p[9]/2) + y[4]*(p[8]*cos(y[21] - y[5]) - p[7]*sin(y[21] - y[5]))
jac_trap[66,66] = y[18]*y[20]*(p[29]*sin(y[19] - y[21]) - p[28]*cos(y[19] - y[21])) + y[20]*y[4]*(-p[8]*sin(y[21] - y[5]) - p[7]*cos(y[21] - y[5]))
jac_trap[67,0] = -y[0]*sin(x[0] - y[1])
jac_trap[67,45] = cos(x[0] - y[1])
jac_trap[67,46] = y[0]*sin(x[0] - y[1])
jac_trap[68,0] = y[0]*cos(x[0] - y[1])
jac_trap[68,45] = sin(x[0] - y[1])
jac_trap[68,46] = -y[0]*cos(x[0] - y[1])
jac_trap[69,0] = y[0]*y[22]*cos(x[0] - y[1]) - y[0]*y[23]*sin(x[0] - y[1])
jac_trap[69,45] = y[22]*sin(x[0] - y[1]) + y[23]*cos(x[0] - y[1])
jac_trap[69,46] = -y[0]*y[22]*cos(x[0] - y[1]) + y[0]*y[23]*sin(x[0] - y[1])
jac_trap[69,67] = y[0]*sin(x[0] - y[1])
jac_trap[69,68] = y[0]*cos(x[0] - y[1])
jac_trap[70,0] = -y[0]*y[22]*sin(x[0] - y[1]) - y[0]*y[23]*cos(x[0] - y[1])
jac_trap[70,45] = y[22]*cos(x[0] - y[1]) - y[23]*sin(x[0] - y[1])
jac_trap[70,46] = y[0]*y[22]*sin(x[0] - y[1]) + y[0]*y[23]*cos(x[0] - y[1])
jac_trap[70,67] = y[0]*cos(x[0] - y[1])
jac_trap[70,68] = -y[0]*sin(x[0] - y[1])
jac_trap[71,4] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (-p[55], True)]))
jac_trap[71,5] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[56], True)]))
jac_trap[71,75] = Piecewise(np.array([(0, (p[58] > p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5]) | (p[59] < p[55]*(-x[4] + y[30] + u[22]) + p[56]*x[5])), (p[55], True)]))
jac_trap[75,10] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*(-p[68]/p[69] + 1), True)]))
jac_trap[75,74] = Piecewise(np.array([(0, (p[71] < p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10])) | (p[71] < -p[70]*(p[68]*(-x[10] + y[29])/p[69] + x[10]))), (p[70]*p[68]/p[69], True)]))
jac_trap[76,11] = -y[2]*sin(x[11] - y[3])
jac_trap[76,47] = cos(x[11] - y[3])
jac_trap[76,48] = y[2]*sin(x[11] - y[3])
jac_trap[77,11] = y[2]*cos(x[11] - y[3])
jac_trap[77,47] = sin(x[11] - y[3])
jac_trap[77,48] = -y[2]*cos(x[11] - y[3])
jac_trap[78,11] = y[2]*y[31]*cos(x[11] - y[3]) - y[2]*y[32]*sin(x[11] - y[3])
jac_trap[78,47] = y[31]*sin(x[11] - y[3]) + y[32]*cos(x[11] - y[3])
jac_trap[78,48] = -y[2]*y[31]*cos(x[11] - y[3]) + y[2]*y[32]*sin(x[11] - y[3])
jac_trap[78,76] = y[2]*sin(x[11] - y[3])
jac_trap[78,77] = y[2]*cos(x[11] - y[3])
jac_trap[79,11] = -y[2]*y[31]*sin(x[11] - y[3]) - y[2]*y[32]*cos(x[11] - y[3])
jac_trap[79,47] = y[31]*cos(x[11] - y[3]) - y[32]*sin(x[11] - y[3])
jac_trap[79,48] = y[2]*y[31]*sin(x[11] - y[3]) + y[2]*y[32]*cos(x[11] - y[3])
jac_trap[79,76] = y[2]*cos(x[11] - y[3])
jac_trap[79,77] = -y[2]*sin(x[11] - y[3])
jac_trap[80,15] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (-p[85], True)]))
jac_trap[80,16] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[86], True)]))
jac_trap[80,84] = Piecewise(np.array([(0, (p[88] > p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16]) | (p[89] < p[85]*(-x[15] + y[39] + u[26]) + p[86]*x[16])), (p[85], True)]))
jac_trap[84,21] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*(-p[98]/p[99] + 1), True)]))
jac_trap[84,83] = Piecewise(np.array([(0, (p[101] < p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21])) | (p[101] < -p[100]*(p[98]*(-x[21] + y[38])/p[99] + x[21]))), (p[100]*p[98]/p[99], True)]))
jac_trap[85,22] = -y[4]*sin(x[22] - y[5])
jac_trap[85,49] = cos(x[22] - y[5])
jac_trap[85,50] = y[4]*sin(x[22] - y[5])
jac_trap[86,22] = y[4]*cos(x[22] - y[5])
jac_trap[86,49] = sin(x[22] - y[5])
jac_trap[86,50] = -y[4]*cos(x[22] - y[5])
jac_trap[87,22] = y[4]*y[40]*cos(x[22] - y[5]) - y[4]*y[41]*sin(x[22] - y[5])
jac_trap[87,49] = y[40]*sin(x[22] - y[5]) + y[41]*cos(x[22] - y[5])
jac_trap[87,50] = -y[4]*y[40]*cos(x[22] - y[5]) + y[4]*y[41]*sin(x[22] - y[5])
jac_trap[87,85] = y[4]*sin(x[22] - y[5])
jac_trap[87,86] = y[4]*cos(x[22] - y[5])
jac_trap[88,22] = -y[4]*y[40]*sin(x[22] - y[5]) - y[4]*y[41]*cos(x[22] - y[5])
jac_trap[88,49] = y[40]*cos(x[22] - y[5]) - y[41]*sin(x[22] - y[5])
jac_trap[88,50] = y[4]*y[40]*sin(x[22] - y[5]) + y[4]*y[41]*cos(x[22] - y[5])
jac_trap[88,85] = y[4]*cos(x[22] - y[5])
jac_trap[88,86] = -y[4]*sin(x[22] - y[5])
jac_trap[89,26] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (-p[115], True)]))
jac_trap[89,27] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[116], True)]))
jac_trap[89,93] = Piecewise(np.array([(0, (p[118] > p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27]) | (p[119] < p[115]*(-x[26] + y[48] + u[30]) + p[116]*x[27])), (p[115], True)]))
jac_trap[93,32] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*(-p[128]/p[129] + 1), True)]))
jac_trap[93,92] = Piecewise(np.array([(0, (p[131] < p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32])) | (p[131] < -p[130]*(p[128]*(-x[32] + y[47])/p[129] + x[32]))), (p[130]*p[128]/p[129], True)]))
jac_trap[94,33] = -y[6]*sin(x[33] - y[7])
jac_trap[94,51] = cos(x[33] - y[7])
jac_trap[94,52] = y[6]*sin(x[33] - y[7])
jac_trap[95,33] = y[6]*cos(x[33] - y[7])
jac_trap[95,51] = sin(x[33] - y[7])
jac_trap[95,52] = -y[6]*cos(x[33] - y[7])
jac_trap[96,33] = y[6]*y[49]*cos(x[33] - y[7]) - y[6]*y[50]*sin(x[33] - y[7])
jac_trap[96,51] = y[49]*sin(x[33] - y[7]) + y[50]*cos(x[33] - y[7])
jac_trap[96,52] = -y[6]*y[49]*cos(x[33] - y[7]) + y[6]*y[50]*sin(x[33] - y[7])
jac_trap[96,94] = y[6]*sin(x[33] - y[7])
jac_trap[96,95] = y[6]*cos(x[33] - y[7])
jac_trap[97,33] = -y[6]*y[49]*sin(x[33] - y[7]) - y[6]*y[50]*cos(x[33] - y[7])
jac_trap[97,51] = y[49]*cos(x[33] - y[7]) - y[50]*sin(x[33] - y[7])
jac_trap[97,52] = y[6]*y[49]*sin(x[33] - y[7]) + y[6]*y[50]*cos(x[33] - y[7])
jac_trap[97,94] = y[6]*cos(x[33] - y[7])
jac_trap[97,95] = -y[6]*sin(x[33] - y[7])
jac_trap[98,37] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (-p[145], True)]))
jac_trap[98,38] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[146], True)]))
jac_trap[98,102] = Piecewise(np.array([(0, (p[148] > p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38]) | (p[149] < p[145]*(-x[37] + y[57] + u[34]) + p[146]*x[38])), (p[145], True)]))
jac_trap[102,43] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*(-p[158]/p[159] + 1), True)]))
jac_trap[102,101] = Piecewise(np.array([(0, (p[161] < p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43])) | (p[161] < -p[160]*(p[158]*(-x[43] + y[56])/p[159] + x[43]))), (p[160]*p[158]/p[159], True)]))
if xyup == 1:
jac_trap[0,0] = 0.5*Dt*p[53] + 1
jac_trap[0,1] = -0.5*Dt*p[43]
jac_trap[0,103] = 0.5*Dt*p[43]
jac_trap[1,1] = 0.25*p[51]*Dt/p[44] + 1
jac_trap[1,73] = -0.25*Dt/p[44]
jac_trap[1,103] = -0.25*p[51]*Dt/p[44]
jac_trap[2,2] = 0.5*Dt/p[45] + 1
jac_trap[2,67] = -0.5*Dt*(p[49] - p[47])/p[45]
jac_trap[2,71] = -0.5*Dt/p[45]
jac_trap[3,3] = 0.5*Dt/p[46] + 1
jac_trap[3,68] = -0.5*Dt*(-p[50] + p[48])/p[46]
jac_trap[4,4] = 0.5*Dt/p[57] + 1
jac_trap[4,45] = -0.5*Dt/p[57]
jac_trap[5,4] = -0.5*Dt*(p[55]*p[60] - 1)
jac_trap[5,5] = 0.5*Dt*p[56]*p[60] + 1
jac_trap[5,71] = -0.5*Dt*p[60]
jac_trap[5,75] = -0.5*Dt*(-p[55]*p[60] + 1)
jac_trap[6,6] = 0.5*Dt/p[62] + 1
jac_trap[6,72] = -0.5*Dt/p[62]
jac_trap[7,6] = -0.5*Dt/p[64]
jac_trap[7,7] = 0.5*Dt/p[64] + 1
jac_trap[8,8] = 5.0e-7*Dt + 1
jac_trap[8,69] = 0.5*Dt*p[65]
jac_trap[9,1] = -0.5*Dt/p[67]
jac_trap[9,9] = 0.5*Dt/p[67] + 1
jac_trap[10,10] = 0.5*Dt/p[69] + 1
jac_trap[10,74] = -0.5*Dt/p[69]
jac_trap[11,11] = 0.5*Dt*p[83] + 1
jac_trap[11,12] = -0.5*Dt*p[73]
jac_trap[11,103] = 0.5*Dt*p[73]
jac_trap[12,12] = 0.25*p[81]*Dt/p[74] + 1
jac_trap[12,82] = -0.25*Dt/p[74]
jac_trap[12,103] = -0.25*p[81]*Dt/p[74]
jac_trap[13,13] = 0.5*Dt/p[75] + 1
jac_trap[13,76] = -0.5*Dt*(p[79] - p[77])/p[75]
jac_trap[13,80] = -0.5*Dt/p[75]
jac_trap[14,14] = 0.5*Dt/p[76] + 1
jac_trap[14,77] = -0.5*Dt*(-p[80] + p[78])/p[76]
jac_trap[15,15] = 0.5*Dt/p[87] + 1
jac_trap[15,47] = -0.5*Dt/p[87]
jac_trap[16,15] = -0.5*Dt*(p[85]*p[90] - 1)
jac_trap[16,16] = 0.5*Dt*p[86]*p[90] + 1
jac_trap[16,80] = -0.5*Dt*p[90]
jac_trap[16,84] = -0.5*Dt*(-p[85]*p[90] + 1)
jac_trap[17,17] = 0.5*Dt/p[92] + 1
jac_trap[17,81] = -0.5*Dt/p[92]
jac_trap[18,17] = -0.5*Dt/p[94]
jac_trap[18,18] = 0.5*Dt/p[94] + 1
jac_trap[19,19] = 5.0e-7*Dt + 1
jac_trap[19,78] = 0.5*Dt*p[95]
jac_trap[20,12] = -0.5*Dt/p[97]
jac_trap[20,20] = 0.5*Dt/p[97] + 1
jac_trap[21,21] = 0.5*Dt/p[99] + 1
jac_trap[21,83] = -0.5*Dt/p[99]
jac_trap[22,22] = 0.5*Dt*p[113] + 1
jac_trap[22,23] = -0.5*Dt*p[103]
jac_trap[22,103] = 0.5*Dt*p[103]
jac_trap[23,23] = 0.25*p[111]*Dt/p[104] + 1
jac_trap[23,91] = -0.25*Dt/p[104]
jac_trap[23,103] = -0.25*p[111]*Dt/p[104]
jac_trap[24,24] = 0.5*Dt/p[105] + 1
jac_trap[24,85] = -0.5*Dt*(p[109] - p[107])/p[105]
jac_trap[24,89] = -0.5*Dt/p[105]
jac_trap[25,25] = 0.5*Dt/p[106] + 1
jac_trap[25,86] = -0.5*Dt*(-p[110] + p[108])/p[106]
jac_trap[26,26] = 0.5*Dt/p[117] + 1
jac_trap[26,49] = -0.5*Dt/p[117]
jac_trap[27,26] = -0.5*Dt*(p[115]*p[120] - 1)
jac_trap[27,27] = 0.5*Dt*p[116]*p[120] + 1
jac_trap[27,89] = -0.5*Dt*p[120]
jac_trap[27,93] = -0.5*Dt*(-p[115]*p[120] + 1)
jac_trap[28,28] = 0.5*Dt/p[122] + 1
jac_trap[28,90] = -0.5*Dt/p[122]
jac_trap[29,28] = -0.5*Dt/p[124]
jac_trap[29,29] = 0.5*Dt/p[124] + 1
jac_trap[30,30] = 5.0e-7*Dt + 1
jac_trap[30,87] = 0.5*Dt*p[125]
jac_trap[31,23] = -0.5*Dt/p[127]
jac_trap[31,31] = 0.5*Dt/p[127] + 1
jac_trap[32,32] = 0.5*Dt/p[129] + 1
jac_trap[32,92] = -0.5*Dt/p[129]
jac_trap[33,33] = 0.5*Dt*p[143] + 1
jac_trap[33,34] = -0.5*Dt*p[133]
jac_trap[33,103] = 0.5*Dt*p[133]
jac_trap[34,34] = 0.25*p[141]*Dt/p[134] + 1
jac_trap[34,100] = -0.25*Dt/p[134]
jac_trap[34,103] = -0.25*p[141]*Dt/p[134]
jac_trap[35,35] = 0.5*Dt/p[135] + 1
jac_trap[35,94] = -0.5*Dt*(p[139] - p[137])/p[135]
jac_trap[35,98] = -0.5*Dt/p[135]
jac_trap[36,36] = 0.5*Dt/p[136] + 1
jac_trap[36,95] = -0.5*Dt*(-p[140] + p[138])/p[136]
jac_trap[37,37] = 0.5*Dt/p[147] + 1
jac_trap[37,51] = -0.5*Dt/p[147]
jac_trap[38,37] = -0.5*Dt*(p[145]*p[150] - 1)
jac_trap[38,38] = 0.5*Dt*p[146]*p[150] + 1
jac_trap[38,98] = -0.5*Dt*p[150]
jac_trap[38,102] = -0.5*Dt*(-p[145]*p[150] + 1)
jac_trap[39,39] = 0.5*Dt/p[152] + 1
jac_trap[39,99] = -0.5*Dt/p[152]
jac_trap[40,39] = -0.5*Dt/p[154]
jac_trap[40,40] = 0.5*Dt/p[154] + 1
jac_trap[41,41] = 5.0e-7*Dt + 1
jac_trap[41,96] = 0.5*Dt*p[155]
jac_trap[42,34] = -0.5*Dt/p[157]
jac_trap[42,42] = 0.5*Dt/p[157] + 1
jac_trap[43,43] = 0.5*Dt/p[159] + 1
jac_trap[43,101] = -0.5*Dt/p[159]
jac_trap[44,44] = 1
jac_trap[44,103] = 0.5*Dt
jac_trap[45,69] = -p[42]/p[0]
jac_trap[46,70] = -p[42]/p[0]
jac_trap[47,78] = -p[72]/p[0]
jac_trap[48,79] = -p[72]/p[0]
jac_trap[49,87] = -p[102]/p[0]
jac_trap[50,88] = -p[102]/p[0]
jac_trap[51,96] = -p[132]/p[0]
jac_trap[52,97] = -p[132]/p[0]
jac_trap[67,2] = -1
jac_trap[67,67] = p[49]
jac_trap[67,68] = p[52]
jac_trap[68,3] = -1
jac_trap[68,67] = p[52]
jac_trap[68,68] = -p[50]
jac_trap[69,69] = -1
jac_trap[70,70] = -1
jac_trap[71,71] = -1
jac_trap[72,1] = -1/p[61]
jac_trap[72,8] = 1
jac_trap[72,72] = -1
jac_trap[72,104] = p[54]
jac_trap[73,6] = p[63]/p[64]
jac_trap[73,7] = -p[63]/p[64] + 1
jac_trap[73,73] = -1
jac_trap[74,1] = 1
jac_trap[74,9] = -1
jac_trap[74,74] = -1
jac_trap[75,75] = -1
jac_trap[76,13] = -1
jac_trap[76,76] = p[79]
jac_trap[76,77] = p[82]
jac_trap[77,14] = -1
jac_trap[77,76] = p[82]
jac_trap[77,77] = -p[80]
jac_trap[78,78] = -1
jac_trap[79,79] = -1
jac_trap[80,80] = -1
jac_trap[81,12] = -1/p[91]
jac_trap[81,19] = 1
jac_trap[81,81] = -1
jac_trap[81,104] = p[84]
jac_trap[82,17] = p[93]/p[94]
jac_trap[82,18] = -p[93]/p[94] + 1
jac_trap[82,82] = -1
jac_trap[83,12] = 1
jac_trap[83,20] = -1
jac_trap[83,83] = -1
jac_trap[84,84] = -1
jac_trap[85,24] = -1
jac_trap[85,85] = p[109]
jac_trap[85,86] = p[112]
jac_trap[86,25] = -1
jac_trap[86,85] = p[112]
jac_trap[86,86] = -p[110]
jac_trap[87,87] = -1
jac_trap[88,88] = -1
jac_trap[89,89] = -1
jac_trap[90,23] = -1/p[121]
jac_trap[90,30] = 1
jac_trap[90,90] = -1
jac_trap[90,104] = p[114]
jac_trap[91,28] = p[123]/p[124]
jac_trap[91,29] = -p[123]/p[124] + 1
jac_trap[91,91] = -1
jac_trap[92,23] = 1
jac_trap[92,31] = -1
jac_trap[92,92] = -1
jac_trap[93,93] = -1
jac_trap[94,35] = -1
jac_trap[94,94] = p[139]
jac_trap[94,95] = p[142]
jac_trap[95,36] = -1
jac_trap[95,94] = p[142]
jac_trap[95,95] = -p[140]
jac_trap[96,96] = -1
jac_trap[97,97] = -1
jac_trap[98,98] = -1
jac_trap[99,34] = -1/p[151]
jac_trap[99,41] = 1
jac_trap[99,99] = -1
jac_trap[99,104] = p[144]
jac_trap[100,39] = p[153]/p[154]
jac_trap[100,40] = -p[153]/p[154] + 1
jac_trap[100,100] = -1
jac_trap[101,34] = 1
jac_trap[101,42] = -1
jac_trap[101,101] = -1
jac_trap[102,102] = -1
jac_trap[103,1] = p[44]*p[42]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_trap[103,12] = p[74]*p[72]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_trap[103,23] = p[104]*p[102]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_trap[103,34] = p[134]*p[132]/(p[44]*p[42] + p[74]*p[72] + p[104]*p[102] + p[134]*p[132])
jac_trap[103,103] = -1
jac_trap[104,44] = p[163]
jac_trap[104,103] = -p[162]
jac_trap[104,104] = -1
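# The diagonal entries above of the form `0.5*Dt*(...) + 1` are the Jacobian of an
# implicit trapezoidal step, I - (Dt/2)*(d f/d x), evaluated for this model. A minimal
# scalar sketch of that pattern (illustrative only, not part of the generated model),
# assuming a state equation f(x) = -a*x:

```python
def jac_trap_scalar(a, Dt):
    # Trapezoidal step residual for dx/dt = -a*x:
    #   F(x_new) = x_new - x_old - 0.5*Dt*(-a*x_old - a*x_new)
    # so d F / d x_new = 1 + 0.5*Dt*a, the same "0.5*Dt*(...) + 1"
    # shape as the generated jac_trap diagonal entries.
    return 1.0 + 0.5 * Dt * a

print(jac_trap_scalar(2.0, 0.1))  # 1.1
```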
@numba.njit(cache=True)
def h_run_eval(h_run,x,y,u,p,xyup = 0):
    h_run[0] = y[0]
    h_run[1] = y[2]
    h_run[2] = y[4]
    h_run[3] = y[6]
    h_run[4] = y[8]
    h_run[5] = y[10]
    h_run[6] = y[12]
    h_run[7] = y[14]
    h_run[8] = y[16]
    h_run[9] = y[18]
    h_run[10] = y[20]
    h_run[11] = y[22]*(p[52]*y[22] + y[0]*sin(x[0] - y[1])) + y[23]*(p[52]*y[23] + y[0]*cos(x[0] - y[1]))
    h_run[12] = y[31]*(p[82]*y[31] + y[2]*sin(x[11] - y[3])) + y[32]*(p[82]*y[32] + y[2]*cos(x[11] - y[3]))
    h_run[13] = y[40]*(p[112]*y[40] + y[4]*sin(x[22] - y[5])) + y[41]*(p[112]*y[41] + y[4]*cos(x[22] - y[5]))
    h_run[14] = y[49]*(p[142]*y[49] + y[6]*sin(x[33] - y[7])) + y[50]*(p[142]*y[50] + y[6]*cos(x[33] - y[7]))
@numba.njit(cache=True)
def Piecewise(arg):
    out = arg[0][1]
    N = len(arg)
    for it in range(N-1,-1,-1):
        if arg[it][1]: out = arg[it][0]
    return out
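# The Piecewise helper above emulates SymPy's Piecewise semantics under Numba: it
# scans the (value, condition) pairs from last to first, so the value returned is
# that of the FIRST pair whose condition holds. A plain-Python sketch of the same
# selection rule (without the numba decoration, names illustrative):

```python
def piecewise_first_true(pairs):
    # Mirror of the generated Piecewise helper: iterate from the last
    # pair back to the first so the earliest True condition wins.
    out = pairs[0][1]
    for value, cond in reversed(pairs):
        if cond:
            out = value
    return out

# The first pair whose condition is True is selected.
print(piecewise_first_true([(0.0, False), (-1.5, True), (2.0, True)]))  # -1.5
```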
@numba.njit(cache=True)
def ITE(arg):
    out = arg[0][1]
    N = len(arg)
    for it in range(N-1,-1,-1):
        if arg[it][1]: out = arg[it][0]
    return out
@numba.njit(cache=True)
def Abs(x):
    return np.abs(x)
@numba.njit(cache=True)
def ini_dae_jacobian_numba(struct,x):
    N_x = struct[0].N_x
    N_y = struct[0].N_y
    struct[0].x[:,0] = x[0:N_x]
    struct[0].y_ini[:,0] = x[N_x:(N_x+N_y)]
    ini(struct,10)
    ini(struct,11)
    for row,col in zip(struct[0].Fx_ini_rows,struct[0].Fx_ini_cols):
        struct[0].Ac_ini[row,col] = struct[0].Fx_ini[row,col]
    for row,col in zip(struct[0].Fy_ini_rows,struct[0].Fy_ini_cols):
        struct[0].Ac_ini[row,col+N_x] = struct[0].Fy_ini[row,col]
    for row,col in zip(struct[0].Gx_ini_rows,struct[0].Gx_ini_cols):
        struct[0].Ac_ini[row+N_x,col] = struct[0].Gx_ini[row,col]
    for row,col in zip(struct[0].Gy_ini_rows,struct[0].Gy_ini_cols):
        struct[0].Ac_ini[row+N_x,col+N_x] = struct[0].Gy_ini[row,col]
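# ini_dae_jacobian_numba assembles the full DAE Jacobian from four sparse blocks,
# with algebraic rows/columns offset by N_x: Ac = [[Fx, Fy], [Gx, Gy]]. A minimal
# dense sketch of that block layout (1x1 blocks, values illustrative):

```python
import numpy as np

N_x, N_y = 1, 1
Fx, Fy = np.array([[2.0]]), np.array([[1.0]])
Gx, Gy = np.array([[3.0]]), np.array([[4.0]])

Ac = np.zeros((N_x + N_y, N_x + N_y))
# Same offsets as the generated loops: algebraic rows/cols shift by N_x.
Ac[:N_x, :N_x] = Fx
Ac[:N_x, N_x:] = Fy
Ac[N_x:, :N_x] = Gx
Ac[N_x:, N_x:] = Gy
print(Ac)
```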
@numba.njit(cache=True)
def ini_dae_problem(struct,x):
    N_x = struct[0].N_x
    N_y = struct[0].N_y
    struct[0].x[:,0] = x[0:N_x]
    struct[0].y_ini[:,0] = x[N_x:(N_x+N_y)]
    ini(struct,2)
    ini(struct,3)
    struct[0].fg[:N_x,:] = struct[0].f[:]
    struct[0].fg[N_x:,:] = struct[0].g[:]
@numba.njit(cache=True)
def ssate(struct,xy):
    for it in range(100):
        ini_dae_jacobian_numba(struct,xy[:,0])
        ini_dae_problem(struct,xy[:,0])
        xy[:] += np.linalg.solve(struct[0].Ac_ini,-struct[0].fg)
        if np.max(np.abs(struct[0].fg[:,0]))<1e-8: break
    N_x = struct[0].N_x
    struct[0].x[:,0] = xy[:N_x,0]
    struct[0].y_ini[:,0] = xy[N_x:,0]
    return xy,it
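# ssate above is an undamped Newton iteration on the stacked residual fg with
# Jacobian Ac_ini: xy += solve(J, -F), repeated until the residual is small. A
# self-contained sketch of the same update rule on a toy 2-equation system
# (function and variable names are illustrative, not from the generated model):

```python
import numpy as np

def newton_solve(residual, jacobian, xy, tol=1e-8, max_it=100):
    # Same scheme as ssate: solve J * dxy = -F and accumulate into xy.
    for it in range(max_it):
        F = residual(xy)
        xy = xy + np.linalg.solve(jacobian(xy), -F)
        if np.max(np.abs(residual(xy))) < tol:
            break
    return xy, it

# Toy system: x0**2 - 4 = 0 and x1 - x0 = 0, whose solution is (2, 2).
res = lambda v: np.array([v[0]**2 - 4.0, v[1] - v[0]])
jac = lambda v: np.array([[2.0*v[0], 0.0], [-1.0, 1.0]])
sol, its = newton_solve(res, jac, np.array([3.0, 0.0]))
print(np.round(sol, 6))  # [2. 2.]
```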
def nonzeros():
Fx_ini_rows = [0, 0, 1, 1, 2, 3, 4, 5, 5, 6, 7, 7, 8, 9, 9, 10, 11, 11, 12, 12, 13, 14, 15, 16, 16, 17, 18, 18, 19, 20, 20, 21, 22, 22, 23, 23, 24, 25, 26, 27, 27, 28, 29, 29, 30, 31, 31, 32, 33, 33, 34, 34, 35, 36, 37, 38, 38, 39, 40, 40, 41, 42, 42, 43]
Fx_ini_cols = [0, 1, 0, 1, 2, 3, 4, 4, 5, 6, 6, 7, 8, 1, 9, 10, 11, 12, 11, 12, 13, 14, 15, 15, 16, 17, 17, 18, 19, 12, 20, 21, 22, 23, 22, 23, 24, 25, 26, 26, 27, 28, 28, 29, 30, 23, 31, 32, 33, 34, 33, 34, 35, 36, 37, 37, 38, 39, 39, 40, 41, 34, 42, 43]
Fy_ini_rows = [0, 1, 1, 1, 1, 1, 1, 2, 2, 3, 4, 5, 5, 6, 8, 10, 11, 12, 12, 12, 12, 12, 12, 13, 13, 14, 15, 16, 16, 17, 19, 21, 22, 23, 23, 23, 23, 23, 23, 24, 24, 25, 26, 27, 27, 28, 30, 32, 33, 34, 34, 34, 34, 34, 34, 35, 35, 36, 37, 38, 38, 39, 41, 43, 44]
Fy_ini_cols = [58, 0, 1, 22, 23, 28, 58, 22, 26, 23, 0, 26, 30, 27, 24, 29, 58, 2, 3, 31, 32, 37, 58, 31, 35, 32, 2, 35, 39, 36, 33, 38, 58, 4, 5, 40, 41, 46, 58, 40, 44, 41, 4, 44, 48, 45, 42, 47, 58, 6, 7, 49, 50, 55, 58, 49, 53, 50, 6, 53, 57, 54, 51, 56, 58]
Gx_ini_rows = [22, 22, 23, 23, 24, 25, 26, 26, 27, 27, 28, 28, 29, 29, 30, 31, 31, 32, 32, 33, 34, 35, 35, 36, 36, 37, 37, 38, 38, 39, 40, 40, 41, 41, 42, 43, 44, 44, 45, 45, 46, 46, 47, 47, 48, 49, 49, 50, 50, 51, 52, 53, 53, 54, 54, 55, 55, 56, 56, 57, 58, 58, 58, 58, 59]
Gx_ini_cols = [0, 2, 0, 3, 0, 0, 4, 5, 1, 8, 6, 7, 1, 9, 10, 11, 13, 11, 14, 11, 11, 15, 16, 12, 19, 17, 18, 12, 20, 21, 22, 24, 22, 25, 22, 22, 26, 27, 23, 30, 28, 29, 23, 31, 32, 33, 35, 33, 36, 33, 33, 37, 38, 34, 41, 39, 40, 34, 42, 43, 1, 12, 23, 34, 44]
Gy_ini_rows = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12, 12, 12, 12, 12, 13, 13, 13, 13, 13, 13, 14, 14, 14, 14, 14, 14, 15, 15, 15, 15, 15, 15, 16, 16, 16, 16, 16, 16, 17, 17, 17, 17, 17, 17, 18, 18, 18, 18, 18, 18, 18, 18, 19, 19, 19, 19, 19, 19, 19, 19, 20, 20, 20, 20, 20, 20, 21, 21, 21, 21, 21, 21, 22, 22, 22, 22, 23, 23, 23, 23, 24, 24, 24, 24, 24, 25, 25, 25, 25, 25, 26, 26, 27, 27, 28, 29, 30, 30, 31, 31, 31, 31, 32, 32, 32, 32, 33, 33, 33, 33, 33, 34, 34, 34, 34, 34, 35, 35, 36, 36, 37, 38, 39, 39, 40, 40, 40, 40, 41, 41, 41, 41, 42, 42, 42, 42, 42, 43, 43, 43, 43, 43, 44, 44, 45, 45, 46, 47, 48, 48, 49, 49, 49, 49, 50, 50, 50, 50, 51, 51, 51, 51, 51, 52, 52, 52, 52, 52, 53, 53, 54, 54, 55, 56, 57, 57, 58, 59, 59]
Gy_ini_cols = [0, 1, 8, 9, 24, 0, 1, 8, 9, 25, 2, 3, 10, 11, 33, 2, 3, 10, 11, 34, 4, 5, 20, 21, 42, 4, 5, 20, 21, 43, 6, 7, 18, 19, 51, 6, 7, 18, 19, 52, 0, 1, 8, 9, 10, 11, 0, 1, 8, 9, 10, 11, 2, 3, 8, 9, 10, 11, 12, 13, 2, 3, 8, 9, 10, 11, 12, 13, 10, 11, 12, 13, 14, 15, 10, 11, 12, 13, 14, 15, 12, 13, 14, 15, 16, 17, 12, 13, 14, 15, 16, 17, 14, 15, 16, 17, 18, 19, 14, 15, 16, 17, 18, 19, 6, 7, 16, 17, 18, 19, 20, 21, 6, 7, 16, 17, 18, 19, 20, 21, 4, 5, 18, 19, 20, 21, 4, 5, 18, 19, 20, 21, 0, 1, 22, 23, 0, 1, 22, 23, 0, 1, 22, 23, 24, 0, 1, 22, 23, 25, 26, 30, 27, 59, 28, 29, 29, 30, 2, 3, 31, 32, 2, 3, 31, 32, 2, 3, 31, 32, 33, 2, 3, 31, 32, 34, 35, 39, 36, 59, 37, 38, 38, 39, 4, 5, 40, 41, 4, 5, 40, 41, 4, 5, 40, 41, 42, 4, 5, 40, 41, 43, 44, 48, 45, 59, 46, 47, 47, 48, 6, 7, 49, 50, 6, 7, 49, 50, 6, 7, 49, 50, 51, 6, 7, 49, 50, 52, 53, 57, 54, 59, 55, 56, 56, 57, 58, 58, 59]
return Fx_ini_rows,Fx_ini_cols,Fy_ini_rows,Fy_ini_cols,Gx_ini_rows,Gx_ini_cols,Gy_ini_rows,Gy_ini_cols
304d3d25be64900e8c0fc98391960545653f88a5 | 59 | py | Python | lol_dto/classes/__init__.py | mrtolkien/lol_game_dto | c23d5bfac6028885b7bcf11028c43b66a64dbaec | ["MIT"] | 22 | 2020-06-06T17:02:16.000Z | 2022-02-16T16:06:47.000Z
import lol_dto.classes.game
import lol_dto.classes.sources
# File: dvc/utils/compat.py (repo: zjj2wry/dvc, license: Apache-2.0)
"""Handle import compatibility between Python 2 and Python 3"""
try:
    from urlparse import urlparse, urljoin  # noqa: F401
except ImportError:
    from urllib.parse import urlparse, urljoin  # noqa: F401
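The compat shim above can be exercised directly on Python 3, where the old `urlparse` module is gone and the `ImportError` branch is taken; a quick usage check:

```python
# On Python 3 the first import fails, so the urllib.parse names are used
try:
    from urlparse import urlparse, urljoin  # Python 2 location
except ImportError:
    from urllib.parse import urlparse, urljoin  # Python 3 location

print(urlparse("https://example.com/a/b").netloc)  # example.com
print(urljoin("https://example.com/a/", "c"))      # https://example.com/a/c
```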
# File: src/main/resources/project-templates/aws_web_server_blocks/open_readme.py (repo: WinterAlexander/Ada-IntelliJ, license: Apache-2.0)
import GPS
class Project_Template_Object:
    def __init__(self):
        pass

    def on_apply(self):
        GPS.EditorBuffer.get(GPS.File("README"))


def get_object():
    return Project_Template_Object()
# File: src/models/BaseGAN.py (repo: minsoo9506/catchMinor, license: MIT)
from typing import Any, Dict, List
import torch
import torch.nn as nn
class BaseGenerator(nn.Module):
    def __init__(
        self,
        n_layers: int = 3,
        features_list: List[int] = [2, 4, 8, 16],
        activation_func_name: str = "ReLU",
        dropout_p: float = 0.2,
        use_batch_norm: bool = False,
    ):
        """generator

        Parameters
        ----------
        n_layers : int, optional
            _description_, by default 3
        features_list : List[int], optional
            feature_list[0] should be latent_dim,
            feature_list[-1] should be original data dim,
            by default [2, 4, 8, 16]
        activation_func_name : str, optional
            _description_, by default "ReLU"
        dropout_p : float, optional
            _description_, by default 0.2
        use_batch_norm : bool, optional
            _description_, by default False
        """
        super().__init__()
        assert n_layers + 1 == len(
            features_list
        ), "should be: n_layers + 1 == len(features_list)"
        self.latent_dim = features_list[0]
        self.n_layers = n_layers
        self.output_size = features_list[-1]
        activation_func = getattr(nn, activation_func_name, nn.ReLU)

        layers: List[Any] = []
        for in_features, out_features in zip(features_list[:-2], features_list[1:-1]):
            # fully-connected layer
            layers.append(nn.Linear(in_features, out_features))
            # batchnorm1d
            if use_batch_norm:
                layers.append(nn.BatchNorm1d(out_features))
            # activation function
            layers.append(activation_func())
            # dropout
            if dropout_p != 0:
                layers.append(nn.Dropout(dropout_p))
        # nn.Tanh() is appended separately: list.append() only takes one argument
        layers.append(
            nn.Linear(
                in_features=features_list[-2],
                out_features=features_list[-1],
            )
        )
        layers.append(nn.Tanh())
        self.model = nn.Sequential(*layers)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        generated_x = self.model(z)
        return generated_x


class BaseDiscriminator(nn.Module):
    def __init__(
        self,
        n_layers: int = 3,
        features_list: List[int] = [16, 8, 4, 1],
        activation_func_name: str = "ReLU",
        dropout_p: float = 0.2,
        use_batch_norm: bool = False,
    ):
        """discriminator

        Parameters
        ----------
        n_layers : int, optional
            _description_, by default 3
        features_list : List[int], optional
            feature_list[0] should be original data dim,
            feature_list[-1] should be 1,
            by default [16, 8, 4, 1]
        activation_func_name : str, optional
            _description_, by default "ReLU"
        dropout_p : float, optional
            _description_, by default 0.2
        use_batch_norm : bool, optional
            _description_, by default False
        """
        super().__init__()
        assert n_layers + 1 == len(
            features_list
        ), "should be: n_layers + 1 == len(features_list)"
        assert features_list[-1] == 1, "should be: feature_list[-1] == 1"
        self.n_layers = n_layers
        activation_func = getattr(nn, activation_func_name, nn.ReLU)

        layers: List[Any] = []
        for in_features, out_features in zip(features_list[:-2], features_list[1:-1]):
            # fully-connected layer
            layers.append(nn.Linear(in_features, out_features))
            # batchnorm1d
            if use_batch_norm:
                layers.append(nn.BatchNorm1d(out_features))
            # activation function
            layers.append(activation_func())
            # dropout
            if dropout_p != 0:
                layers.append(nn.Dropout(dropout_p))
        # nn.Sigmoid() is appended separately: list.append() only takes one argument
        layers.append(
            nn.Linear(
                in_features=features_list[-2],
                out_features=features_list[-1],
            )
        )
        layers.append(nn.Sigmoid())
        self.model = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        true_or_fake = self.model(x)
        return true_or_fake
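A note on how `features_list` drives layer construction in the classes above: hidden blocks pair consecutive sizes via `zip(features_list[:-2], features_list[1:-1])`, while the last pair is handled separately with the output activation. A torch-free sketch of that pairing logic (the values are the generator's defaults from the code, not new assumptions):

```python
features_list = [2, 4, 8, 16]  # latent_dim = 2, output dim = 16
n_layers = 3
assert n_layers + 1 == len(features_list)

# hidden blocks: Linear(2, 4) and Linear(4, 8), each followed by activation etc.
hidden_pairs = list(zip(features_list[:-2], features_list[1:-1]))
# final block: Linear(8, 16) followed by Tanh
final_pair = (features_list[-2], features_list[-1])

print(hidden_pairs)  # [(2, 4), (4, 8)]
print(final_pair)    # (8, 16)
```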
# File: create_files.py (repo: arame/AI-Development-Oxford-main, license: MIT)
# To add a new cell, type '# %%'
# To add a new markdown cell, type '# %% [markdown]'
# %%
from IPython import get_ipython
# %% [markdown]
# <a href="https://colab.research.google.com/github/Educat8n/AI-Development-Oxford/blob/main/boston_house_price_prediction.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
# %%
get_ipython().run_cell_magic('writefile', 'boston_house_price_prediction.py', '\n\n# Step -1 - Import Packages\nimport pandas as pd\nfrom sklearn.datasets import load_boston\nfrom sklearn import preprocessing\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.svm import SVR\nfrom sklearn.model_selection import GridSearchCV\nimport seaborn as sns\nfrom sklearn import metrics\nplt.rcParams["figure.figsize"] = (10, 10)\n\n\n\n# Step - 2 - Define the main function\ndef main():\n # Get data\n\n ### To Do Assignment: try changing the data from Boston housing to California housing dataset \n ### You can load the datasets as follows::\n ### from sklearn.datasets import fetch_california_housing\n ### housing = fetch_california_housing()\n ### Refer this link for more detatils: https://scikit-learn.org/stable/modules/generated/sklearn.datasets.fetch_california_housing.html\n boston = load_boston()\n boston_X = pd.DataFrame(boston.data, columns = boston.feature_names)\n boston_y = boston.target\n features = boston.feature_names\n \n ## Data Exploration\n print(f\'The features in dataset are: {features}\')\n #print(f\'Data description\\n {boston_X.describe()}\')\n \n #Plots\n plot_data(boston_X, boston_y, features, cor=True)\n\n ## Remove Outliers\n boston_X, boston_y = remove_outliers(boston_X,boston_y, features)\n \n X_train, y_train, X_test, y_test = preprocess(boston_X, boston_y, features)\n\n model = SVR() \n\n model = train(model, X_train, y_train)\n\n evaluate(model, X_test, y_test, bl= True)\n\n best_params = optimize_models(X_train, y_train)\n print(best_params)\n\n ## Build Best Model\n best_C= best_params[\'C\']\n best_kernel = best_params[\'kernel\']\n\n best_model = SVR(kernel = best_kernel, C= best_C)\n best_model = train(best_model, X_train, y_train)\n evaluate (best_model, X_test, y_test)\n\n\n \n\n \n# Step - 3 - Plot graphs to understand data\ndef plot_data(x_df, y_df,features, 
cor=False):\n X = x_df.values\n plt.figure(figsize=(10,10))\n plt.title("Price Distribution")\n plt.hist(y_df, bins=30)\n plt.show()\n #cols = x_df.columns()\n fig, ax = plt.subplots(1, len(features), sharey=True, figsize=(20,5))\n plt.title("Relationship between different input features and price")\n ax = ax.flatten()\n for i, col in enumerate(features):\n x = X[:,i]\n y = y_df\n ax[i].scatter(x, y, marker=\'o\')\n ax[i].set_title(col)\n ax[i].set_xlabel(col)\n ax[i].set_ylabel(\'MEDV\')\n plt.show()\n\n if cor:\n pass\n ### To Do Add the code to find and display correlation among\n ### different features\n\n\n\n\n# Step - 4 - Preprocess data\n# Step -4a : Remove outliers\ndef remove_outliers(x,y, features):\n #remove null\n x_df = x.copy(deep=True)\n x_df[\'MEDV\'] = y\n x_df.dropna(inplace=True)\n return x_df[features], x_df[\'MEDV\']\n \n \n# Step -4b : Normalize data\ndef scale_numeric(df):\n x = df.values \n scaler = preprocessing.StandardScaler()\n ### To Do Assignment instead of StandardScaler use MinMaxscaler, \n ### Also observe if scaling influences the results\n x_scaled = scaler.fit_transform(x)\n df = pd.DataFrame(x_scaled)\n return df\n\n \n\n# Step -4b : Preprocess data\ndef preprocess(x, y, features):\n x_df = x[features].copy(deep=True)\n x_df = scale_numeric(x_df)\n #print(len(x_df),len(y))\n # Split data into train, test\n X_train, X_test, y_train, y_test = train_test_split(x_df,y, test_size=0.3, random_state=1)\n return X_train, y_train, X_test, y_test\n \n \n \n \n# Step - 5 - train model \ndef train(model,X_train, y_train):\n model.fit(X_train, y_train)\n return model\n \n \n# Step - 6 - Evaluate Model\ndef evaluate(model, X_test, y_test, plot = True, print_results=True, bl=False):\n y_pred = model.predict(X_test)\n if print_results:\n if bl:\n print(\'\\n\\nBaseline Model Performance on Test Dataset:\\n\')\n else:\n print(\'\\n\\nBest Model Performance on Test Dataset:\\n\')\n print(\'R^2:\',metrics.r2_score(y_test, y_pred))\n 
print(\'MAE:\',metrics.mean_absolute_error(y_test, y_pred))\n print(\'MSE:\',metrics.mean_squared_error(y_test, y_pred))\n print(\'RMSE:\',np.sqrt(metrics.mean_squared_error(y_test, y_pred)))\n\n if plot:\n plt.scatter(y_test, y_pred)\n plt.xlabel("Prices")\n plt.ylabel("Predicted prices")\n plt.title("Prices vs Predicted prices")\n plt.show()\n return \n \n \n \n \n# Step - 7 - Improve Model\ndef optimize_models(X_train, y_train):\n ### To Do Assignment Change the model to MLP and accordiongly change Grid search params\n params = {\'kernel\':[\'linear\', \'rbf\'], \'C\':[1, 10]}\n model = SVR()\n clf = GridSearchCV(model, params)\n clf.fit(X_train, y_train)\n return (clf.best_params_)\n\n\n\n# call the main finction\nif __name__ == \'__main__\':\n main()\n \n \n ')
# %%
get_ipython().run_line_magic('run', 'boston_house_price_prediction.py')
# %%
get_ipython().run_cell_magic('writefile', 'cancer_detection.py', '\n\n# Step -1 - Import Package\nimport pandas as pd\nfrom sklearn.datasets import load_breast_cancer\nfrom sklearn import preprocessing\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.svm import SVC\nfrom sklearn.model_selection import GridSearchCV\nimport seaborn as sns\nfrom sklearn import metrics\nplt.rcParams["figure.figsize"] = (10, 10)\n\n\n\n# Step - 2 - Define the main function\ndef main():\n # Get data\n cancer_data = load_breast_cancer()\n cancer_data_X = pd.DataFrame(cancer_data.data, columns = cancer_data.feature_names)\n cancer_data_y = cancer_data.target\n features = cancer_data.feature_names\n \n vars = [\'mean radius\', \'mean texture\', \'mean area\', \'mean perimeter\', \'mean smoothness\']\n ## Data Exploration\n print(f\'The features in dataset are: {features}\')\n #print(f\'Data description\\n {cancer_data_X.describe()}\')\n \n #Plots\n plot_data(cancer_data_X, cancer_data_y, features= vars, cor=True)\n\n ## Remove Outliers\n cancer_data_X, cancer_data_y = remove_outliers(cancer_data_X,cancer_data_y, features)\n \n X_train, y_train, X_test, y_test = preprocess(cancer_data_X, cancer_data_y, features)\n\n model = SVC(random_state=6)\n\n model = train(model, X_train, y_train)\n \n baseline = evaluate(model, X_test, y_test, bl=True)\n\n best_params = optimize_models(X_train, y_train)\n print(best_params)\n\n ## Build Best Model\n best_C= best_params[\'C\']\n best_kernel = best_params[\'kernel\']\n\n best_model = SVC(kernel = best_kernel, C= best_C, random_state=6)\n best_model = train(best_model, X_train, y_train)\n evaluate (best_model, X_test, y_test)\n \n\n \n \n \n# Step - 3 - Plot graphs to understand data\ndef plot_data(x_df, y_df,features, cor=False):\n X = x_df.copy(deep=True)\n X[\'class\'] = y_df\n sns.pairplot(X, hue = \'class\', vars = [\'mean radius\', \'mean texture\', \'mean area\', \'mean 
perimeter\', \'mean smoothness\'] )\n plt.show()\n \n if cor:\n corr = X[features].corr()\n plt.figure(figsize=(10,10))\n sns.heatmap(corr, cbar=True, square= True, fmt=\'.1f\', annot=True, annot_kws={\'size\':15}, cmap=\'Greens\')\n plt.show()\n\n\n\n\n\n\n# Step - 4 - Preprocess data\n# Step -4a : Remove outliers\ndef remove_outliers(x,y, features):\n #remove null\n x_df = x.copy(deep=True)\n x_df[\'class\'] = y\n x_df.dropna(inplace=True)\n return x_df[features], x_df[\'class\']\n \n \n# Step -4b : Normalize data\ndef scale_numeric(df):\n x = df.values \n scaler = preprocessing.StandardScaler()\n x_scaled = scaler.fit_transform(x)\n df = pd.DataFrame(x_scaled)\n return df\n\n \n\n# Step -4b : Preprocess data\ndef preprocess(x, y, features):\n x_df = x[features].copy(deep=True)\n x_df = scale_numeric(x_df)\n #print(len(x_df),len(y))\n # Split data into train, test\n X_train, X_test, y_train, y_test = train_test_split(x_df,y, test_size=0.3, random_state=45)\n return X_train, y_train, X_test, y_test\n \n \n \n \n# Step - 5 - train model \ndef train(model,X_train, y_train):\n model.fit(X_train, y_train)\n return model\n \n \n# Step - 6 - Evaluate Model\ndef evaluate(model, X_test, y_test, plot = True, print_results=True, bl=False):\n y_pred = model.predict(X_test)\n cm = metrics.confusion_matrix(y_test, y_pred)\n acc = metrics.accuracy_score(y_test, y_pred)\n if print_results:\n if bl:\n print(\'\\n\\nBaseline Model Performance on Test Dataset:\\n\')\n else:\n print(\'\\n\\nBest Model Performance on Test Dataset:\\n\')\n print(\'\\nConfusion Matrix:\\n\',cm)\n print(f\'Accuracy: {acc*100}%\')\n\n if plot:\n sns.heatmap(cm, annot= True)\n plt.show()\n return \n \n \n \n \n# Step - 7 - Improve Model\ndef optimize_models(X_train, y_train):\n params = {\'kernel\':[\'rbf\'], \'C\':[1.0, 5.0, 10]}\n model = SVC(random_state=5)\n clf = GridSearchCV(model, params)\n clf.fit(X_train, y_train)\n return clf.best_params_\n\n\n# call the main finction\nif __name__ == 
\'__main__\':\n main()\n \n \n ')
# %%
get_ipython().run_line_magic('run', 'cancer_detection.py')
# %%
# File: drone_sim/sim/__init__.py (repo: Atharva-05/drone_sim, license: MIT)
from drone_sim.sim.drone import Drone
from drone_sim.sim.sensors import Sensor
from drone_sim.sim.sensors import PositionTracker
from drone_sim.sim.sensors import IMU
from drone_sim.sim.parameters import *
# File: language/migrations/0003_auto_20180519_0129.py (repo: vsanasc/sbrain, license: BSD-3-Clause)
# Generated by Django 2.0 on 2018-05-19 01:29
from django.db import migrations, models
import django.utils.timezone
class Migration(migrations.Migration):

    dependencies = [
        ('language', '0002_auto_20171212_0030'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='episode',
            name='modified_at',
        ),
        migrations.RemoveField(
            model_name='phrase',
            name='modified_at',
        ),
        migrations.RemoveField(
            model_name='phraseaudio',
            name='modified_at',
        ),
        migrations.RemoveField(
            model_name='producer',
            name='modified_at',
        ),
        migrations.RemoveField(
            model_name='relationphrase',
            name='modified_at',
        ),
        migrations.AddField(
            model_name='episode',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='generator',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True, default=django.utils.timezone.now),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='generator',
            name='status',
            field=models.SmallIntegerField(choices=[(0, 'Inactive'), (1, 'Active')], default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='generator',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='phrase',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='phraseaudio',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='producer',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='production',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True, default=django.utils.timezone.now),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='production',
            name='status',
            field=models.SmallIntegerField(choices=[(0, 'Inactive'), (1, 'Active')], default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='production',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='relationphrase',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AddField(
            model_name='season',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True, default=django.utils.timezone.now),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='season',
            name='status',
            field=models.SmallIntegerField(choices=[(0, 'Inactive'), (1, 'Active')], default=1),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='season',
            name='updated_at',
            field=models.DateTimeField(auto_now=True),
        ),
        migrations.AlterField(
            model_name='episode',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True),
        ),
        migrations.AlterField(
            model_name='phrase',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True),
        ),
        migrations.AlterField(
            model_name='phraseaudio',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True),
        ),
        migrations.AlterField(
            model_name='producer',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True),
        ),
        migrations.AlterField(
            model_name='relationphrase',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True),
        ),
    ]
# File: tests/__init__.py (repo: pjaytycy/improcflow, license: MIT)
from test_flow_basic import *
from test_automatic_conversion import *
from test_control_logic import *
from test_default_values import *
from test_python_arithmetic import *
from test_python_comparison import *
from test_python_loops import *
from test_python_subflow import *
# File: app/sdc_demo/operations.py (repo: d-vergilyush/devdays2020-sdc-demo-backend, license: MIT)
from app.sdk import sdk
from aiohttp import web
@sdk.operation(["GET"], ["demo", "$create-patient"], public=True)
async def create_patient(operation, request):
    patient = sdk.client.resource("Patient", **{
        "id": "demo-patient",
        "name": [{
            "text": "Jack Black",
            "given": ["Jack"],
            "family": "Black"
        }]})
    await patient.save()
    return web.json_response({"resource": patient}, status=200)
# File: gym_softrobot/utils/linalg.py (repo: skim0119/gym-softrobot, license: MIT)
import numpy as np
def do_normalization(data, limit):
    return (data - limit[0]) / (limit[1] - limit[0])
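A quick usage example of `do_normalization` (shown with plain floats for simplicity; the same formula applies elementwise to numpy arrays): it maps `limit[0]` to 0.0 and `limit[1]` to 1.0.

```python
def do_normalization(data, limit):
    return (data - limit[0]) / (limit[1] - limit[0])

print(do_normalization(0.0, (0.0, 10.0)))   # 0.0
print(do_normalization(5.0, (0.0, 10.0)))   # 0.5
print(do_normalization(10.0, (0.0, 10.0)))  # 1.0
```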
# File: test/programytest/rdf/test_loading.py (repo: cdoebler1/AIML2, license: MIT)
import os
import os.path
import unittest
from unittest.mock import patch
from programy.rdf.collection import RDFCollection
from programy.storage.factory import StorageFactory
from programy.storage.stores.file.config import FileStorageConfiguration
from programy.storage.stores.file.config import FileStoreConfiguration
from programy.storage.stores.file.engine import FileStorageEngine
class RDFCollectionLoadingTests(unittest.TestCase):
def test_load_from_file(self):
config = FileStorageConfiguration()
config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
factory = StorageFactory()
storage_engine = FileStorageEngine(config)
factory._storage_engines[StorageFactory.RDF] = storage_engine
factory._store_to_engine_map[StorageFactory.RDF] = storage_engine
storage_engine.initialise()
collection = RDFCollection()
self.assertIsNotNone(collection)
self.assertTrue(collection.load(factory))
self.assertTrue(collection.has_subject("TEST1"))
self.assertTrue(collection.has_predicate("TEST1", "HASPURPOSE"))
self.assertTrue(collection.has_object("TEST1", "HASPURPOSE", "to test"))
def test_load_no_engine(self):
config = FileStorageConfiguration()
config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
factory = StorageFactory()
storage_engine = FileStorageEngine(config)
storage_engine.initialise()
collection = RDFCollection()
self.assertIsNotNone(collection)
self.assertFalse(collection.load(factory))
def test_reload_from_file(self):
config = FileStorageConfiguration()
config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
factory = StorageFactory()
storage_engine = FileStorageEngine(config)
factory._storage_engines[StorageFactory.RDF] = storage_engine
factory._store_to_engine_map[StorageFactory.RDF] = storage_engine
storage_engine.initialise()
collection = RDFCollection()
self.assertIsNotNone(collection)
self.assertTrue(collection.load(factory))
self.assertTrue(collection.has_subject("TEST1"))
self.assertTrue(collection.has_predicate("TEST1", "HASPURPOSE"))
self.assertTrue(collection.has_object("TEST1", "HASPURPOSE", "to test"))
collection.delete_entity("TEST1", "HASPURPOSE", "to test")
self.assertFalse(collection.has_object("TEST1", "HASPURPOSE", "to test"))
self.assertTrue(collection.reload(factory, "TESTDATA"))
self.assertTrue(collection.has_subject("TEST1"))
self.assertTrue(collection.has_predicate("TEST1", "HASPURPOSE"))
self.assertTrue(collection.has_object("TEST1", "HASPURPOSE", "to test"))
    def test_reload_no_engine(self):
        config = FileStorageConfiguration()
        config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
        factory = StorageFactory()
        storage_engine = FileStorageEngine(config)
        storage_engine.initialise()
        collection = RDFCollection()
        self.assertIsNotNone(collection)
        self.assertFalse(collection.reload(factory, "TESTDATA"))

    def patch_load_all(self, collector):
        raise Exception("Mock Exception")

    @patch("programy.storage.stores.file.store.rdfs.FileRDFStore.load_all", patch_load_all)
    def test_load_exception(self):
        config = FileStorageConfiguration()
        config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
        factory = StorageFactory()
        storage_engine = FileStorageEngine(config)
        factory._storage_engines[StorageFactory.RDF] = storage_engine
        factory._store_to_engine_map[StorageFactory.RDF] = storage_engine
        storage_engine.initialise()
        collection = RDFCollection()
        self.assertIsNotNone(collection)
        self.assertFalse(collection.load(factory))

    def patch_reload(self, collector, rdf_name):
        raise Exception("Mock Exception")

    @patch("programy.storage.stores.file.store.rdfs.FileRDFStore.reload", patch_reload)
    def test_reload_from_file_exception(self):
        config = FileStorageConfiguration()
        config._rdf_storage = FileStoreConfiguration(dirs=[os.path.dirname(__file__) + os.sep + "test_files" + os.sep + "rdfs"])
        factory = StorageFactory()
        storage_engine = FileStorageEngine(config)
        factory._storage_engines[StorageFactory.RDF] = storage_engine
        factory._store_to_engine_map[StorageFactory.RDF] = storage_engine
        storage_engine.initialise()
        collection = RDFCollection()
        self.assertIsNotNone(collection)
        self.assertTrue(collection.load(factory))
        self.assertTrue(collection.has_subject("TEST1"))
        self.assertTrue(collection.has_predicate("TEST1", "HASPURPOSE"))
        self.assertTrue(collection.has_object("TEST1", "HASPURPOSE", "to test"))
        collection.delete_entity("TEST1", "HASPURPOSE", "to test")
        self.assertFalse(collection.has_object("TEST1", "HASPURPOSE", "to test"))
        self.assertFalse(collection.reload(factory, "TESTDATA"))
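The two exception-path tests above rely on `unittest.mock.patch` swapping a store method for a raising stub. A minimal, self-contained sketch of that pattern (the `Store` and `Collection` classes here are hypothetical stand-ins, not the ProgramY ones):

```python
from unittest.mock import patch

class Store:
    def load_all(self, collector):
        return ["data"]

class Collection:
    def load(self, store):
        # mirrors the collection's behaviour: swallow the store's exception
        # and report failure via the return value
        try:
            store.load_all(self)
            return True
        except Exception:
            return False

def raising_load_all(self, collector):
    raise Exception("Mock Exception")

# patch.object() swaps Store.load_all for the raising stub inside the block
with patch.object(Store, "load_all", raising_load_all):
    result = Collection().load(Store())

print(result)  # False: the stub raised, so load() reported failure
```

Outside the `with` block the original method is restored automatically, which is why each test can patch independently.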
| 36.486667 | 128 | 0.709848 | 552 | 5,473 | 6.806159 | 0.105072 | 0.069204 | 0.102209 | 0.086239 | 0.882885 | 0.866915 | 0.855736 | 0.82832 | 0.82832 | 0.82832 | 0 | 0.003571 | 0.181253 | 5,473 | 149 | 129 | 36.731544 | 0.834858 | 0 | 0 | 0.78125 | 0 | 0 | 0.09355 | 0.021926 | 0 | 0 | 0 | 0 | 0.291667 | 1 | 0.083333 | false | 0 | 0.09375 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
aef01ba204f195d9b5132e5eb8f1721001be7a89 | 16,492 | py | Python | sdk/python/pulumi_alicloud/eipanycast/anycast_eip_address_attachment.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 42 | 2019-03-18T06:34:37.000Z | 2022-03-24T07:08:57.000Z | sdk/python/pulumi_alicloud/eipanycast/anycast_eip_address_attachment.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 152 | 2019-04-15T21:03:44.000Z | 2022-03-29T18:00:57.000Z | sdk/python/pulumi_alicloud/eipanycast/anycast_eip_address_attachment.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2020-08-26T17:30:07.000Z | 2021-07-05T01:37:45.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['AnycastEipAddressAttachmentArgs', 'AnycastEipAddressAttachment']
@pulumi.input_type
class AnycastEipAddressAttachmentArgs:
    def __init__(__self__, *,
                 anycast_id: pulumi.Input[str],
                 bind_instance_id: pulumi.Input[str],
                 bind_instance_region_id: pulumi.Input[str],
                 bind_instance_type: pulumi.Input[str]):
        """
        The set of arguments for constructing a AnycastEipAddressAttachment resource.
        :param pulumi.Input[str] anycast_id: The ID of Anycast EIP.
        :param pulumi.Input[str] bind_instance_id: The ID of bound instance.
        :param pulumi.Input[str] bind_instance_region_id: The region ID of bound instance.
        :param pulumi.Input[str] bind_instance_type: The type of bound instance. Valid value: `SlbInstance`.
        """
        pulumi.set(__self__, "anycast_id", anycast_id)
        pulumi.set(__self__, "bind_instance_id", bind_instance_id)
        pulumi.set(__self__, "bind_instance_region_id", bind_instance_region_id)
        pulumi.set(__self__, "bind_instance_type", bind_instance_type)

    @property
    @pulumi.getter(name="anycastId")
    def anycast_id(self) -> pulumi.Input[str]:
        """
        The ID of Anycast EIP.
        """
        return pulumi.get(self, "anycast_id")

    @anycast_id.setter
    def anycast_id(self, value: pulumi.Input[str]):
        pulumi.set(self, "anycast_id", value)

    @property
    @pulumi.getter(name="bindInstanceId")
    def bind_instance_id(self) -> pulumi.Input[str]:
        """
        The ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_id")

    @bind_instance_id.setter
    def bind_instance_id(self, value: pulumi.Input[str]):
        pulumi.set(self, "bind_instance_id", value)

    @property
    @pulumi.getter(name="bindInstanceRegionId")
    def bind_instance_region_id(self) -> pulumi.Input[str]:
        """
        The region ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_region_id")

    @bind_instance_region_id.setter
    def bind_instance_region_id(self, value: pulumi.Input[str]):
        pulumi.set(self, "bind_instance_region_id", value)

    @property
    @pulumi.getter(name="bindInstanceType")
    def bind_instance_type(self) -> pulumi.Input[str]:
        """
        The type of bound instance. Valid value: `SlbInstance`.
        """
        return pulumi.get(self, "bind_instance_type")

    @bind_instance_type.setter
    def bind_instance_type(self, value: pulumi.Input[str]):
        pulumi.set(self, "bind_instance_type", value)
@pulumi.input_type
class _AnycastEipAddressAttachmentState:
    def __init__(__self__, *,
                 anycast_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_region_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_type: Optional[pulumi.Input[str]] = None,
                 bind_time: Optional[pulumi.Input[str]] = None):
        """
        Input properties used for looking up and filtering AnycastEipAddressAttachment resources.
        :param pulumi.Input[str] anycast_id: The ID of Anycast EIP.
        :param pulumi.Input[str] bind_instance_id: The ID of bound instance.
        :param pulumi.Input[str] bind_instance_region_id: The region ID of bound instance.
        :param pulumi.Input[str] bind_instance_type: The type of bound instance. Valid value: `SlbInstance`.
        :param pulumi.Input[str] bind_time: The time of bound instance.
        """
        if anycast_id is not None:
            pulumi.set(__self__, "anycast_id", anycast_id)
        if bind_instance_id is not None:
            pulumi.set(__self__, "bind_instance_id", bind_instance_id)
        if bind_instance_region_id is not None:
            pulumi.set(__self__, "bind_instance_region_id", bind_instance_region_id)
        if bind_instance_type is not None:
            pulumi.set(__self__, "bind_instance_type", bind_instance_type)
        if bind_time is not None:
            pulumi.set(__self__, "bind_time", bind_time)

    @property
    @pulumi.getter(name="anycastId")
    def anycast_id(self) -> Optional[pulumi.Input[str]]:
        """
        The ID of Anycast EIP.
        """
        return pulumi.get(self, "anycast_id")

    @anycast_id.setter
    def anycast_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "anycast_id", value)

    @property
    @pulumi.getter(name="bindInstanceId")
    def bind_instance_id(self) -> Optional[pulumi.Input[str]]:
        """
        The ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_id")

    @bind_instance_id.setter
    def bind_instance_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "bind_instance_id", value)

    @property
    @pulumi.getter(name="bindInstanceRegionId")
    def bind_instance_region_id(self) -> Optional[pulumi.Input[str]]:
        """
        The region ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_region_id")

    @bind_instance_region_id.setter
    def bind_instance_region_id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "bind_instance_region_id", value)

    @property
    @pulumi.getter(name="bindInstanceType")
    def bind_instance_type(self) -> Optional[pulumi.Input[str]]:
        """
        The type of bound instance. Valid value: `SlbInstance`.
        """
        return pulumi.get(self, "bind_instance_type")

    @bind_instance_type.setter
    def bind_instance_type(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "bind_instance_type", value)

    @property
    @pulumi.getter(name="bindTime")
    def bind_time(self) -> Optional[pulumi.Input[str]]:
        """
        The time of bound instance.
        """
        return pulumi.get(self, "bind_time")

    @bind_time.setter
    def bind_time(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "bind_time", value)
class AnycastEipAddressAttachment(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 anycast_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_region_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_type: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        """
        Provides a Eipanycast Anycast Eip Address Attachment resource.

        For information about Eipanycast Anycast Eip Address Attachment and how to use it, see [What is Anycast Eip Address Attachment](https://help.aliyun.com/document_detail/171857.html).

        > **NOTE:** Available in v1.113.0+.

        > **NOTE:** The following regions support currently while Slb instance support bound.
        [eu-west-1-gb33-a01,cn-hongkong-am4-c04,ap-southeast-os30-a01,us-west-ot7-a01,ap-south-in73-a01,ap-southeast-my88-a01]

        ## Example Usage

        Basic Usage

        ```python
        import pulumi
        import pulumi_alicloud as alicloud

        example_anycast_eip_address = alicloud.eipanycast.AnycastEipAddress("exampleAnycastEipAddress", service_location="international")
        example_anycast_eip_address_attachment = alicloud.eipanycast.AnycastEipAddressAttachment("exampleAnycastEipAddressAttachment",
            anycast_id=example_anycast_eip_address.id,
            bind_instance_id="lb-j6chlcr8lffy7********",
            bind_instance_region_id="cn-hongkong",
            bind_instance_type="SlbInstance")
        ```

        ## Import

        Eipanycast Anycast Eip Address Attachment can be imported using the id, e.g.

        ```sh
        $ pulumi import alicloud:eipanycast/anycastEipAddressAttachment:AnycastEipAddressAttachment example `anycast_id`:`bind_instance_id`:`bind_instance_region_id`:`bind_instance_type`
        ```

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] anycast_id: The ID of Anycast EIP.
        :param pulumi.Input[str] bind_instance_id: The ID of bound instance.
        :param pulumi.Input[str] bind_instance_region_id: The region ID of bound instance.
        :param pulumi.Input[str] bind_instance_type: The type of bound instance. Valid value: `SlbInstance`.
        """
        ...

    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: AnycastEipAddressAttachmentArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        Provides a Eipanycast Anycast Eip Address Attachment resource.

        For information about Eipanycast Anycast Eip Address Attachment and how to use it, see [What is Anycast Eip Address Attachment](https://help.aliyun.com/document_detail/171857.html).

        > **NOTE:** Available in v1.113.0+.

        > **NOTE:** The following regions support currently while Slb instance support bound.
        [eu-west-1-gb33-a01,cn-hongkong-am4-c04,ap-southeast-os30-a01,us-west-ot7-a01,ap-south-in73-a01,ap-southeast-my88-a01]

        ## Example Usage

        Basic Usage

        ```python
        import pulumi
        import pulumi_alicloud as alicloud

        example_anycast_eip_address = alicloud.eipanycast.AnycastEipAddress("exampleAnycastEipAddress", service_location="international")
        example_anycast_eip_address_attachment = alicloud.eipanycast.AnycastEipAddressAttachment("exampleAnycastEipAddressAttachment",
            anycast_id=example_anycast_eip_address.id,
            bind_instance_id="lb-j6chlcr8lffy7********",
            bind_instance_region_id="cn-hongkong",
            bind_instance_type="SlbInstance")
        ```

        ## Import

        Eipanycast Anycast Eip Address Attachment can be imported using the id, e.g.

        ```sh
        $ pulumi import alicloud:eipanycast/anycastEipAddressAttachment:AnycastEipAddressAttachment example `anycast_id`:`bind_instance_id`:`bind_instance_region_id`:`bind_instance_type`
        ```

        :param str resource_name: The name of the resource.
        :param AnycastEipAddressAttachmentArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...

    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(AnycastEipAddressAttachmentArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 anycast_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_region_id: Optional[pulumi.Input[str]] = None,
                 bind_instance_type: Optional[pulumi.Input[str]] = None,
                 __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = AnycastEipAddressAttachmentArgs.__new__(AnycastEipAddressAttachmentArgs)

            if anycast_id is None and not opts.urn:
                raise TypeError("Missing required property 'anycast_id'")
            __props__.__dict__["anycast_id"] = anycast_id
            if bind_instance_id is None and not opts.urn:
                raise TypeError("Missing required property 'bind_instance_id'")
            __props__.__dict__["bind_instance_id"] = bind_instance_id
            if bind_instance_region_id is None and not opts.urn:
                raise TypeError("Missing required property 'bind_instance_region_id'")
            __props__.__dict__["bind_instance_region_id"] = bind_instance_region_id
            if bind_instance_type is None and not opts.urn:
                raise TypeError("Missing required property 'bind_instance_type'")
            __props__.__dict__["bind_instance_type"] = bind_instance_type
            __props__.__dict__["bind_time"] = None
        super(AnycastEipAddressAttachment, __self__).__init__(
            'alicloud:eipanycast/anycastEipAddressAttachment:AnycastEipAddressAttachment',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            anycast_id: Optional[pulumi.Input[str]] = None,
            bind_instance_id: Optional[pulumi.Input[str]] = None,
            bind_instance_region_id: Optional[pulumi.Input[str]] = None,
            bind_instance_type: Optional[pulumi.Input[str]] = None,
            bind_time: Optional[pulumi.Input[str]] = None) -> 'AnycastEipAddressAttachment':
        """
        Get an existing AnycastEipAddressAttachment resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] anycast_id: The ID of Anycast EIP.
        :param pulumi.Input[str] bind_instance_id: The ID of bound instance.
        :param pulumi.Input[str] bind_instance_region_id: The region ID of bound instance.
        :param pulumi.Input[str] bind_instance_type: The type of bound instance. Valid value: `SlbInstance`.
        :param pulumi.Input[str] bind_time: The time of bound instance.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _AnycastEipAddressAttachmentState.__new__(_AnycastEipAddressAttachmentState)

        __props__.__dict__["anycast_id"] = anycast_id
        __props__.__dict__["bind_instance_id"] = bind_instance_id
        __props__.__dict__["bind_instance_region_id"] = bind_instance_region_id
        __props__.__dict__["bind_instance_type"] = bind_instance_type
        __props__.__dict__["bind_time"] = bind_time
        return AnycastEipAddressAttachment(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter(name="anycastId")
    def anycast_id(self) -> pulumi.Output[str]:
        """
        The ID of Anycast EIP.
        """
        return pulumi.get(self, "anycast_id")

    @property
    @pulumi.getter(name="bindInstanceId")
    def bind_instance_id(self) -> pulumi.Output[str]:
        """
        The ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_id")

    @property
    @pulumi.getter(name="bindInstanceRegionId")
    def bind_instance_region_id(self) -> pulumi.Output[str]:
        """
        The region ID of bound instance.
        """
        return pulumi.get(self, "bind_instance_region_id")

    @property
    @pulumi.getter(name="bindInstanceType")
    def bind_instance_type(self) -> pulumi.Output[str]:
        """
        The type of bound instance. Valid value: `SlbInstance`.
        """
        return pulumi.get(self, "bind_instance_type")

    @property
    @pulumi.getter(name="bindTime")
    def bind_time(self) -> pulumi.Output[str]:
        """
        The time of bound instance.
        """
        return pulumi.get(self, "bind_time")
| 43.060052 | 189 | 0.66699 | 1,927 | 16,492 | 5.401142 | 0.103269 | 0.12452 | 0.080707 | 0.069178 | 0.812548 | 0.792371 | 0.779785 | 0.760377 | 0.753075 | 0.711472 | 0 | 0.005948 | 0.235447 | 16,492 | 382 | 190 | 43.172775 | 0.819494 | 0.332586 | 0 | 0.556701 | 1 | 0 | 0.134412 | 0.039145 | 0 | 0 | 0 | 0 | 0 | 1 | 0.154639 | false | 0.005155 | 0.025773 | 0 | 0.273196 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4e155188eb6a4fdd6d202b348e3d77782b34a6c3 | 90 | py | Python | timepiece/tests/__init__.py | icekernel/django-timepiece | 883cfcd50da3d1b411a43f3b6116342b49117ace | [
"MIT"
] | null | null | null | timepiece/tests/__init__.py | icekernel/django-timepiece | 883cfcd50da3d1b411a43f3b6116342b49117ace | [
"MIT"
] | null | null | null | timepiece/tests/__init__.py | icekernel/django-timepiece | 883cfcd50da3d1b411a43f3b6116342b49117ace | [
"MIT"
] | null | null | null | from .test_management import *
from .test_templatetags import *
from .test_utils import *
| 22.5 | 32 | 0.8 | 12 | 90 | 5.75 | 0.5 | 0.347826 | 0.405797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 90 | 3 | 33 | 30 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9d83d3e85a348fe6e9799402d01b91386bf9eb55 | 2,159 | py | Python | ad_api/api/sp/budget_rules.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | 12 | 2021-11-06T11:12:12.000Z | 2022-03-31T19:10:08.000Z | ad_api/api/sp/budget_rules.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | 12 | 2021-11-06T03:46:59.000Z | 2022-03-11T19:09:58.000Z | ad_api/api/sp/budget_rules.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | 6 | 2021-09-19T00:47:57.000Z | 2022-01-11T13:55:44.000Z | from ad_api.base import Client, sp_endpoint, fill_query_params, ApiResponse
class BudgetRules(Client):

    @sp_endpoint('/sp/campaigns/{}/budgetRules/budgetHistory', method='GET')
    def get_budget_history(self, campaignId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), campaignId), params=kwargs)

    @sp_endpoint('/sp/budgetRules', method='POST')
    def create_budget_rules(self, **kwargs) -> ApiResponse:
        return self._request(kwargs.pop('path'), data=kwargs.pop('body'), params=kwargs)

    @sp_endpoint('/sp/budgetRules', method='GET')
    def list_budget_rules(self, **kwargs) -> ApiResponse:
        return self._request(kwargs.pop('path'), params=kwargs)

    @sp_endpoint('/sp/budgetRules', method='PUT')
    def edit_budget_rules(self, **kwargs) -> ApiResponse:
        return self._request(kwargs.pop('path'), data=kwargs.pop('body'), params=kwargs)

    @sp_endpoint('/sp/budgetRules/{}', method='GET')
    def get_budget_rule(self, budgetRuleId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), budgetRuleId), params=kwargs)

    @sp_endpoint('/sp/budgetRules/{}/campaigns', method='GET')
    def get_campaigns_budget_rule(self, budgetRuleId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), budgetRuleId), params=kwargs)

    @sp_endpoint('/sp/campaigns/{}/budgetRules', method='POST')
    def create_campaign_budget_rules(self, campaignId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), campaignId), data=kwargs.pop('body'), params=kwargs)

    @sp_endpoint('/sp/campaigns/{}/budgetRules', method='GET')
    def get_budget_rules_campaign(self, campaignId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), campaignId), params=kwargs)

    @sp_endpoint('/sp/campaigns/{}/budgetRules/{}', method='DELETE')
    def delete_budget_rule_campaign(self, campaignId, budgetRuleId, **kwargs) -> ApiResponse:
        return self._request(fill_query_params(kwargs.pop('path'), campaignId, budgetRuleId), params=kwargs)
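All of these methods follow one pattern: the `sp_endpoint` decorator attaches the path template and HTTP verb, and `fill_query_params` substitutes the positional ids into the `{}` placeholders before `_request` is called. A toy re-implementation of that flow (a sketch of the idea only; the real `ad_api.base` implementation differs, and `DemoClient` is hypothetical):

```python
def sp_endpoint(path, method='GET'):
    """Attach the path template and HTTP verb to the wrapped method via kwargs."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            kwargs['path'] = path
            kwargs['method'] = method
            return func(*args, **kwargs)
        return wrapper
    return decorator

def fill_query_params(path, *args):
    # substitute positional ids into the '{}' placeholders
    return path.format(*args)

class DemoClient:
    def _request(self, path, method=None, params=None, data=None):
        # stand-in for the HTTP layer: just echo what would be sent
        return {'path': path, 'method': method, 'params': params}

    @sp_endpoint('/sp/campaigns/{}/budgetRules', method='GET')
    def get_rules(self, campaignId, **kwargs):
        return self._request(fill_query_params(kwargs.pop('path'), campaignId),
                             method=kwargs.pop('method'), params=kwargs)

resp = DemoClient().get_rules('C42')
print(resp['path'])  # /sp/campaigns/C42/budgetRules
```

Any extra keyword arguments passed by the caller survive the two `pop()` calls and end up as query parameters, which is why every method above forwards `params=kwargs`.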
| 53.975 | 119 | 0.716072 | 258 | 2,159 | 5.775194 | 0.155039 | 0.120805 | 0.072483 | 0.163087 | 0.830201 | 0.777181 | 0.757718 | 0.718792 | 0.666443 | 0.638926 | 0 | 0 | 0.125521 | 2,159 | 39 | 120 | 55.358974 | 0.789195 | 0 | 0 | 0.206897 | 0 | 0 | 0.138953 | 0.072719 | 0 | 0 | 0 | 0 | 0 | 1 | 0.310345 | false | 0 | 0.034483 | 0.310345 | 0.689655 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
9da3f8e11adbec818591b83e5bef51438822f5c8 | 34 | py | Python | tests/files/_other.py | fdieulle/greenpeace | 63ed70bd69dd150d1b2135d7cf20625dc4c2412f | [
"MIT"
] | null | null | null | tests/files/_other.py | fdieulle/greenpeace | 63ed70bd69dd150d1b2135d7cf20625dc4c2412f | [
"MIT"
] | null | null | null | tests/files/_other.py | fdieulle/greenpeace | 63ed70bd69dd150d1b2135d7cf20625dc4c2412f | [
"MIT"
] | null | null | null | def foo():
return "Hello foo"
| 11.333333 | 22 | 0.588235 | 5 | 34 | 4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.264706 | 34 | 2 | 23 | 17 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.264706 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
9df93e202279fd0470d4037c2562ca2cc6f2ae8e | 47 | py | Python | abaqus/Tools/Widgets/MatplotlibWidget/__init__.py | Haiiliin/pyabaqus-executor | a5962dd7582c0ff4498ac14508b79907a22cfbbd | [
"MIT"
] | 2 | 2022-02-04T07:14:02.000Z | 2022-02-04T09:04:13.000Z | abaqus/Tools/Widgets/MatplotlibWidget/__init__.py | Haiiliin/pyabaqus-executor | a5962dd7582c0ff4498ac14508b79907a22cfbbd | [
"MIT"
] | null | null | null | abaqus/Tools/Widgets/MatplotlibWidget/__init__.py | Haiiliin/pyabaqus-executor | a5962dd7582c0ff4498ac14508b79907a22cfbbd | [
"MIT"
] | null | null | null | from .MatplotlibWidget import MatplotlibWidget
| 23.5 | 46 | 0.893617 | 4 | 47 | 10.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 47 | 1 | 47 | 47 | 0.976744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d1a3c3e652291a358e243f2482eb8886f7c0b6bb | 76,712 | py | Python | dmyplant2/dFSM/VBClassMainState.py | DieterChvatal/dmyplant4 | d73b30291b9cbdefc6b4994d4257e48d667172ad | [
"MIT"
] | null | null | null | dmyplant2/dFSM/VBClassMainState.py | DieterChvatal/dmyplant4 | d73b30291b9cbdefc6b4994d4257e48d667172ad | [
"MIT"
] | null | null | null | dmyplant2/dFSM/VBClassMainState.py | DieterChvatal/dmyplant4 | d73b30291b9cbdefc6b4994d4257e48d667172ad | [
"MIT"
] | 1 | 2022-03-21T08:24:06.000Z | 2022-03-21T08:24:06.000Z | #Imports System.Text
#Imports System.ComponentModel
from .VBClassLib import (
MSG, PrimaryMSG,
EngineAction, ActionDB,
ServiceSelectorSwitch_States, DemandSelectorSwitch_States,
Available_States, Delay_Check,
MSG_Trigger)
class ErrorInClass(Exception):
    pass

class Calc_Finished(Exception):
    pass
class ClassMainState:
    # Private Statlist As List(Of ActionDB)
    # ?? Private WithEvents BGWCalcState As BackgroundWorker
    # __AMM_List  # As List(Of MSG)
    def __init__(self, list_of_message, list_of_primary_messages):
        self.__AMM_List = list_of_message  # New List(Of MSG)
        self.__PrimaryMSGList = list_of_primary_messages  # New List(Of PrimaryMSG)
        self.__PrimaryMSG = None  # Nothing
        # self.BGWCalcState = New BackgroundWorker With {.WorkerReportsProgress = True}
        self.__Silence = True
        self.__Busy = False
        self.__GapMaxLength = 5400  # in seconds, 5400 = 1.5 h
        self.__TargetloadreachTime = 300
        self.__Alarm_Delay_Time = 1000
        self.__E_Action = None
        self.__EngineMSG_List = None
        self.__Status_List = None
    # VB destructor residue (Protected Overrides Sub Finalize() / MyBase.Finalize());
    # Python has no equivalent and none is needed here.
    # ?? Public Event ErrorInClass(ByVal Message As String)
    # ?? Public Event PrimaryMessageNotfound(ByVal Message As String)
    # ?? Public Event Calc_Finisched(ByVal Statlist As List(Of ActionDB))

    # ?? <CategoryAttribute("Input"), Description("Engine Message List. As soon as the list is written, calculation will be started")>
    @property
    def EngineMSG_List(self):
        return self.__EngineMSG_List

    @EngineMSG_List.setter
    def EngineMSG_List(self, value):
        try:
            self.__Busy = True
            # AMM_List.Clear()
            self.__AMM_List = value
            self.__Status_List = []  # New List(Of ActionDB)
            if (self.__AMM_List is None) or (len(self.__AMM_List) == 0):
                # the VB original raised the ErrorInClass and Calc_Finisched events here;
                # this translation signals the same condition with an exception
                self.__Busy = False
                raise ErrorInClass("No Messages in EngineMSG_List")
            elif (self.__PrimaryMSGList is None) or (len(self.__PrimaryMSGList) == 0):
                self.__Busy = False
                raise ErrorInClass("PrimaryMSGList is empty")
            else:
                self.__E_Action = EngineAction()
                # ?? do this in a Thread ?? BGWCalcState.RunWorkerAsync()
        except ErrorInClass:
            raise
        except Exception as err:
            # ?? still dont know how to deal with that ?? BGWCalcState.ReportProgress(-1, "Error 33 " & ex.Message)
            raise ErrorInClass("Error 33 " + str(err))

    # <CategoryAttribute("Input"), Description("Primary Trip List")>
    @property
    def PrimaryMSGList(self):
        return self.__PrimaryMSGList  # As List(Of PrimaryMSG)

    # <CategoryAttribute("Input"), Description("Waiting time in ms if an additional alarm happens. Default 1000 ms")>
    @property
    def Alarm_Delay_Time(self):
        return self.__Alarm_Delay_Time  # As Integer

    # <CategoryAttribute("Input"), Description("If Silence is TRUE (default), no message is generated if the primary message is not found")>
    @property
    def Silence(self):
        return self.__Silence  # As Boolean

    # <CategoryAttribute("Input"), Description("Max time difference in seconds between two MSG. Default is 5400 sec. (1.5 h)")>
    @property
    def GapMaxLength(self):
        return self.__GapMaxLength  # As Integer

    # <CategoryAttribute("Input"), Description("After this time, ramp-up (island or parallel) ends automatically. If the message 'Target Load Reached' comes earlier, ramp-up is finished with the message. Default is 300 sec. (5 min.)")>
    @property
    def TargetloadreachTime(self):
        return self.__TargetloadreachTime  # As Integer

    # <CategoryAttribute("Output"), ReadOnlyAttribute(True), Description("TRUE if calculation is running")>
    @property
    def Busy(self):
        return self.__Busy  # As Boolean
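# The `# ?? Public Event ...` markers above are VB.NET events, which have no direct
# Python counterpart. If raising exceptions proves too disruptive for non-fatal
# notifications, one option is a small callback registry (a sketch only; the
# `EventSource` name is hypothetical and not part of dmyplant2):

```python
class EventSource:
    """Minimal stand-in for a VB.NET event: listeners subscribe, fire() notifies all."""
    def __init__(self):
        self._listeners = []

    def subscribe(self, fn):
        self._listeners.append(fn)

    def fire(self, *args):
        # unlike a raised exception, firing an event does not abort the caller
        for fn in self._listeners:
            fn(*args)

# usage sketch: ClassMainState could expose ErrorInClass / Calc_Finisched as
# EventSource attributes and call .fire(...) wherever the VB code used RaiseEvent
events = EventSource()
seen = []
events.subscribe(seen.append)
events.fire("Error 33 demo")
print(seen)  # ['Error 33 demo']
```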
    # 4 Apr 2022: translated into Python up to this point.
# <CategoryAttribute("Output"), ReadOnlyAttribute(True), Description("Engine Status List")>
# Public ReadOnly Property Status_List As List(Of ActionDB)
# ' Get
# ' Return Statlist
# ' End Get
# 'End Property
# #End Region
# #Region "Private Functions"
# Private Function Store_Action(ByVal NewAction As Engine_Action, ByVal row As Integer, ByVal Delayckeck As Delay_Ckeck) As Boolean
# Try
# Select Case Delayckeck
# Case Delay_Ckeck.NetzStörung
# Dim NetzStörungFound As Boolean = False
# Dim NetzstörungMSG As New MSG
# Dim StartMSG As MSG = AMM_List(row)
# Dim _LastMSG_Date As DateTime = AMM_List(row).MsgDate
# Dim OldAction As Engine_Action = E_Action.Action_Actual
# Try
# For i = row + 1 To AMM_List.Count - 1
# Dim ActualMSG_Date As DateTime = AMM_List(i).MsgDate
# If (ActualMSG_Date - _LastMSG_Date).TotalMilliseconds < _Alarm_Delay_Time Then
# If AMM_List(i).MsgType = "A" Then
# E_Action.Trip_List.Add(New MSG With {
# .MsgNo = AMM_List(i).MsgNo,
# .MsgDate = AMM_List(i).MsgDate,
# .MsgText = AMM_List(i).MsgText,
# .MsgType = AMM_List(i).MsgType,
# .MsgResponibility = "M"
# })
# ElseIf AMM_List(i).MsgNo = MSG_Trigger.Netzstoerung Then
# NetzStörungFound = True
# NetzstörungMSG = AMM_List(i)
# End If
# Else
# Exit For
# End If
# Next
# If NetzStörungFound AndAlso E_Action.Trip_List.Count > 0 Then
# 'A_Action.Action_To = Message_Time
# E_Action.Action_To = E_Action.Trip_List(0).MsgDate 'AMM_List(row).MsgDate
# Store_Action_Line(Engine_Action.Mains_Failure)
# E_Action.Action_To = E_Action.Trip_List(0).MsgDate
# Store_Action_Line(Engine_Action.Forced_Outage)
# ElseIf NetzStörungFound Then ' AndAlso (E_Action.Trip_List(0).MsgDate - NetzstörungMSG.MsgDate).TotalMilliseconds < _Alarm_Delay_Time Then
# E_Action.Action_To = AMM_List(row).MsgDate
# Store_Action_Line(Engine_Action.Mains_Failure)
# ElseIf E_Action.Trip_List.Count > 0 Then
# E_Action.Action_To = E_Action.Trip_List(0).MsgDate 'AMM_List(row).MsgDate
# Store_Action_Line(Engine_Action.Forced_Outage)
# Else
# If NewAction <> OldAction And (E_Action.Action_To - E_Action.Action_From).Milliseconds > 0 Then
# If E_Action.Action_From < New DateTime(1990, 1, 1) Then
# E_Action.Action_From = AMM_List(row).MsgDate
# End If
# Store_Action_Line(NewAction)
# End If
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_Action " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# Case Delay_Ckeck.RemoteReset
# Dim _LastMSG_Date As DateTime = AMM_List(row).MsgDate
# For i = row + 1 To AMM_List.Count - 1
# Dim ActualMSG_Date As DateTime = AMM_List(i).MsgDate
# If (ActualMSG_Date - _LastMSG_Date).TotalMilliseconds < 500 Then
# If AMM_List(i).MsgNo = MSG_Trigger.Remote_Reset Then
# E_Action.Reset_List.Add(New MSG With {
# .MsgNo = AMM_List(i).MsgNo,
# .MsgDate = AMM_List(i).MsgDate,
# .MsgText = AMM_List(i).MsgText,
# .MsgType = AMM_List(i).MsgType,
# .MsgResponibility = "M"
# })
# End If
# Else
# Exit For
# End If
# Next
# Case Delay_Ckeck.Betrieb_NetzOrInsel
# E_Action.Action_To = AMM_List(row).MsgDate
# Dim Found As Boolean = False
# For i = _Status_List.Count - 1 To 0 Step -1
# Select Case _Status_List(i).Action_Actual
# Case Engine_Action.Mains_Parallel_Operation
# Store_Action_Line(Engine_Action.Mains_Parallel_Operation)
# Found = True
# Exit For
# Case Engine_Action.RampUp_Mains_Parallel_Operation
# Store_Action_Line(Engine_Action.RampUp_Mains_Parallel_Operation)
# Found = True
# Exit For
# Case Engine_Action.Island_Operation
# Store_Action_Line(Engine_Action.Island_Operation)
# Found = True
# Exit For
# Case Engine_Action.RampUp_Island_Operation
# Store_Action_Line(Engine_Action.RampUp_Island_Operation)
# Found = True
# Exit For
# End Select
# Next
# If Not Found Then
# Store_Action_Line(NewAction)
# End If
# Case Delay_Ckeck.NoCheck
# If E_Action.Action_From < New DateTime(1990, 1, 1) Then
# E_Action.Action_From = AMM_List(row).MsgDate
# End If
# Store_Action_Line(NewAction)
# End Select
# Return True
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_Action " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# Return False
# End Try
# End Function
# Private Sub Store_Action_Line(ByVal NewAction As Engine_Action)
# Try
# If E_Action.Action_To < New DateTime(1990, 1, 1) Then
# E_Action.Action_To = E_Action.Action_To
# End If
# If NewAction = Engine_Action.Troubleshooting Then
# E_Action = E_Action
# End If
# If E_Action.Action_Actual = Engine_Action.Forced_Outage OrElse (E_Action.Action_Trip Is Nothing AndAlso E_Action.Action_Actual = Engine_Action.Troubleshooting) Then
# If E_Action.Trip_List.Count > 0 Then
# E_Action.Action_Trip = Select_Primary_MSG(E_Action.Trip_List)
# For i = _Status_List.Count - 1 To 0 Step -1
# If _Status_List(i).Action_Actual = Engine_Action.Forced_Outage OrElse _Status_List(i).Action_Actual = Engine_Action.Troubleshooting Then
# _Status_List(i).Trigger_Date = E_Action.Action_Trip.MsgDate
# _Status_List(i).Trigger_MSGNo = E_Action.Action_Trip.MsgNo
# _Status_List(i).Trigger_Responsibility = E_Action.Action_Trip.MsgResponibility
# _Status_List(i).Trigger_Text = E_Action.Action_Trip.MsgText
# Else
# Exit For
# End If
# Next
# End If
# E_Action.Trip_List.Clear()
# ElseIf E_Action.Action_Actual = Engine_Action.Troubleshooting Then
# E_Action.Trip_List.Clear()
# E_Action.Reset_List.Clear()
# Else
# If E_Action.Reset_List.Count > 0 Then
# E_Action.Action_Trip = E_Action.Reset_List(E_Action.Reset_List.Count - 1)
# E_Action.Reset_List.Clear()
# End If
# End If
# Dim ST As New ActionDB With {
# .Action_Actual = E_Action.Action_Actual,
# .Action_From = E_Action.Action_From,
# .Action_To = E_Action.Action_To,
# .ServiceSelectorSwitch = E_Action.ServiceSelectorSwitch,
# .DemandSelectorSwitch = E_Action.DemandSelectorSwitch,
# .AV_MAN_Activated_Status = E_Action.AV_MAN_Activated_Status,
# .CalcDate = DateTime.Now()
# }
# If E_Action.Action_Trip IsNot Nothing Then
# ST.Trigger_Date = E_Action.Action_Trip.MsgDate
# ST.Trigger_MSGNo = E_Action.Action_Trip.MsgNo
# ST.Trigger_Text = E_Action.Action_Trip.MsgText
# ST.Trigger_Responsibility = E_Action.Action_Trip.MsgResponibility
# If E_Action.Action_Actual = Engine_Action.Troubleshooting Then
# ST.Trigger_Count = 0
# Else
# ST.Trigger_Count = 1
# End If
# Else
# ST.Trigger_Date = E_Action.Action_Date
# End If
# _Status_List.Add(ST)
# E_Action.Action_From = E_Action.Action_To
# E_Action.Action_Actual = NewAction
# E_Action.Action_To = Nothing
# E_Action.Action_Date = E_Action.Action_From
# If E_Action.Action_Actual <> Engine_Action.Troubleshooting AndAlso E_Action.Action_Actual <> Engine_Action.Forced_Outage Then
# E_Action.Action_Trip = Nothing
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_Action_Line " & ex.Message)
# End Try
# End Sub
# Private Sub Store_WS_DS()
# Try
# Dim ST As New ActionDB With {
# .Action_Actual = E_Action.Action_Actual,
# .Action_From = E_Action.Action_From,
# .Action_To = E_Action.Action_To,
# .ServiceSelectorSwitch = E_Action.ServiceSelectorSwitch,
# .DemandSelectorSwitch = E_Action.DemandSelectorSwitch,
# .AV_MAN_Activated_Status = E_Action.AV_MAN_Activated_Status,
# .Trigger_Date = E_Action.Action_Date,
# .CalcDate = DateTime.Now()
# }
# If E_Action.Action_Trip IsNot Nothing Then
# ST.Trigger_Date = E_Action.Action_Trip.MsgDate
# ST.Trigger_MSGNo = E_Action.Action_Trip.MsgNo
# ST.Trigger_Text = E_Action.Action_Trip.MsgText
# ST.Trigger_Responsibility = E_Action.Action_Trip.MsgResponibility
# ST.Trigger_Count = 0
# End If
# _Status_List.Add(ST)
# E_Action.Action_From = E_Action.Action_To
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_WS_DS " & ex.Message)
# End Try
# End Sub
# Private Sub Store_BWS(ByVal NewBWS As ServiceSelectorSwitch_States, ByVal StatusZeit As DateTime)
# Try
# If E_Action.ServiceSelectorSwitch <> NewBWS Then
# E_Action.Action_To = StatusZeit
# Store_WS_DS()
# E_Action.ServiceSelectorSwitch = NewBWS
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_BWS " & ex.Message & " at: " & StatusZeit.ToString)
# End Try
# End Sub
# Private Function Store_AWS(ByVal NewAWS As DemandSelectorSwitch_States, ByVal StatusZeit As DateTime) As Boolean
# Try
# If E_Action.DemandSelectorSwitch <> NewAWS Then
# E_Action.Action_To = StatusZeit
# Store_WS_DS()
# E_Action.DemandSelectorSwitch = NewAWS
# End If
# Return True
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_AWS " & ex.Message & " at: " & StatusZeit.ToString)
# Return False
# End Try
# End Function
# Private Function Store_AV_MAN_Activated_Status(ByVal NewAVSS As Available_States, ByVal StatusZeit As DateTime, ByVal row As Integer, ByVal Ret As Boolean) As Boolean
# Dim i As Integer
# Try
# If E_Action.AV_MAN_Activated_Status <> NewAVSS Then
# E_Action.Action_To = StatusZeit
# Store_WS_DS()
# If Ret Then
# For i = _Status_List.Count - 1 To 0 Step -1
# If _Status_List(i).AV_MAN_Activated_Status <> NewAVSS AndAlso _Status_List(i).Action_Actual <> Engine_Action.Forced_Outage AndAlso _Status_List(i).Action_Actual <> Engine_Action.Troubleshooting Then
# _Status_List(i).AV_MAN_Activated_Status = NewAVSS
# Else
# Exit For
# End If
# Next
# End If
# E_Action.AV_MAN_Activated_Status = NewAVSS
# End If
# Return True
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Store_AV_MAN_Activated_Status " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# Return False
# End Try
# End Function
# Public Sub Close()
# AMM_List = Nothing
# PrimaryMSGList = Nothing
# BGWCalcState.CancelAsync()
# BGWCalcState.Dispose()
# BGWCalcState = Nothing
# Finalize()
# End Sub
# Private Sub BGWCalcState_DoWork(sender As Object, e As DoWorkEventArgs) Handles BGWCalcState.DoWork
# CalcState()
# End Sub
# Private Sub BGWCalcState_RunWorkerCompleted(sender As Object, e As RunWorkerCompletedEventArgs) Handles BGWCalcState.RunWorkerCompleted
# Try
# If e.Error IsNot Nothing Then
# BGWCalcState.ReportProgress(-1, "BackgroundWorker error " & e.Error.Message)
# ElseIf _Status_List Is Nothing Then
# BGWCalcState.ReportProgress(-1, "_Status_List is Nothing - calculation produced no result")
# Else
# RaiseEvent Calc_Finisched(_Status_List)
# End If
# Catch ex As Exception
# RaiseEvent ErrorInClass("Error in BackgroundWorker: " & ex.Message)
# 'BGWCalcState.ReportProgress(-1, "Error in BackgroundWorker: " & ex.Message)
# Finally
# AMM_List.Clear()
# _Bussy = False
# End Try
# End Sub
# #Region "Actionauswertungen"
# Private Sub CalcState()
# Dim RowCount As Integer = 0
# Dim LastMsg As MSG
# LastMsg = AMM_List(0)
# E_Action.Action_From = LastMsg.MsgDate
# E_Action.Action_Actual = Engine_Action.Undefinded
# E_Action.Action_Trip = Nothing
# Dim MSGNO As Integer
# Dim MSGDATE As DateTime
# Dim MSGTEXT As String
# Dim MSGType As String
# For Each row As MSG In AMM_List
# Try
# row.Status = E_Action.Action_Actual.ToString
# MSGNO = row.MsgNo
# MSGDATE = row.MsgDate
# MSGTEXT = row.MsgText
# MSGType = row.MsgType
# If ((MSGDATE - LastMsg.MsgDate).TotalSeconds > GapMaxLength) AndAlso (E_Action.Action_Actual <> Engine_Action.Data_GAP) Then
# E_Action.Action_To = LastMsg.MsgDate
# Store_Action(Engine_Action.Data_GAP, RowCount, Delay_Ckeck.NoCheck)
# LastMsg = row
# End If
# If MSGType = "A" Then
# E_Action.Trip_List.Add(New MSG With {
# .MsgNo = MSGNO,
# .MsgDate = MSGDATE,
# .MsgText = MSGTEXT,
# .MsgType = MSGType,
# .MsgResponibility = "M"
# })
# ElseIf MSGNO = MSG_Trigger.WarmStartCPU_1253 OrElse MSGNO = MSG_Trigger.ColdStartCPU_1254 Then ' Or A_MSG = OPC_Server_starte_9004 Then
# E_Action.Trip_List.Add(New MSG With {
# .MsgNo = MSGNO,
# .MsgDate = MSGDATE,
# .MsgText = MSGTEXT,
# .MsgType = "A",
# .MsgResponibility = "R"
# })
# ElseIf E_Action.Action_Actual = Engine_Action.Forced_Outage AndAlso MSGNO = MSG_Trigger.Remote_Reset Then
# E_Action.Reset_List.Add(New MSG With {
# .MsgNo = MSGNO,
# .MsgDate = MSGDATE,
# .MsgText = MSGTEXT,
# .MsgType = MSGType, '"B",
# .MsgResponibility = "R"
# })
# Else
# Select Case E_Action.Action_Actual
# Case Engine_Action.Undefinded
# Check_Action_GAP(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Data_GAP
# Check_Action_GAP(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Start_Preparation
# Check_Action_Blockstart(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Start
# Check_Action_Start(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Idle
# Check_Action_Idle(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Synchronisation
# Check_Action_Synch(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.RampUp_Mains_Parallel_Operation
# Check_Action_RampupNetzparallel_Betrieb(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Mains_Parallel_Operation
# Check_Action_Netzparallel_Betrieb(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.RampUp_Island_Operation
# Check_Action_RampupInselbetrieb_Betrieb(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Island_Operation
# Check_Action_Inselbetrieb_Betrieb(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Load_Rampdown
# Check_Action_Rampdown(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Engine_Cooldown
# Check_Action_Cooldown(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Forced_Outage
# Check_Action_Forcedoutage(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Troubleshooting
# Check_Action_Troubleshooting(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Ready
# Check_Action_Ready(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Not_Ready
# Check_Action_Not_Ready(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Operation 'Never reached: Store_Action already switches to the correct action
# Check_Action_Betrieb(E_Action, MSGNO, MSGDATE, RowCount)
# Case Engine_Action.Mains_Failure
# Check_Action_NetzStörung(E_Action, MSGNO, MSGDATE, RowCount)
# Case Else
# 'MsgBox("this should never happen", MsgBoxStyle.OkOnly)
# End Select
# 'Evaluate message numbers
# Select Case MSGNO
# 'Evaluate service selector switch (BWS)
# Case MSG_Trigger.BWS_AUS
# Store_BWS(ServiceSelectorSwitch_States.OFF, MSGDATE)
# Case MSG_Trigger.BWS_Hand
# Store_BWS(ServiceSelectorSwitch_States.MAN, MSGDATE)
# Case MSG_Trigger.BWS_Auto
# Store_BWS(ServiceSelectorSwitch_States.AUTO, MSGDATE)
# 'Evaluate demand selector switch (AWS)
# Case MSG_Trigger.AWS_AUS
# Store_AWS(DemandSelectorSwitch_States.OFF, MSGDATE)
# Case MSG_Trigger.AWS_Ein
# Store_AWS(DemandSelectorSwitch_States.EIN, MSGDATE)
# Case MSG_Trigger.AWS_Remote
# Store_AWS(DemandSelectorSwitch_States.REMOTE, MSGDATE)
# 'Evaluate RAM triggers
# Case MSG_Trigger.Troubleshooting_ON
# Store_AV_MAN_Activated_Status(Available_States.Troubleshooting, MSGDATE, RowCount, False)
# Case MSG_Trigger.Troubleshooting_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, False)
# Case MSG_Trigger.Troubleshooting_Automatic_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, True)
# Case MSG_Trigger.Maintenace_ON
# Store_AV_MAN_Activated_Status(Available_States.Maintenance, MSGDATE, RowCount, False)
# Case MSG_Trigger.Maintenace_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, False)
# Case MSG_Trigger.Maintenace_Automatic_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, True)
# Case MSG_Trigger.Deactivate_ON
# Store_AV_MAN_Activated_Status(Available_States.Deactivated, MSGDATE, RowCount, False)
# Case MSG_Trigger.Deactivate_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, False)
# Case MSG_Trigger.Deactivate_Automatic_OFF
# Store_AV_MAN_Activated_Status(Available_States.Undefined, MSGDATE, RowCount, True)
# End Select
# End If
# LastMsg = row
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error CalcState " & ex.Message & " in row " & RowCount.ToString & " Date: " & row.MsgDate.ToString)
# End Try
# RowCount += 1
# Next
# E_Action.Action_To = LastMsg.MsgDate
# Store_Action(E_Action.Action_Actual, RowCount - 1, Delay_Ckeck.NoCheck)
# End Sub
# Private Sub Check_Action_Blockstart(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# If (Message_Time - A_Action.Action_From).TotalMinutes > 25 Then
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Data_GAP, row, Delay_Ckeck.NoCheck)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# 'A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Blockstart " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Netzparallel_Betrieb(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Island_Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Netzparallel_Betrieb " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Inselbetrieb_Betrieb(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Island_Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Inselbetrieb_Betrieb " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_RampupNetzparallel_Betrieb(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# If (Message_Time - A_Action.Action_From).TotalSeconds > _TaretloadreachTime Then
# A_Action.Action_To = A_Action.Action_From.AddSeconds(_TaretloadreachTime)
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Check_Action_Netzparallel_Betrieb(A_Action, MSGNR, Message_Time, row)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.TargetLoadReached
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_RampupNetzparallel_Betrieb " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_RampupInselbetrieb_Betrieb(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# If (Message_Time - A_Action.Action_From).TotalSeconds > _TaretloadreachTime Then
# A_Action.Action_To = A_Action.Action_From.AddSeconds(_TaretloadreachTime)
# Store_Action(Engine_Action.Island_Operation, row, Delay_Ckeck.NoCheck)
# Check_Action_Inselbetrieb_Betrieb(A_Action, MSGNR, Message_Time, row)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.TargetLoadReached
# A_Action.Action_To = Message_Time
# 'MsgBox("this should never happen!!")
# Store_Action(Engine_Action.Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_RampupInselbetrieb_Betrieb " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Betrieb(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Betrieb " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Rampdown(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Rampdown " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Cooldown(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Cooldown " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Start(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Start " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Idle(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Idle " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Synch(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NoCheck)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Synch " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Ready(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NoCheck)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# 'Case MSG_Trigger.Anforderung_Aus
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.Ready, row, Delay_Ckeck.AlarmInNextSecond)
# Case MSG_Trigger.Bereit_Automatic_Aus
# If Not CheckBWSNotAuto(row) Then
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Not_Ready, row, Delay_Ckeck.NetzStörung)
# End If
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Ready " & ex.Message & " in row " & row.ToString & " Date: " & AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Not_Ready(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.GS_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Not_Ready " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_GAP(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Load_Rampdown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Bereit_Automatic_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Not_Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.GS_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Engine_Cooldown, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.GS_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.IGN_Aus
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NetzStörung)
# Case MSG_Trigger.Netzstoerung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Mains_Failure, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Power_reduction_In_isolated_operation
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Island_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.BWS_Hand, MSG_Trigger.BWS_AUS
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Troubleshooting, row, Delay_Ckeck.NoCheck)
# Case Else
# For Each tstEnum As Powerreduction In GetType(Powerreduction).GetEnumValues
# If tstEnum = MSGNR Then
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Exit Select
# End If
# Next
# If row > 0 AndAlso ((Message_Time - AMM_List(row - 1).MsgDate).TotalSeconds < GapMaxLength) And E_Action.Action_Actual <> Engine_Action.Undefinded Then
# E_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Undefinded, row, Delay_Ckeck.NoCheck)
# End If
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Gap " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Forcedoutage(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# Select Case MSGNR
# Case MSG_Trigger.BWS_Hand, MSG_Trigger.BWS_AUS, MSG_Trigger.BWS_Auto
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Troubleshooting, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.RemoteReset)
# 'Case MSG_Trigger.Netzparallelbetrieb
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# 'Case MSG_Trigger.Inselbetrieb
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# 'Case MSG_Trigger.Synchronisieranforderung
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.GS_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case Else
# End Select
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_forced_outage " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_Troubleshooting(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# Select Case MSGNR
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.Betrieb_NetzOrInsel)
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Idle
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Idle, row, Delay_Ckeck.NoCheck)
# 'Case MSG_Trigger.BWS_Hand
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.Troubleshooting, row, Delay_Ckeck.NoCheck)
# ' E_Action.ServiceSelectorSwitch = ServiceSelectorSwitch_States.MAN
# 'Case MSG_Trigger.BWS_AUS
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.Troubleshooting, row, Delay_Ckeck.NoCheck)
# ' E_Action.ServiceSelectorSwitch = ServiceSelectorSwitch_States.OFF
# 'Case MSG_Trigger.BWS_Auto
# ' A_Action.Action_To = Message_Time
# ' Store_Action(Engine_Action.Troubleshooting, row, Delay_Ckeck.NoCheck)
# ' E_Action.ServiceSelectorSwitch = ServiceSelectorSwitch_States.AUTO
# End Select
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Troubleshooting " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Sub Check_Action_NetzStörung(ByRef A_Action As EngineAction, ByRef MSGNR As Integer, ByRef Message_Time As Date, ByRef row As Integer)
# Try
# If A_Action.Trip_List.Count > 0 Then
# A_Action.Action_To = A_Action.Trip_List(0).MsgDate
# Store_Action(Engine_Action.Forced_Outage, row, Delay_Ckeck.NetzStörung)
# Else
# Select Case MSGNR
# Case MSG_Trigger.Bereit_Automatic_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Ready, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Anforderung_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start_Preparation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Starter_Ein
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Start, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Synchronisieranforderung
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.Synchronisation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Netzparallelbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Mains_Parallel_Operation, row, Delay_Ckeck.NoCheck)
# Case MSG_Trigger.Inselbetrieb
# A_Action.Action_To = Message_Time
# Store_Action(Engine_Action.RampUp_Island_Operation, row, Delay_Ckeck.NoCheck)
# End Select
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Error Check_Action_Netzstörung " & ex.Message + " in row " + row.ToString + " Date: " + AMM_List(row).MsgDate.ToString)
# End Try
# End Sub
# Private Function Select_Primary_MSG(ByRef AT_MSG As List(Of MSG)) As MSG
# Dim _MSG As New List(Of MSG)
# Dim FirstMSGD As DateTime
# Select_Primary_MSG = Nothing
# Try
# If AT_MSG.Count > 0 Then
# FirstMSGD = AT_MSG(0).MsgDate
# Else
# Exit Function
# End If
# 'Check whether the messages contain a cold start or a fuse failure
# For Each row As MSG In AT_MSG
# If (row.MsgNo = MSG_Trigger.ColdStartCPU_1254) OrElse (row.MsgNo = MSG_Trigger.WarmStartCPU_1253) OrElse (row.MsgNo = MSG_Trigger.OPC_Server_start_9004) OrElse (row.MsgNo = MSG_Trigger.Sicherungsfall_Modulinterfaceschrank_1140) Then
# Select_Primary_MSG = row
# Select_Primary_MSG.MsgType = "P"
# Exit Function
# Else
# If (row.MsgDate - FirstMSGD).TotalMilliseconds < 1000 Then
# Dim result As MSG
# result = _MSG.Find(Function(x) x.MsgNo = row.MsgNo)
# If result Is Nothing Then
# _MSG.Add(row)
# End If
# Else
# Exit For
# End If
# End If
# Next
# If _MSG.Count = 1 Then
# Select_Primary_MSG = _MSG(0)
# Exit Function
# ElseIf _MSG.Count > 1 Then
# 'Dim DSL As New List(Of MSG)
# _MSG.Sort(Function(x, y) x.MsgNo.CompareTo(y.MsgNo))
# Dim PM_String As New StringBuilder
# 'Build the primary matrix
# For Each a As MSG In _MSG
# PM_String.Append(a.MsgNo.ToString + "_")
# Next
# PM_String = PM_String.Remove(PM_String.Length - 1, 1)
# Dim result As PrimaryMSG
# result = PrimaryMSGList.Find(Function(x) x.MSGCombination = PM_String.ToString)
# If Not result Is Nothing Then
# Dim Result2 As MSG
# Result2 = _MSG.Find(Function(x) x.MsgNo = result.MsgNo)
# Result2.MsgType = "P"
# Select_Primary_MSG = Result2
# Else
# If Not Silence Then 'No primary message found
# Dim sb As New StringBuilder
# Dim Trip As String = "{0} | {1} | {2}"
# For Each a As MSG In _MSG
# sb.AppendFormat(Trip, a.MsgNo.ToString, String.Format("{0:dd/MM/yy H:mm:ss} {1} ", a.MsgDate, a.MsgDate.Subtract(_MSG(0).MsgDate).TotalMilliseconds), a.MsgText.ToString)
# sb.AppendLine()
# Next
# RaiseEvent PrimaryMessageNotfound(sb.ToString)
# End If
# If _MSG.Count > 1 Then
# For Each a As MSG In _MSG
# If a.MsgNo <> 1056 Then
# Select_Primary_MSG = a
# Exit For
# End If
# Next
# Else
# Select_Primary_MSG = _MSG(0)
# End If
# End If
# End If
# Catch ex As Exception
# BGWCalcState.ReportProgress(-1, "Primary Message Error: " & ex.Message)
# Select_Primary_MSG = Nothing
# End Try
# End Function
# Private Function CheckBWSNotAuto(ByVal Row As Integer) As Boolean
# Dim _LastMSG_Date As DateTime = AMM_List(Row).MsgDate
# CheckBWSNotAuto = False
# For i = Row - 1 To 0 Step -1
# If (_LastMSG_Date - AMM_List(i).MsgDate).TotalMilliseconds < 1500 Then
# If AMM_List(i).MsgNo = MSG_Trigger.BWS_Hand Or AMM_List(i).MsgNo = MSG_Trigger.BWS_AUS Then
# CheckBWSNotAuto = True
# Exit For
# End If
# Else
# Exit For
# End If
# Next
# If Not CheckBWSNotAuto Then
# For i = Row + 1 To AMM_List.Count - 1
# If (AMM_List(i).MsgDate - _LastMSG_Date).TotalMilliseconds < 1500 Then
# If AMM_List(i).MsgNo = MSG_Trigger.BWS_Hand Or AMM_List(i).MsgNo = MSG_Trigger.BWS_AUS Then
# CheckBWSNotAuto = True
# Exit For
# End If
# Else
# Exit For
# End If
# Next
# End If
# End Function
# Private Sub BGWCalcState_ProgressChanged(sender As Object, e As ProgressChangedEventArgs) Handles BGWCalcState.ProgressChanged
# If e.ProgressPercentage < 0 Then
# RaiseEvent ErrorInClass(e.UserState.ToString)
# End If
# End Sub
# #End Region
# #End Region
# End Class
# Source file: lowfat/migrations/0114_auto_20171115_1546.py (repo: elena-kolomeets/lowfat, license: BSD-3-Clause)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.5 on 2017-11-15 15:46
from __future__ import unicode_literals
from django.db import migrations, models
def fix_career_stages(apps, schema_editor): # pylint: disable=unused-argument
Claimant = apps.get_model("lowfat", "Claimant") # pylint: disable=invalid-name
for claimant in Claimant.objects.all():
if claimant.career_stage_when_apply == "E":
claimant.career_stage_when_apply = "2"
elif claimant.career_stage_when_apply == "M":
claimant.career_stage_when_apply = "3"
elif claimant.career_stage_when_apply == "L":
claimant.career_stage_when_apply = "4"
claimant.save()
def reverse_fix_career_stages(apps, schema_editor): # pylint: disable=unused-argument
Claimant = apps.get_model("lowfat", "Claimant") # pylint: disable=invalid-name
for claimant in Claimant.objects.all():
if claimant.career_stage_when_apply == "2":
claimant.career_stage_when_apply = "E"
elif claimant.career_stage_when_apply == "3":
claimant.career_stage_when_apply = "M"
elif claimant.career_stage_when_apply == "4":
claimant.career_stage_when_apply = "L"
claimant.save()
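The two RunPython helpers above translate the old letter codes into the new numeric career-stage codes with if/elif chains. The same forward and reverse translation can be sketched as dict lookups (illustrative only; these names are not part of the migration):

```python
# Old letter codes mapped to the new numeric career-stage codes.
CAREER_STAGE_FORWARD = {"E": "2", "M": "3", "L": "4"}
# Reverse mapping, derived so the two directions stay in sync.
CAREER_STAGE_REVERSE = {new: old for old, new in CAREER_STAGE_FORWARD.items()}

def translate_stage(code, mapping):
    """Return the mapped code, leaving unmapped values unchanged,
    just as the if/elif chains above leave other codes untouched."""
    return mapping.get(code, code)
```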
class Migration(migrations.Migration):
dependencies = [
('lowfat', '0113_merge_20171103_0948'),
]
operations = [
migrations.AddField(
model_name='claimant',
name='activated',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='claimant',
name='department',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='claimant',
name='example_of_writing_url',
field=models.CharField(blank=True, max_length=360),
),
migrations.AddField(
model_name='claimant',
name='google_scholar',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='claimant',
name='group',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='claimant',
name='institutional_website',
field=models.CharField(blank=True, max_length=360),
),
migrations.AddField(
model_name='claimant',
name='interests',
field=models.TextField(blank=True, help_text='25-50 word summary of your professional interests.'),
),
migrations.AddField(
model_name='claimant',
name='is_into_training',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='claimant',
name='job_title_when_apply',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='claimant',
name='linkedin',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='claimant',
name='received_offer',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='claimant',
name='screencast_url',
field=models.CharField(blank=True, max_length=360),
),
migrations.AddField(
model_name='historicalclaimant',
name='activated',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='historicalclaimant',
name='department',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='historicalclaimant',
name='example_of_writing_url',
field=models.CharField(blank=True, max_length=360),
),
migrations.AddField(
model_name='historicalclaimant',
name='google_scholar',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='historicalclaimant',
name='group',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='historicalclaimant',
name='institutional_website',
field=models.CharField(blank=True, max_length=360),
),
migrations.AddField(
model_name='historicalclaimant',
name='interests',
field=models.TextField(blank=True, help_text='25-50 word summary of your professional interests.'),
),
migrations.AddField(
model_name='historicalclaimant',
name='is_into_training',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='historicalclaimant',
name='job_title_when_apply',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='historicalclaimant',
name='linkedin',
field=models.CharField(blank=True, max_length=120),
),
migrations.AddField(
model_name='historicalclaimant',
name='received_offer',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='historicalclaimant',
name='screencast_url',
field=models.CharField(blank=True, max_length=360),
),
migrations.AlterField(
model_name='claimant',
name='career_stage_when_apply',
field=models.CharField(choices=[('1', 'Phase 1 - Junior (e.g. PhD candidate, Junior Research Software Engineer)'), ('2', 'Early (e.g Research Assistant\\/Associate, first grant holder, Lecturer, Research Software Engineer)'), ('3', 'Mid \\/ Recognised (e.g. Senior Lecturer, Reader, Senior Researcher, Senior Research Software Engineer, Research Software Group Leader)'), ('4', 'Established \\/ Experienced \\/ Senior (e.g. Professor, Director of Research Computing, Distinguished Engineer, Chief Data Scientist)')], default='M', max_length=1),
),
migrations.AlterField(
model_name='claimant',
name='photo',
field=models.FileField(blank=True, help_text='A professionally oriented (i.e. work related) main picture of yourself that you are happy to be published on the web - this should be 300px wide and 400px high (exact please).', null=True, upload_to='photos/'),
),
migrations.AlterField(
model_name='claimant',
name='work_description',
field=models.TextField(blank=True, help_text='200-300 words describing the work you do, this can include your plans for Fellowship.'),
),
migrations.AlterField(
model_name='historicalclaimant',
name='career_stage_when_apply',
field=models.CharField(choices=[('1', 'Phase 1 - Junior (e.g. PhD candidate, Junior Research Software Engineer)'), ('2', 'Early (e.g Research Assistant\\/Associate, first grant holder, Lecturer, Research Software Engineer)'), ('3', 'Mid \\/ Recognised (e.g. Senior Lecturer, Reader, Senior Researcher, Senior Research Software Engineer, Research Software Group Leader)'), ('4', 'Established \\/ Experienced \\/ Senior (e.g. Professor, Director of Research Computing, Distinguished Engineer, Chief Data Scientist)')], default='M', max_length=1),
),
migrations.AlterField(
model_name='historicalclaimant',
name='photo',
field=models.TextField(blank=True, help_text='A professionally oriented (i.e. work related) main picture of yourself that you are happy to be published on the web - this should be 300px wide and 400px high (exact please).', max_length=100, null=True),
),
migrations.AlterField(
model_name='historicalclaimant',
name='work_description',
field=models.TextField(blank=True, help_text='200-300 words describing the work you do, this can include your plans for Fellowship.'),
),
migrations.RunPython(
fix_career_stages,
reverse_fix_career_stages
),
]
# Source file: turbo_seti/__init__.py (repo: PetchMa/turbo_seti, license: MIT)
try:
    from .findoppler import seti_event, find_candidates, plot_candidates
except ImportError:
    from findoppler import seti_event, find_candidates, plot_candidates
# Source file: test_unit.py (repo: romainledru/Postgres-EasyTalk, license: MIT)
import pytest
from easytalk import *
from easytalk.exceptions_raise import UnitError
### TABLE ###
class Test_Table:
def test_create_phrase(self):
p = {
'compulsory': False,
}
table = Table('easyTalk-db','tabletest')
table.add_serialField()
table.add_datetimeField()
table.add_varcharField('name')
table.add_intField('price')
table.add_booleanField('to_buy', p)
table.write_TABLE()
ist = table.phrase_test_unit
soll = "CREATE TABLE tabletest (id SERIAL PRIMARY KEY, datetime timestamp with time zone DEFAULT NOW(), name text NOT NULL, price integer NOT NULL, to_buy boolean);"
table.drop_TABLE()
assert soll == ist
    def test_create(self):
        table = Table('easyTalk-db','tabletest')
        table.write_TABLE()
        man = Manager('easyTalk-db')
        answer = man.scan_database()
        man.shutdown_manager()
        table.drop_TABLE()
        assert table in answer # fail loudly if the created table is not listed
    def test_create_containsId(self):
        table = Table('easyTalk-db','tabletest')
        table.write_TABLE()
        tablePick = Table('easyTalk-db','tabletest')
        try:
            if 'id' not in tablePick.patron:
                raise
        except:
            raise UnitError('id','not found in patron')
        finally:
            tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
## TEST SERIAL ##
def test_create_addSerialId(self):
nameId = 'id'
table = Table('easyTalk-db','tabletest')
table.add_serialField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_serial_pattern(self):
nameId = 'id'
table = Table('easyTalk-db','tabletest')
table.add_serialField()
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': False,
'primary': True,
'type': int,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## TEST DATETIME ##
def test_create_addDatetime(self):
nameId = 'created_at'
table = Table('easyTalk-db','tabletest')
table.add_datetimeField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_datetime_pattern(self):
nameId = 'created_at'
table = Table('easyTalk-db','tabletest')
table.add_datetimeField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': False,
'primary': False,
'type': datetime.datetime,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## TEST VARCHAR ##
def test_create_addVarchar(self):
nameId = 'varchar1'
table = Table('easyTalk-db','tabletest')
table.add_varcharField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_varchar_pattern(self):
nameId = 'varchar1'
table = Table('easyTalk-db','tabletest')
table.add_varcharField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': True,
'primary': False,
'type': str,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## TEST INT ##
def test_create_addInt(self):
nameId = 'int1'
table = Table('easyTalk-db','tabletest')
table.add_intField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_int_pattern(self):
nameId = 'int1'
table = Table('easyTalk-db','tabletest')
table.add_intField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': True,
'primary': False,
'type': int,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## TEST FLOAT ##
def test_create_addFloat(self):
nameId = 'float1'
table = Table('easyTalk-db','tabletest')
table.add_floatField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_float_pattern(self):
nameId = 'float1'
table = Table('easyTalk-db','tabletest')
table.add_floatField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': True,
'primary': False,
'type': float,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## TEST BOOL ##
def test_create_addBool(self):
nameId = 'bool1'
table = Table('easyTalk-db','tabletest')
table.add_booleanField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
try:
if nameId not in tablePick.patron:
raise
except:
raise UnitError(nameId,'not found in patron')
finally:
tablePick.drop_TABLE()
        assert True # if no problem occurred up to here
def test_create_boolean_pattern(self):
nameId = 'bool1'
table = Table('easyTalk-db','tabletest')
table.add_booleanField(nameId)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': True,
'primary': False,
'type': bool,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## ALTERED COMPULSORY ##
def test_create_compulsory_altered_pattern(self):
nameId = 'bool1'
p = {
'compulsory': False
}
table = Table('easyTalk-db','tabletest')
table.add_booleanField(nameId, p)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': False,
'primary': False,
'type': bool,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
## ALTERED TYPE ##
def test_create_type_altered_pattern(self):
nameId = 'bool1'
p = {
'type': int
}
table = Table('easyTalk-db','tabletest')
table.add_booleanField(nameId, p)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
pattern = {
'compulsory': True,
'primary': False,
'type': bool,
}
problemTag = ''
try:
for key, value in tablePick.patron[nameId].items():
if tablePick.patron[nameId][key] != pattern[key]:
problemTag = '{},{}'.format(key, value)
except:
raise UnitError(problemTag, 'error in try')
finally:
tablePick.drop_TABLE()
if problemTag == '':
assert True
else:
raise UnitError(problemTag, 'error in pattern')
class Test_Insert:
def test_insert_phrase(self):
p = {
'compulsory': False,
}
table = Table('easyTalk-db','tabletest')
table.add_serialField()
table.add_datetimeField()
table.add_varcharField('name')
table.add_intField('price')
table.add_booleanField('to_buy')
table.add_booleanField('to_buy_nc', p)
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
insert = Insert(tablePick)
entry = {
'name': 'name1',
'price': 30,
'to_buy': True,
}
insert.write_ENTRY(entry)
ist = insert.phrase_test_unit
soll = "INSERT INTO tabletest (name, price, to_buy) VALUES ('name1', 30, True);"
table.drop_TABLE()
assert soll == ist
def test_insert_compulsoryMissing(self):
table = Table('easyTalk-db','tabletest')
table.add_varcharField('name') # 'name' is set as compulsory=True
table.add_intField('price')
table.add_booleanField('to_buy')
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
insert = Insert(tablePick)
entry = { # 'name' is not given for the test
'price': 30,
'to_buy': True,
}
try:
insert.write_ENTRY(entry)
raise UnitError(entry, "should raise compulsory=True for 'name', but didn't")
except CompulsoryEntry:
assert True
finally:
table.drop_TABLE()
def test_insert_typeWrong(self):
table = Table('easyTalk-db','tabletest')
table.add_varcharField('name')
table.add_intField('price')
table.add_booleanField('to_buy') # bool is set as type=bool
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
insert = Insert(tablePick)
entry = {
'name': 'name1',
'price': 30,
            'to_buy': 10, # an int is passed where a bool is expected, for the test
}
try:
insert.write_ENTRY(entry)
raise UnitError(entry, "should raise TypeError (type=int and should str) for 'name', but didn't")
except TypeError:
assert True
finally:
table.drop_TABLE()
def test_insert_wrongKey(self):
table = Table('easyTalk-db','tabletest')
table.add_varcharField('name')
table.add_intField('price')
table.add_booleanField('to_buy')
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
insert = Insert(tablePick)
entry = {
'name': 'name1',
'price': 30,
'to_buy': True,
'hello_I_should_not_be_here': 10,
}
try:
insert.write_ENTRY(entry)
raise UnitError(entry, "should raise WrongEntry for 'hello_I_should_not_be_here', but didn't")
except WrongEntry:
assert True
finally:
table.drop_TABLE()
def test_insert_doubleKey(self):
table = Table('easyTalk-db','tabletest')
table.add_varcharField('name')
table.add_intField('price')
table.add_booleanField('to_buy')
table.write_TABLE()
tablePick = Table('easyTalk-db','tabletest')
insert = Insert(tablePick)
entry = {
'name': 'name1',
'price': 30,
'to_buy': True,
            'price': 10, # duplicate key: the dict literal keeps only this last value
}
try:
insert.write_ENTRY(entry)
        except Exception as exc:
            raise UnitError(entry, "should be ok to write the value twice for a key, but didn't") from exc
finally:
table.drop_TABLE()
assert True
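# Standalone sketch (not part of the original suite): the double-key entry in
# test_insert_doubleKey never actually reaches write_ENTRY with two 'price'
# keys, because Python dict literals deduplicate at parse time and keep only
# the last value for a repeated key.
def _double_key_demo():
    # duplicate 'price' key in the literal; only the last one (10) survives
    return {'name': 'name1', 'price': 30, 'price': 10}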
class Test_Read:
# verify phrase
# verify filter with one key/value
# verify filter with two key/value
# verify filter with wrong key
# verify filter with wrong value
pass
class Test_Delete: # one insert, one read, one delete, one read
# verify phrase
# verify one key/data
# verify two key/data
    # verify wrong entry (as key)
    # verify wrong data (as value)
pass | 28.393885 | 173 | 0.538291 | 1,561 | 15,787 | 5.330557 | 0.091608 | 0.051676 | 0.075712 | 0.121139 | 0.821055 | 0.81096 | 0.79041 | 0.79041 | 0.77731 | 0.751592 | 0 | 0.003233 | 0.353519 | 15,787 | 556 | 174 | 28.393885 | 0.812071 | 0.046367 | 0 | 0.824074 | 0 | 0.002315 | 0.148773 | 0.003668 | 0 | 0 | 0 | 0 | 0.050926 | 1 | 0.050926 | false | 0.00463 | 0.006944 | 0 | 0.06713 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# ---------------------------------------------------------------------------
# dlbeamformers.py (from repo dung-n-tran/dlbeamformer, MIT license)
# ---------------------------------------------------------------------------
import numpy as np
from dlbeamformer_utilities import (
    compute_mvdr_tf_beamformers,
    check_distortless_constraint,
    compute_steering_vectors,
    compute_null_controlling_tf_beamformers,
    compute_null_controlling_minibatch_tf_beamformers,
    compute_null_controlling_tf_beamformers_2,
)
from tqdm import tnrange, tqdm
from sklearn.linear_model import orthogonal_mp_gram
from omp import omp
class DictionaryLearningBeamformer(object):
def __init__(self, array_geometry, sampling_frequency,
source_angles, stft_params, angle_grid, diagonal_loading_param=1, bf_type="NC"):
"""
Parameters
----------
array_geometry: 2-D numpy array describing the geometry of the microphone array
        sampling_frequency: sampling frequency of the microphone signals
stft_params: Dictionary of STFT transform parameters including
stft_params["n_samples_per_frame"]
stft_params["n_fft_bins"]
stft_params["hop_size"]
stft_params["window"]
bf_type: Type of the beamformer
"""
self.array_geometry = array_geometry
self.sampling_frequency = sampling_frequency
self.source_angles = source_angles
self.stft_params = stft_params
self.angle_grid = angle_grid
self.diagonal_loading_param = diagonal_loading_param
self.bf_type = bf_type
self.weights_ = None
self.source_steering_vectors = self._compute_source_steering_vectors()
self.steering_vectors = self._compute_steering_vectors()
def _compute_source_steering_vectors(self):
source_steering_vectors = []
for i_source_angle, source_angle in enumerate(self.source_angles):
v = compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
source_angle["elevation"], source_angle["azimuth"])
source_steering_vectors.append(v)
return source_steering_vectors
def _compute_steering_vectors(self):
return compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
self.angle_grid["elevation"], self.angle_grid["azimuth"])
def _compute_weights(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99, batch_size=1, n_atoms_each_config=1):
n_configurations = len(training_data)
n_train_samples_each_config, n_fft_bins, n_mics, _ = training_data[0][0].shape
n_sources = len(self.source_steering_vectors)
D = np.zeros((n_sources, n_fft_bins, n_mics, n_configurations*n_atoms_each_config), dtype=complex)
for i_source in range(n_sources):
for i_configuration in tqdm(range(n_configurations), desc="Training configuration"):
for i_atom in range(n_atoms_each_config):
batch_indices = np.random.choice(len(training_data[i_configuration][0]), batch_size, replace=True)
tf_sample_covariance_batch = training_data[i_configuration][0][batch_indices]
# print(training_data[i_configuration][0].shape, tf_sample_covariance_batch.shape, batch_indices)
null_azimuth_range = self._compute_null_angle_ranges(
training_data[i_configuration][1]["azimuth"], desired_null_width)
null_steering_vectors = compute_steering_vectors(
self.array_geometry, self.sampling_frequency,
self.stft_params["n_fft_bins"],
np.unique(training_data[i_configuration][1]["elevation"]),
np.unique(null_azimuth_range)
)
null_steering_vectors = np.transpose(null_steering_vectors[:, :, 0, :], (0, 2, 1))
w = compute_null_controlling_tf_beamformers_2(
self.source_steering_vectors[i_source][:, 0, 0, :], null_steering_vectors,
tf_sample_covariance_batch,
null_constraint_threshold,
                        eigenvalue_percentage_threshold=eigenvalue_percentage_threshold, diagonal_loading_param=self.diagonal_loading_param)
D[i_source, :, :, i_configuration*n_atoms_each_config + i_atom] = w
return D
def _compute_null_angle_ranges(self, null_angles, desired_null_width):
angle_ranges = []
for null_angle in null_angles:
angle_ranges.append(
np.arange(null_angle - desired_null_width/2,
null_angle + desired_null_width/2, 0.1))
return np.concatenate(angle_ranges)
# def _initialize(self, X):
# pass
def _choose_weights(self, source_angle_index, x):
weights_ = self.weights_[source_angle_index]
n_fft_bins, n_mics, n_dictionary_atoms = weights_.shape
min_ave_energy = np.inf
optimal_weight_index = None
for i_dictionary_atom in range(n_dictionary_atoms):
w_frequency = weights_[:, :, i_dictionary_atom]
energy = 0
n_fft_bins = w_frequency.shape[0]
for i_fft_bin in range(n_fft_bins):
w = w_frequency[i_fft_bin]
R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate())
energy += np.real(w.transpose().conjugate().dot(R).dot(w))
ave_energy = energy / n_fft_bins
if min_ave_energy > ave_energy:
min_ave_energy = ave_energy
optimal_weight_index = i_dictionary_atom
optimal_weights = weights_[:, :, optimal_weight_index]
# optimal_weights = np.zeros((n_fft_bins, n_mics), dtype=np.complex64)
# # for i_fft_bin in tqdm(range(n_fft_bins), desc="FFT bin"):
# for i_fft_bin in range(n_fft_bins):
# R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate()) + 1*self.diagonal_loading_param*np.identity(n_mics)
# W = weights_[i_fft_bin]
# i_fft_optimal_weight_index = np.argmin(np.diagonal(np.abs(W.transpose().conjugate().dot(
# R).dot(W))))
# optimal_weights[i_fft_bin] = weights_[i_fft_bin, :, i_fft_optimal_weight_index]
return optimal_weights
def fit(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99, batch_size=1, n_atoms_each_config=1):
        """
        Learn the dictionary of null-controlling beamformer weights from
        training_data and store it in self.weights_.
        """
D = self._compute_weights(training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold, batch_size, n_atoms_each_config)
self.weights_ = D
return self
def choose_weights(self, source_angle_index, x):
return self._choose_weights(source_angle_index, x)
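# Minimal pure-Python sketch (hypothetical, not using numpy) of what
# _compute_null_angle_ranges above does for a single null angle: it spans a
# window of desired_null_width degrees centred on the angle, sampled every
# 0.1 degrees, mirroring np.arange(angle - width/2, angle + width/2, 0.1).
def _null_angle_range_demo(null_angle, desired_null_width, step=0.1):
    n_points = int(round(desired_null_width / step))
    return [null_angle - desired_null_width / 2 + i * step
            for i in range(n_points)]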
class DLBeamformer(object):
def __init__(self, array_geometry, sampling_frequency,
source_angles, stft_params, angle_grid, diagonal_loading_param=1,
n_dict_atoms=None, n_nonzero_coefficients=None,
n_train_max_iterations=100, train_error_tolerance=1e-6, bf_type=None):
"""
Parameters
----------
array_geometry: 2-D numpy array describing the geometry of the microphone array
        sampling_frequency: sampling frequency of the microphone signals
stft_params: Dictionary of STFT transform parameters including
stft_params["n_samples_per_frame"]
stft_params["n_fft_bins"]
stft_params["hop_size"]
stft_params["window"]
bf_type: Type of the beamformer
"""
print("Initialize DLBeamformer")
self.array_geometry = array_geometry
self.sampling_frequency = sampling_frequency
self.source_angles = source_angles
self.stft_params = stft_params
self.angle_grid = angle_grid
self.diagonal_loading_param = diagonal_loading_param
self.bf_type = bf_type
self.weights_ = None
self.source_steering_vectors = self._compute_source_steering_vectors()
self.steering_vectors = self._compute_steering_vectors()
self.n_dict_atoms = n_dict_atoms
self.n_train_max_iterations = n_train_max_iterations
self.n_nonzero_coefficients = n_nonzero_coefficients
self.train_error_tolerance = train_error_tolerance
self.training_loss = []
def _compute_source_steering_vectors(self):
source_steering_vectors = []
for i_source_angle, source_angle in enumerate(self.source_angles):
v = compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
source_angle["elevation"], source_angle["azimuth"])
source_steering_vectors.append(v)
return source_steering_vectors
def _compute_steering_vectors(self):
return compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
self.angle_grid["elevation"], self.angle_grid["azimuth"])
def _compute_weights(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99,
batch_size=1, n_train_batches_each_config=1, random_seed=0):
np.random.seed(random_seed)
n_configurations = len(training_data)
n_train_samples_each_config, n_fft_bins, n_mics, _ = training_data[0][0].shape
n_sources = len(self.source_steering_vectors)
        if self.n_dict_atoms is None:
self.n_dict_atoms = n_configurations
# Initialization
# dictionary = np.random.randn(n_sources, n_fft_bins, n_mics, self.n_dict_atoms) + \
# 1j*np.random.randn(n_sources, n_fft_bins, n_mics, self.n_dict_atoms)
# coefficients = np.zeros((n_sources, n_fft_bins, self.n_dict_atoms,
# n_configurations*n_train_batches_each_config),
# dtype=np.complex64)
coefficients = np.random.randn(n_sources, n_fft_bins, self.n_dict_atoms,
n_configurations*n_train_batches_each_config)\
+ 1j*np.random.randn(n_sources, n_fft_bins, self.n_dict_atoms,
n_configurations*n_train_batches_each_config)
dictionary = np.zeros((n_sources, n_fft_bins, n_mics, self.n_dict_atoms), dtype=np.complex64)
# Compute desired weights
desired_weights = np.zeros((n_sources, n_fft_bins, n_mics,
n_configurations*n_train_batches_each_config),
dtype=np.complex64)
# for i_configuration in tqdm(range(n_configurations), desc=" Training configuration"):
for i_source in range(n_sources):
for i_configuration in range(n_configurations):
for i_batch in range(n_train_batches_each_config):
# Get a batch of data
batch_indices = np.random.choice(len(training_data[i_configuration][0]), batch_size, replace=True)
tf_sample_covariance_batch = training_data[i_configuration][0][batch_indices]
# Compute null steering vectors for nulling constraints
null_azimuth_range = self._compute_null_angle_ranges(
training_data[i_configuration][1]["azimuth"], desired_null_width)
null_steering_vectors = compute_steering_vectors(
self.array_geometry, self.sampling_frequency,
self.stft_params["n_fft_bins"],
np.unique(training_data[i_configuration][1]["elevation"]),
np.unique(null_azimuth_range)
)
null_steering_vectors = np.transpose(null_steering_vectors[:, :, 0, :], (0, 2, 1))
# Compute desired weights for the selected batch of data
w = compute_null_controlling_tf_beamformers_2(
self.source_steering_vectors[i_source][:, 0, 0, :], null_steering_vectors,
tf_sample_covariance_batch,
null_constraint_threshold,
                        eigenvalue_percentage_threshold=eigenvalue_percentage_threshold, diagonal_loading_param=self.diagonal_loading_param)
desired_weights[i_source, :, :, i_configuration*n_train_batches_each_config + i_batch] = w
n_desired_weights = desired_weights.shape[3]
# Training loop
for i_source in range(n_sources):
for i_train_iteration in tqdm(range(self.n_train_max_iterations), desc="Training iteration"):
# Each config
i_iteration_train_loss = 0
                # Update dictionary given the sparse coefficients
for i_fft_bin in range(n_fft_bins):
### Update dictionary
dictionary[i_source][i_fft_bin] = desired_weights[i_source][i_fft_bin].dot(
np.linalg.pinv(coefficients[i_source][i_fft_bin])
)
# Update sparse coefficients given the dictionary
for i_fft_bin in range(n_fft_bins):
for i_sample in range(n_desired_weights):
coefficients[i_source, i_fft_bin, :, i_sample] = omp(
dictionary[i_source][i_fft_bin],
desired_weights[i_source, i_fft_bin, :, i_sample],
nonneg=False, ncoef=self.n_nonzero_coefficients,
tol=self.train_error_tolerance, verbose=False
).coef
for i_fft_bin in range(n_fft_bins):
i_iteration_train_loss += 0.5 * (1./n_desired_weights) \
* np.linalg.norm(
dictionary[i_source][i_fft_bin].dot(
coefficients[i_source][i_fft_bin]) - \
desired_weights[i_source][i_fft_bin]
)**2
i_iteration_train_loss = i_iteration_train_loss / n_fft_bins
print("\t\tTrain loss at current iteration {:.9f}".format(i_iteration_train_loss))
self.training_loss.append(i_iteration_train_loss)
return dictionary, coefficients, desired_weights
def _compute_null_angle_ranges(self, null_angles, desired_null_width):
angle_ranges = []
for null_angle in null_angles:
angle_ranges.append(
np.arange(null_angle - desired_null_width/2,
null_angle + desired_null_width/2, 0.1))
return np.concatenate(angle_ranges)
# def _initialize(self, X):
# pass
def _choose_weights(self, source_angle_index, x):
weights_ = self.weights_[source_angle_index]
n_fft_bins, n_mics, n_dictionary_atoms = weights_.shape
min_ave_energy = np.inf
optimal_weight_index = None
for i_dictionary_atom in range(n_dictionary_atoms):
w_frequency = weights_[:, :, i_dictionary_atom]
energy = 0
n_fft_bins = w_frequency.shape[0]
for i_fft_bin in range(n_fft_bins):
w = w_frequency[i_fft_bin]
R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate())
energy += np.real(w.transpose().conjugate().dot(R).dot(w))
ave_energy = energy / n_fft_bins
if min_ave_energy > ave_energy:
min_ave_energy = ave_energy
optimal_weight_index = i_dictionary_atom
optimal_weights = weights_[:, :, optimal_weight_index]
# optimal_weights = np.zeros((n_fft_bins, n_mics), dtype=np.complex64)
# # for i_fft_bin in tqdm(range(n_fft_bins), desc="FFT bin"):
# for i_fft_bin in range(n_fft_bins):
# R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate()) + 1*self.diagonal_loading_param*np.identity(n_mics)
# W = weights_[i_fft_bin]
# i_fft_optimal_weight_index = np.argmin(np.diagonal(np.abs(W.transpose().conjugate().dot(
# R).dot(W))))
# optimal_weights[i_fft_bin] = weights_[i_fft_bin, :, i_fft_optimal_weight_index]
return optimal_weights
def fit(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99,
batch_size=1, n_train_batches_each_config=1, random_seed=0):
        """
        Learn the dictionary of null-controlling beamformer weights from
        training_data and store it in self.weights_.
        """
        # _compute_weights returns (dictionary, coefficients, desired_weights);
        # _choose_weights expects only the learned dictionary as the weights.
        dictionary, coefficients, desired_weights = self._compute_weights(
            training_data, desired_null_width,
            null_constraint_threshold, eigenvalue_percentage_threshold,
            batch_size, n_train_batches_each_config, random_seed)
        self.weights_ = dictionary
        return self
def choose_weights(self, source_angle_index, x):
return self._choose_weights(source_angle_index, x)
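# Hypothetical miniature of the selection rule in _choose_weights above,
# written without numpy: for each candidate weight atom w, average the output
# energy Re(w^H R w) over frequency bins and keep the atom with the smallest
# average. Names and data layout here are illustrative, not part of the class.
def _select_min_energy_atom(weight_atoms, covariances):
    # weight_atoms: list of atoms; each atom is a per-bin list of complex
    #   weight vectors. covariances: per-bin list of covariance matrices R.
    def quad_form(w, R):
        # Re(w^H R w)
        return sum(
            (w[i].conjugate() * R[i][j] * w[j]).real
            for i in range(len(w)) for j in range(len(w))
        )
    best_index, best_energy = None, float('inf')
    for index, atom in enumerate(weight_atoms):
        energy = sum(quad_form(w, R) for w, R in zip(atom, covariances))
        energy /= len(covariances)
        if energy < best_energy:
            best_energy, best_index = energy, index
    return best_index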
class DLBatchBeamformer(object):
def __init__(self, array_geometry, sampling_frequency,
source_angles, stft_params, angle_grid, bf_type="NC"):
"""
Parameters
----------
array_geometry: 2-D numpy array describing the geometry of the microphone array
        sampling_frequency: sampling frequency of the microphone signals
stft_params: Dictionary of STFT transform parameters including
stft_params["n_samples_per_frame"]
stft_params["n_fft_bins"]
stft_params["hop_size"]
stft_params["window"]
bf_type: Type of the beamformer
"""
print("Initialize DL Batch Beamformer")
self.array_geometry = array_geometry
self.sampling_frequency = sampling_frequency
self.source_angles = source_angles
self.stft_params = stft_params
self.angle_grid = angle_grid
self.bf_type = bf_type
self.weights_ = None
self.source_steering_vectors = self._compute_source_steering_vectors()
self.steering_vectors = self._compute_steering_vectors()
def _compute_source_steering_vectors(self):
source_steering_vectors = []
for i_source_angle, source_angle in enumerate(self.source_angles):
v = compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
source_angle["theta"], source_angle["phi"])
source_steering_vectors.append(v)
return source_steering_vectors
def _compute_steering_vectors(self):
return compute_steering_vectors(self.array_geometry,
self.sampling_frequency, self.stft_params["n_fft_bins"],
self.angle_grid["theta"], self.angle_grid["phi"])
def _compute_weights(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99,
batch_size=1, n_atoms_each_config=1):
n_configurations = len(training_data)
n_fft_bins, n_mics, _ = training_data[0][0][0].shape
n_sources = len(self.source_steering_vectors)
D = np.zeros((n_sources, n_fft_bins, n_mics, n_configurations*n_atoms_each_config), dtype=complex)
for i_source in range(n_sources):
for i_configuration in tqdm(range(n_configurations), desc="Configuration"):
for i_atom in range(n_atoms_each_config):
train_data_indices = np.random.choice(len(training_data[i_configuration][0]), batch_size)
tf_sample_covariance_matrices = training_data[i_configuration][0][train_data_indices]
null_angle_range = self._compute_null_angle_ranges(
training_data[i_configuration][1]["theta"], desired_null_width)
null_steering_vectors = compute_steering_vectors(
self.array_geometry, self.sampling_frequency,
self.stft_params["n_fft_bins"],
np.unique(null_angle_range), np.unique(training_data[i_configuration][1]["phi"])
)
null_steering_vectors = np.transpose(null_steering_vectors[:, :, 0, :], (0, 2, 1))
w = compute_null_controlling_minibatch_tf_beamformers(
self.source_steering_vectors[i_source][:, 0, 0, :], null_steering_vectors,
tf_sample_covariance_matrices,
null_constraint_threshold,
                        eigenvalue_percentage_threshold=eigenvalue_percentage_threshold)
D[i_source, :, :, i_configuration*n_atoms_each_config + i_atom] = w
return D
def _compute_null_angle_ranges(self, null_thetas, desired_null_width):
theta_ranges = []
for null_theta in null_thetas:
theta_ranges.append(
np.arange(null_theta - desired_null_width/2,
null_theta + desired_null_width/2, 0.1))
return np.concatenate(theta_ranges)
# def _initialize(self, X):
# pass
def _choose_weights(self, source_angle_index, x):
weights_ = self.weights_[source_angle_index]
n_fft_bins, n_mics, n_dictionary_atoms = weights_.shape
# min_ave_energy = np.inf
# optimal_weight_index = None
# for i_dictionary_atom in range(n_dictionary_atoms):
# w_frequency = weights_[:, :, i_dictionary_atom]
# energy = 0
# n_fft_bins = w_frequency.shape[0]
# for i_fft_bin in range(n_fft_bins):
# w = w_frequency[i_fft_bin]
# R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate())
# energy += np.real(w.transpose().conjugate().dot(R).dot(w))
# ave_energy = energy / n_fft_bins
# if min_ave_energy > ave_energy:
# min_ave_energy = ave_energy
# optimal_weight_index = i_dictionary_atom
# optimal_weight = weights_[:, :, optimal_weight_index]
optimal_weights = np.zeros((n_fft_bins, n_mics), dtype=np.complex64)
for i_fft_bin in tqdm(range(n_fft_bins), desc="FFT bin"):
R = x[i_fft_bin].dot(x[i_fft_bin].transpose().conjugate())
W = weights_[i_fft_bin]
i_fft_optimal_weight_index = np.argmin(np.diagonal(np.abs(W.transpose().conjugate().dot(
R).dot(W))))
optimal_weights[i_fft_bin] = weights_[i_fft_bin, :, i_fft_optimal_weight_index]
return optimal_weights
def fit(self, training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold=0.99,
batch_size=1, n_atoms_each_config=1):
        """
        Learn the dictionary of null-controlling beamformer weights from
        training_data and store it in self.weights_.
        """
D = self._compute_weights(training_data, desired_null_width,
null_constraint_threshold, eigenvalue_percentage_threshold,
batch_size, n_atoms_each_config)
self.weights_ = D
return self
def choose_weights(self, source_angle_index, x):
        return self._choose_weights(source_angle_index, x)
# ---------------------------------------------------------------------------
# GeometrA/src/finder/matcher.py (from repo NTUTVisualScript/GeometrA,
# Apache-2.0 license)
# ---------------------------------------------------------------------------
class Matcher:
def next(self):
raise NotImplementedError("Subclasses should implement this!")
def all(self):
raise NotImplementedError("Subclasses should implement this!")
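# Hypothetical concrete implementation (not part of GeometrA) illustrating the
# Matcher contract above: next() yields one match at a time and raises
# StopIteration when exhausted, while all() returns every match at once.
# Written standalone here rather than subclassing Matcher so the sketch is
# self-contained.
class ListMatcher:
    def __init__(self, matches):
        self._matches = list(matches)
        self._index = 0
    def next(self):
        if self._index >= len(self._matches):
            raise StopIteration
        match = self._matches[self._index]
        self._index += 1
        return match
    def all(self):
        return list(self._matches)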
# ---------------------------------------------------------------------------
# baoming/webapp/controller/search/search_common.py
# (from repo hanxiaoshun/RegistrationSystem, Apache-2.0 license)
# ---------------------------------------------------------------------------
from webapp.forms_search.StudentSearchForm import *
from webapp.forms_search.UserInfoSearchForm import *
from webapp.forms_search.PageSearchForm import *
from webapp.models import *
sys_msg = '报名系统'  # "Registration System"
result = {'status': True, 'message': ''}
def get_student_by_conditions(request, chemical_worker):
    """
    Query students by the given search conditions.
    :param request:
    :param chemical_worker: 1 = chemical industry, 2 = non-chemical industry
    :return:
    """
student_search = StudentSearchForm(request.POST)
user_search = UserInfoSearchForm(request.POST)
if student_search.is_valid():
if user_search.is_valid():
            # User information
kwargs = {}
# kwargs['confirm_status'] = 1,
# kwargs['chemical_worker'] = 1,
user_info = UserInfo()
real_name = user_search.cleaned_data.get('real_name', None)
if real_name:
user_info.real_name = real_name
kwargs['user_info__real_name__icontains'] = real_name
work_unit = user_search.cleaned_data.get('work_unit', None)
if work_unit:
user_info.work_unit = work_unit
kwargs['user_info__work_unit__icontains'] = work_unit
education_degree = user_search.cleaned_data.get('education_degree', None)
if education_degree:
user_info.education_degree = education_degree
if type(education_degree) == str:
if int(education_degree) > 0:
kwargs['user_info__education_degree__id'] = int(education_degree)
else:
kwargs['user_info__education_degree__id'] = education_degree
            # Student information
student_info = StudentInfo()
identification_level = student_search.cleaned_data.get('identification_level', None)
declaration_of_occupation = student_search.cleaned_data.get('declaration_of_occupation', None)
teacher_info = student_search.cleaned_data.get('teacher_info', None)
school_term = student_search.cleaned_data.get('school_term', None)
if identification_level:
student_info.identification_level = identification_level
if type(identification_level) == str:
if int(identification_level) > 0:
kwargs['identification_level'] = int(identification_level)
else:
kwargs['identification_level'] = identification_level
if declaration_of_occupation:
student_info.declaration_of_occupation = declaration_of_occupation
kwargs['declaration_of_occupation__icontains'] = declaration_of_occupation
if teacher_info:
if type(teacher_info) == str:
if int(teacher_info) > 0:
kwargs['teacher_info'] = int(teacher_info)
student_info.teacher_info = TeacherInfo.objects.get(id=int(teacher_info))
else:
kwargs['teacher_info'] = teacher_info
student_info.teacher_info = TeacherInfo.objects.get(id=teacher_info)
if school_term:
if type(school_term) == str:
if int(school_term) > 0:
kwargs['school_term'] = int(school_term)
student_info.school_term = SchoolTerm.objects.get(id=int(school_term))
else:
kwargs['school_term'] = school_term
student_info.school_term = SchoolTerm.objects.get(id=school_term)
student_info.user_info = user_info
print(kwargs)
if chemical_worker == 0:
student_infos = StudentInfo.objects.filter(confirm_status=1,
**kwargs).order_by('-id')
else:
student_infos = StudentInfo.objects.filter(confirm_status=1,
chemical_worker=chemical_worker,
**kwargs).order_by('-id')
return student_infos
else:
return None
else:
return None
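# Isolated sketch of the dynamic-filter pattern used above: optional search
# fields are collected into a dict of Django-style field lookups and later
# splatted into .filter(**kwargs), so empty fields simply add no constraint.
# The lookups mirror those built above, but this helper itself is
# hypothetical and not part of the original module.
def _build_student_filters(real_name=None, work_unit=None, education_degree=None):
    kwargs = {}
    if real_name:
        kwargs['user_info__real_name__icontains'] = real_name
    if work_unit:
        kwargs['user_info__work_unit__icontains'] = work_unit
    if education_degree and int(education_degree) > 0:
        kwargs['user_info__education_degree__id'] = int(education_degree)
    return kwargs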
def get_student_by_conditions_status(request, submit_status=1,
review_status=1,
confirm_status=1,
pay_status=1,
cancel_status=2,
cancel_result=2,
teacher_cancel_status=2,
teacher_cancel_result=2,
examination_status=2,
chemical_worker=0,
record_status=1):
    """
    Query students by the given search conditions and status flags.
    :param request:
    :param submit_status: submission status
    :param review_status: review status
    :param confirm_status: confirmation status
    :param pay_status: whether the fee has been paid
    :param cancel_status: whether cancellation has been requested
    :param cancel_result: result of the cancellation request
    :param teacher_cancel_status: cancellation requested by the person in charge
    :param teacher_cancel_result: result of that cancellation request
    :param examination_status: examination passed; defaults to not passed
    :param chemical_worker: chemical-industry student flag; the default 0 does not distinguish chemical from non-chemical
    :param record_status: status of the student record
    :return:
    """
student_search = StudentSearchForm(request.POST)
user_search = UserInfoSearchForm(request.POST)
if student_search.is_valid():
if user_search.is_valid():
            # User information
kwargs = {}
# kwargs['confirm_status'] = 1,
# kwargs['chemical_worker'] = 1,
user_info = UserInfo()
real_name = user_search.cleaned_data.get('real_name', None)
if real_name:
user_info.real_name = real_name
kwargs['user_info__real_name__icontains'] = real_name
work_unit = user_search.cleaned_data.get('work_unit', None)
if work_unit:
user_info.work_unit = work_unit
kwargs['user_info__work_unit__icontains'] = work_unit
education_degree = user_search.cleaned_data.get('education_degree', None)
if education_degree:
user_info.education_degree = education_degree
if type(education_degree) == str:
if int(education_degree) > 0:
kwargs['user_info__education_degree__id'] = int(education_degree)
else:
kwargs['user_info__education_degree__id'] = education_degree
            # Student information
student_info = StudentInfo()
identification_level = student_search.cleaned_data.get('identification_level', None)
declaration_of_occupation = student_search.cleaned_data.get('declaration_of_occupation', None)
teacher_info = student_search.cleaned_data.get('teacher_info', None)
school_term = student_search.cleaned_data.get('school_term', None)
if identification_level:
student_info.identification_level = identification_level
if type(identification_level) == str:
if int(identification_level) > 0:
kwargs['identification_level'] = int(identification_level)
else:
kwargs['identification_level'] = identification_level
if declaration_of_occupation:
student_info.declaration_of_occupation = declaration_of_occupation
kwargs['declaration_of_occupation__icontains'] = declaration_of_occupation
if teacher_info:
if type(teacher_info) == str:
if int(teacher_info) > 0:
kwargs['teacher_info'] = int(teacher_info)
student_info.teacher_info = TeacherInfo.objects.get(id=int(teacher_info))
else:
kwargs['teacher_info'] = teacher_info
student_info.teacher_info = TeacherInfo.objects.get(id=teacher_info)
if school_term:
if type(school_term) == str:
if int(school_term) > 0:
kwargs['school_term'] = int(school_term)
student_info.school_term = SchoolTerm.objects.get(id=int(school_term))
else:
kwargs['school_term'] = school_term
student_info.school_term = SchoolTerm.objects.get(id=school_term)
student_info.user_info = user_info
print(kwargs)
if chemical_worker == 0:
student_infos = StudentInfo.objects.filter(submit_status=submit_status, review_status=review_status,
confirm_status=confirm_status, pay_status=pay_status,
cancel_status=cancel_status,
cancel_result=cancel_result,
teacher_cancel_status=teacher_cancel_status,
teacher_cancel_result=teacher_cancel_result,
examination_status=examination_status,
record_status=record_status,
**kwargs).order_by('-id')
else:
student_infos = StudentInfo.objects.filter(submit_status=submit_status, review_status=review_status,
confirm_status=confirm_status, pay_status=pay_status,
cancel_status=cancel_status,
cancel_result=cancel_result,
teacher_cancel_status=teacher_cancel_status,
teacher_cancel_result=teacher_cancel_result,
examination_status=examination_status,
chemical_worker=chemical_worker,
record_status=record_status,
**kwargs).order_by('-id')
return student_infos
else:
return None
else:
return None
def get_student_by_conditions_status_skill_main_class(request,
submit_status=1,
review_status=1,
confirm_status=1,
pay_status=1,
cancel_status=2,
cancel_result=2,
teacher_cancel_status=2,
teacher_cancel_result=2,
examination_status=2,
chemical_worker=0,
record_status=1):
    """
    Query students by the given search conditions, status flags, and skill
    main class.
    :param request:
    :param submit_status: submission status
    :param review_status: review status
    :param confirm_status: confirmation status
    :param pay_status: whether the fee has been paid
    :param cancel_status: whether cancellation has been requested
    :param cancel_result: result of the cancellation request
    :param teacher_cancel_status: cancellation requested by the person in charge
    :param teacher_cancel_result: result of that cancellation request
    :param examination_status: examination passed; defaults to not passed
    :param chemical_worker: chemical-industry student flag; the default 0 does not distinguish chemical from non-chemical
    :param record_status: status of the student record
    :return:
    """
student_search = StudentSearchForm(request.POST)
user_search = UserInfoSearchForm(request.POST)
if student_search.is_valid():
if user_search.is_valid():
            # User information
kwargs = {}
# kwargs['confirm_status'] = 1,
# kwargs['chemical_worker'] = 1,
user_info = UserInfo()
real_name = user_search.cleaned_data.get('real_name', None)
if real_name:
user_info.real_name = real_name
kwargs['user_info__real_name__icontains'] = real_name
work_unit = user_search.cleaned_data.get('work_unit', None)
if work_unit:
user_info.work_unit = work_unit
kwargs['user_info__work_unit__icontains'] = work_unit
education_degree = user_search.cleaned_data.get('education_degree', None)
if education_degree:
user_info.education_degree = education_degree
if type(education_degree) == str:
if int(education_degree) > 0:
kwargs['user_info__education_degree__id'] = int(education_degree)
else:
kwargs['user_info__education_degree__id'] = education_degree
            # Student information
student_info = StudentInfo()
identification_level = student_search.cleaned_data.get('identification_level', None)
declaration_of_occupation = student_search.cleaned_data.get('declaration_of_occupation', None)
teacher_info = student_search.cleaned_data.get('teacher_info', None)
school_term = student_search.cleaned_data.get('school_term', None)
if identification_level:
student_info.identification_level = identification_level
if type(identification_level) == str:
if int(identification_level) > 0:
kwargs['identification_level'] = int(identification_level)
else:
kwargs['identification_level'] = identification_level
if declaration_of_occupation:
student_info.declaration_of_occupation = declaration_of_occupation
kwargs['declaration_of_occupation__icontains'] = declaration_of_occupation
if teacher_info:
                if isinstance(teacher_info, str):
if int(teacher_info) > 0:
kwargs['teacher_info'] = int(teacher_info)
student_info.teacher_info = TeacherInfo.objects.get(id=int(teacher_info))
else:
kwargs['teacher_info'] = teacher_info
student_info.teacher_info = TeacherInfo.objects.get(id=teacher_info)
if school_term:
                if isinstance(school_term, str):
if int(school_term) > 0:
kwargs['school_term'] = int(school_term)
student_info.school_term = SchoolTerm.objects.get(id=int(school_term))
else:
kwargs['school_term'] = school_term
student_info.school_term = SchoolTerm.objects.get(id=school_term)
student_info.user_info = user_info
print(kwargs)
skill_main_class = student_search.cleaned_data.get('skill_main_class', None)
if skill_main_class:
report_skills = ReportSkill.objects.filter(skill_main_class=skill_main_class).values('skill_id')
if len(report_skills) > 0:
report_conditions = ReportCondition.objects.filter(condition_for_skill__in=report_skills).values('condition_id')
                    kwargs['condition_selected__in'] = report_conditions
print(kwargs)
            # Fold the optional chemical_worker constraint into kwargs so the
            # query is built once instead of duplicating the whole filter() call.
            if chemical_worker != 0:
                kwargs['chemical_worker'] = chemical_worker
            student_infos = StudentInfo.objects.filter(submit_status=submit_status, review_status=review_status,
                                                       confirm_status=confirm_status, pay_status=pay_status,
                                                       cancel_status=cancel_status,
                                                       cancel_result=cancel_result,
                                                       teacher_cancel_status=teacher_cancel_status,
                                                       teacher_cancel_result=teacher_cancel_result,
                                                       examination_status=examination_status,
                                                       record_status=record_status,
                                                       **kwargs).order_by('-id')
return student_infos
else:
return None
else:
        return None
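The dynamic-filter pattern used above (collect only the search fields the user supplied into a `kwargs` dict, then expand it with `**kwargs` into a single `filter()` call) can be sketched without Django. Everything below — `fake_filter`, `build_kwargs`, and the sample records — is a hypothetical stand-in for illustration, not part of the original code.

```python
def fake_filter(records, **kwargs):
    """Stand-in for StudentInfo.objects.filter: keep records matching every kwarg."""
    def matches(rec, key, value):
        if key.endswith("__icontains"):
            # Case-insensitive substring match, mimicking Django's __icontains lookup.
            return str(value).lower() in str(rec.get(key[:-len("__icontains")], "")).lower()
        return rec.get(key) == value
    return [r for r in records if all(matches(r, k, v) for k, v in kwargs.items())]

def build_kwargs(real_name=None, education_degree=None):
    kwargs = {}
    if real_name:  # only supplied fields add a constraint
        kwargs["real_name__icontains"] = real_name
    if education_degree and int(education_degree) > 0:
        kwargs["education_degree"] = int(education_degree)
    return kwargs

records = [
    {"real_name": "Alice Zhang", "education_degree": 2},
    {"real_name": "Bob Li", "education_degree": 1},
]
print(fake_filter(records, **build_kwargs(real_name="alice")))
```

Omitting a field simply omits its constraint, so one call site covers every combination of search inputs.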
# Importing Required Libraries
import sublime_plugin
import sublime
import shutil
import json
import os
# Initialise Plugin
def plugin_init():
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
os.makedirs(plugin_data_path, exist_ok=True)
# Copy the Default Files
if not os.path.exists(os.path.join(plugin_data_path, ".cpl_init")):
# Build Folder
plugin_build_path = os.path.join(plugin_data_path, "build")
os.makedirs(plugin_build_path, exist_ok=True)
for f in sublime.find_resources("*"):
if f.startswith("Packages/Competitive Programming Lite/build/"):
content = sublime.load_binary_resource(f)
f_name = f.replace("Packages/Competitive Programming Lite/build/", "")
if f.endswith(".build-json"):
f_name = f_name[:-11] + ".sublime-build"
                with open(os.path.join(plugin_build_path, f_name), "wb") as out_file:
                    out_file.write(content)
# Files Folder
plugin_files_path = os.path.join(plugin_data_path, "files")
os.makedirs(plugin_files_path, exist_ok=True)
for f in sublime.find_resources("*"):
if f.startswith("Packages/Competitive Programming Lite/files/"):
content = sublime.load_binary_resource(f)
f_name = f.replace("Packages/Competitive Programming Lite/files/", "")
                with open(os.path.join(plugin_files_path, f_name), "wb") as out_file:
                    out_file.write(content)
# Languages Folder
plugin_languages_path = os.path.join(plugin_data_path, "languages")
os.makedirs(plugin_languages_path, exist_ok=True)
for f in sublime.find_resources("*"):
if f.startswith("Packages/Competitive Programming Lite/languages/"):
content = sublime.load_binary_resource(f)
f_name = f.replace("Packages/Competitive Programming Lite/languages/", "")
                with open(os.path.join(plugin_languages_path, f_name), "wb") as out_file:
                    out_file.write(content)
# Sublime Folder
plugin_sublime_path = os.path.join(plugin_data_path, "sublime")
os.makedirs(plugin_sublime_path, exist_ok=True)
# Templates Folder
plugin_templates_path = os.path.join(plugin_data_path, "templates")
os.makedirs(plugin_templates_path, exist_ok=True)
for f in sublime.find_resources("*"):
if f.startswith("Packages/Competitive Programming Lite/templates/"):
content = sublime.load_binary_resource(f)
f_name = f.replace("Packages/Competitive Programming Lite/templates/", "")
                with open(os.path.join(plugin_templates_path, f_name), "wb") as out_file:
                    out_file.write(content)
# Successfully Initialized
with open(os.path.join(plugin_data_path, ".cpl_init"), "w") as f:
f.write("Do not delete this file")
# Command for Creating a File
class CpNewCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, question, template):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
try:
project_path = project_window.project_data()["folders"][0]["path"]
except:
sublime.error_message("Please Open a Folder First")
return
if not os.path.exists(project_path):
sublime.error_message("Please Open a Valid Folder")
return
try:
os.makedirs(os.path.join(project_path, ".cpl_files"), exist_ok=True)
except:
sublime.error_message("Unable to Create '.cpl_files' Folder")
return
# Handling Input Values
if not (question.isdigit() and int(question)):
question = "1"
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
if template in templates.values():
template = list(templates.keys())[list(templates.values()).index(template)]
else:
template = "1"
# Closing Old Tabs
project_window.focus_group(0)
[v.run_command("save") if v.file_name() else v.erase(edit, sublime.Region(0, v.size())) for v in project_window.views()]
project_window.run_command("close_all")
# Copying Solution Tester File
if not (os.path.exists(os.path.join(project_path, ".cpl_files/solution_tester.py"))):
shutil.copyfile(os.path.join(plugin_path, "files/solution_tester.py"), os.path.join(project_path, ".cpl_files/solution_tester.py"))
# Copying Java Runner File
if not (os.path.exists(os.path.join(project_path, ".cpl_files/java_runner.py"))):
shutil.copyfile(os.path.join(plugin_path, "files/java_runner.py"), os.path.join(project_path, ".cpl_files/java_runner.py"))
# Language Template Code
template_name = language_templates[template][0]
language_extension = language_templates[template][2]
with open(os.path.join(plugin_path, "templates/{0}.{1}".format(template_name, language_extension)), "r") as f:
language_template = f.read()
# Generating CP Files
if (os.path.exists(os.path.join(project_path, "{0}_{1}.{2}".format(question, template_name, language_extension)))):
if sublime.ok_cancel_dialog("{0}_{1}.{2} already exists. Do you want to override the old file?".format(question, template_name, language_extension), "Yes"):
with open(os.path.join(project_path, "{0}_{1}.{2}".format(question, template_name, language_extension)), "w") as f:
f.write(language_template.replace(r"{N}", str(question)))
else:
with open(os.path.join(project_path, "{0}_{1}.{2}".format(question, template_name, language_extension)), "w") as f:
f.write(language_template.replace(r"{N}", str(question)))
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)))):
with open(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)), "w") as f:
f.write("")
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.out.txt".format(question)))):
with open(os.path.join(project_path, ".cpl_files/{0}.out.txt".format(question)), "w") as f:
f.write("")
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)))):
with open(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)), "w") as f:
f.write("")
# Setting CP Layout
project_window.set_minimap_visible(False)
project_window.set_sidebar_visible(False)
project_window.run_command("set_layout", {"cells": [[0, 0, 1, 2], [1, 0, 2, 1], [1, 1, 2, 2]], "cols": [0.0, 0.8, 1.0], "rows": [0.0, 0.5, 1.0]})
# Opening New Tabs
project_window.focus_group(0)
project_window.open_file(os.path.join(project_path, "{0}_{1}.{2}".format(question, language_templates[template][0], language_templates[template][2])))
project_window.focus_group(1)
project_window.open_file(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)))
project_window.focus_group(2)
project_window.open_file(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)))
project_window.focus_group(0)
# Changing Build System
with open(os.path.join(plugin_path, "build/build.json"), "r") as f:
build_systems = json.load(f)
project_window.run_command("set_build_system", {"file": "Packages/User/Competitive Programming Lite/build/{0}.sublime-build".format(build_systems[language_templates[template][2]])})
    # Getting Question Number and Template Input
def input(self, args):
if "question" not in args:
return QuestionInputHandler()
elif "template" not in args:
return TemplateInputHandler()
# Command for Creating a Set of Files
class CpSetCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, questions, template):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
try:
project_path = project_window.project_data()["folders"][0]["path"]
except:
sublime.error_message("Please Open a Folder First")
return
if not os.path.exists(project_path):
sublime.error_message("Please Open a Valid Folder")
return
try:
os.makedirs(os.path.join(project_path, ".cpl_files"), exist_ok=True)
except:
sublime.error_message("Unable to Create '.cpl_files' Folder")
return
# Handling Input Values
if not (questions.isdigit() and int(questions)):
questions = "6"
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
if template in templates.values():
template = list(templates.keys())[list(templates.values()).index(template)]
else:
template = "1"
# Closing Old Tabs
project_window.focus_group(0)
[v.run_command("save") if v.file_name() else v.erase(edit, sublime.Region(0, v.size())) for v in project_window.views()]
project_window.run_command("close_all")
# Copying Solution Tester File
if not (os.path.exists(os.path.join(project_path, ".cpl_files/solution_tester.py"))):
shutil.copyfile(os.path.join(plugin_path, "files/solution_tester.py"), os.path.join(project_path, ".cpl_files/solution_tester.py"))
# Copying Java Runner File
if not (os.path.exists(os.path.join(project_path, ".cpl_files/java_runner.py"))):
shutil.copyfile(os.path.join(plugin_path, "files/java_runner.py"), os.path.join(project_path, ".cpl_files/java_runner.py"))
# Language Template Code
template_name = language_templates[template][0]
language_extension = language_templates[template][2]
with open(os.path.join(plugin_path, "templates/{0}.{1}".format(template_name, language_extension)), "r") as f:
language_template = f.read()
# Generating CP Files
override_flag = False
for i in range(1, int(questions)+1):
if (not override_flag) and (os.path.exists(os.path.join(project_path, "{0}_{1}.{2}".format(i, template_name, language_extension)))):
choice = sublime.yes_no_cancel_dialog("{0}_{1}.{2} already exists. Do you want to override the old file?".format(i, template_name, language_extension), "Yes", "Yes to All")
if int(choice) == 2:
override_flag = True
if choice:
with open(os.path.join(project_path, "{0}_{1}.{2}".format(i, template_name, language_extension)), "w") as f:
f.write(language_template.replace(r"{N}", str(i)))
else:
with open(os.path.join(project_path, "{0}_{1}.{2}".format(i, template_name, language_extension)), "w") as f:
f.write(language_template.replace(r"{N}", str(i)))
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(i)))):
with open(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(i)), "w") as f:
f.write("")
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.out.txt".format(i)))):
with open(os.path.join(project_path, ".cpl_files/{0}.out.txt".format(i)), "w") as f:
f.write("")
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(i)))):
with open(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(i)), "w") as f:
f.write("")
# Setting CP Layout
project_window.set_minimap_visible(False)
project_window.set_sidebar_visible(False)
project_window.run_command("set_layout", {"cells": [[0, 0, 1, 2], [1, 0, 2, 1], [1, 1, 2, 2]], "cols": [0.0, 0.8, 1.0], "rows": [0.0, 0.5, 1.0]})
# Opening New Tabs
project_window.focus_group(0)
project_window.open_file(os.path.join(project_path, "1_{0}.{1}".format(language_templates[template][0], language_templates[template][2])))
project_window.focus_group(1)
project_window.open_file(os.path.join(project_path, ".cpl_files/1.in.txt"))
project_window.focus_group(2)
project_window.open_file(os.path.join(project_path, ".cpl_files/1.tst.txt"))
project_window.focus_group(0)
# Changing Build System
with open(os.path.join(plugin_path, "build/build.json"), "r") as f:
build_systems = json.load(f)
project_window.run_command("set_build_system", {"file": "Packages/User/Competitive Programming Lite/build/{0}.sublime-build".format(build_systems[language_templates[template][2]])})
# Getting Total Questions and Template Input
def input(self, args):
if "questions" not in args:
return QuestionsInputHandler()
elif "template" not in args:
return TemplateInputHandler()
# Command for Opening a File
class CpOpenCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, question, template):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
try:
project_path = project_window.project_data()["folders"][0]["path"]
except:
sublime.error_message("Please Open a Folder First")
return
if not os.path.exists(project_path):
sublime.error_message("Please Open a Valid Folder")
return
try:
os.makedirs(os.path.join(project_path, ".cpl_files"), exist_ok=True)
except:
sublime.error_message("Unable to Create '.cpl_files' Folder")
return
# Handling Input Values
o_question, o_template = question, template
if not (question.isdigit() and int(question)):
question = "1"
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
if template in templates.values():
template = list(templates.keys())[list(templates.values()).index(template)]
else:
template = "1"
# Closing Old Tabs
project_window.focus_group(0)
[v.run_command("save") if v.file_name() else v.erase(edit, sublime.Region(0, v.size())) for v in project_window.views()]
project_window.run_command("close_all")
# Setting CP Layout
project_window.set_minimap_visible(False)
project_window.set_sidebar_visible(False)
project_window.run_command("set_layout", {"cells": [[0, 0, 1, 2], [1, 0, 2, 1], [1, 1, 2, 2]], "cols": [0.0, 0.8, 1.0], "rows": [0.0, 0.5, 1.0]})
# Opening New Tabs
project_window.focus_group(0)
if (os.path.exists(os.path.join(project_path, "{0}_{1}.{2}".format(question, language_templates[template][0], language_templates[template][2])))):
project_window.open_file(os.path.join(project_path, "{0}_{1}.{2}".format(question, language_templates[template][0], language_templates[template][2])))
elif sublime.ok_cancel_dialog("{0}_{1}.{2} does not exist. Do you want to create {0}_{1}.{2}?".format(question, language_templates[template][0], language_templates[template][2]), "Yes"):
project_window.run_command("cp_new", {"question": o_question, "template": o_template})
return
else:
project_window.run_command("cp_end")
return
project_window.focus_group(1)
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)))):
with open(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)), "w") as f:
f.write("")
project_window.open_file(os.path.join(project_path, ".cpl_files/{0}.in.txt".format(question)))
project_window.focus_group(2)
if not (os.path.exists(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)))):
with open(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)), "w") as f:
f.write("")
project_window.open_file(os.path.join(project_path, ".cpl_files/{0}.tst.txt".format(question)))
project_window.focus_group(0)
# Changing Build System
with open(os.path.join(plugin_path, "build/build.json"), "r") as f:
build_systems = json.load(f)
project_window.run_command("set_build_system", {"file": "Packages/User/Competitive Programming Lite/build/{0}.sublime-build".format(build_systems[language_templates[template][2]])})
    # Getting Question Number and Template Input
def input(self, args):
if "question" not in args:
return QuestionInputHandler()
elif "template" not in args:
return TemplateInputHandler()
# Command for Exiting CP Mode
class CpEndCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
# Closing All Tabs
project_window.focus_group(0)
[v.run_command("save") if v.file_name() else v.erase(edit, sublime.Region(0, v.size())) for v in project_window.views()]
project_window.run_command("close_all")
# Setting Default Layout
project_window.set_minimap_visible(True)
project_window.set_sidebar_visible(False)
project_window.run_command("set_layout", {"cells": [[0, 0, 1, 1]], "cols": [0.0, 1.0], "rows": [0.0, 1.0]})
# Changing Build System
project_window.run_command("set_build_system", {"file": ""})
# Command for Adding a Template
class CpAddCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, language, name):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
# Handling Input Values
with open(os.path.join(plugin_path, "languages/languages.json"), "r") as f:
languages_data = json.load(f)
languages = {i: j[0] for i, j in languages_data.items()}
if language in languages.values():
language = list(languages.keys())[list(languages.values()).index(language)]
else:
sublime.error_message("Unknown Error")
return
if name == "":
sublime.error_message("Empty Name")
return
elif all(c.isalnum() or c == ' ' for c in name) and not name.isspace():
pass
else:
sublime.error_message("Invalid Name")
return
# Checking if Template Already Exists
if os.path.exists(os.path.join(plugin_path, "templates/{0}.{1}".format(name, languages_data[language][1]))):
sublime.error_message("Template with same name already exists.")
return
# Creating Template File
with open(os.path.join(plugin_path, "templates/Default.{0}".format(languages_data[language][1])), "r") as f:
default_value = f.read()
with open(os.path.join(plugin_path, "templates/{0}.{1}".format(name, languages_data[language][1])), "w") as f:
f.write(default_value)
# Adding Template File
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
templates_data = json.load(f)
templates_data[max([int(x) for x in templates_data.keys()])+1] = [name, languages_data[language][0], languages_data[language][1]]
with open(os.path.join(plugin_path, "templates/templates.json"), "w") as f:
f.write(json.dumps(templates_data, indent=4))
# Opening Template File
language_extension = languages_data[language][1]
project_window.run_command("open_file", {"file": os.path.join(plugin_path, "templates/{0}.{1}".format(name, language_extension))})
# Getting Language and Name Input
def input(self, args):
if "language" not in args:
return LanguageInputHandler()
elif "name" not in args:
return NameInputHandler()
# Command for Editing a Template
class CpEditCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, template):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
project_window = sublime.active_window()
# Handling Input Values
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
if template in templates.values():
template = list(templates.keys())[list(templates.values()).index(template)]
else:
sublime.message_dialog("Please add a template first.")
return
# Opening Template File
template_name = language_templates[template][0]
language_extension = language_templates[template][2]
project_window.run_command("open_file", {"file": os.path.join(plugin_path, "templates/{0}.{1}".format(template_name, language_extension))})
# Getting Template Input
def input(self, args):
if "template" not in args:
return TemplateInputHandlerND()
# Command for Deleting a Template
class CpDeleteCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit, template):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
# Handling Input Values
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
if template in templates.values():
template = list(templates.keys())[list(templates.values()).index(template)]
else:
sublime.message_dialog("Please add a template first.")
return
# Confirmation
if not sublime.ok_cancel_dialog("Are you sure you want to delete {0} template?".format(templates[template]), "Yes"):
return
# Removing Template File
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
templates_data = json.load(f)
template_name = templates_data[template][0]
language_extension = templates_data[template][2]
del templates_data[template]
with open(os.path.join(plugin_path, "templates/templates.json"), "w") as f:
f.write(json.dumps(templates_data, indent=4))
# Deleting Template File
os.remove(os.path.join(plugin_path, "templates/{0}.{1}".format(template_name, language_extension)))
# Getting Template Input
def input(self, args):
if "template" not in args:
return TemplateInputHandlerND()
# Command for Setting Key Bindings
class CpKeyBindingsCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit):
plugin_init()
project_window = sublime.active_window()
project_window.run_command("edit_settings", {
"base_file": "${packages}/Competitive Programming Lite/sublime/Default ($platform).sublime-keymap",
"user_file": "${packages}/User/Competitive Programming Lite/sublime/Default ($platform).sublime-keymap",
"default": "[\n\t$0\n]\n"
})
# Command for Opening Help Page
class CpHelpCommand(sublime_plugin.TextCommand):
# Default run Command
def run(self, edit):
project_window = sublime.active_window()
project_window.run_command("open_url", {"url": "https://github.com/JameelKaisar/Competitive-Programming-Lite"})
# Handler for Getting Total Questions Input
class QuestionsInputHandler(sublime_plugin.TextInputHandler):
def name(self):
return "questions"
def placeholder(self):
return "Enter Number of Questions"
def preview(self, questions):
if questions.isdigit() and int(questions):
return "Questions: {0}".format(questions)
else:
return "Default: 6"
# Handler for Getting Question Number Input
class QuestionInputHandler(sublime_plugin.TextInputHandler):
def name(self):
return "question"
def placeholder(self):
return "Enter Question Number"
def preview(self, question):
if question.isdigit() and int(question):
return "Question: {0}".format(question)
else:
return "Default: 1"
# Handler for Getting Template Input
class TemplateInputHandler(sublime_plugin.ListInputHandler):
def __init__(self):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
self.templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items()}
def name(self):
return "template"
def placeholder(self):
return "Select Template"
def list_items(self):
return [x for x in self.templates.values()]
def preview(self, template):
return "Selected: {0}".format(template)
# Handler for Getting Template Input (No Default)
class TemplateInputHandlerND(sublime_plugin.ListInputHandler):
def __init__(self):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
with open(os.path.join(plugin_path, "templates/templates.json"), "r") as f:
language_templates = json.load(f)
self.templates = {i: "{0} ({1})".format(j[0], j[1]) for i, j in language_templates.items() if j[0] != "Default"}
def name(self):
return "template"
def placeholder(self):
return "Select Template"
def list_items(self):
return [x for x in self.templates.values()]
def preview(self, template):
return "Selected: {0}".format(template)
# Handler for Getting Language Input
class LanguageInputHandler(sublime_plugin.ListInputHandler):
def __init__(self):
plugin_init()
sublime_path = os.path.dirname(sublime.packages_path())
sublime_user_path = os.path.join(sublime_path, "Packages", "User")
plugin_data_path = os.path.join(sublime_user_path, "Competitive Programming Lite")
plugin_path = plugin_data_path
with open(os.path.join(plugin_path, "languages/languages.json"), "r") as f:
languages = json.load(f)
self.languages = {i: j[0] for i, j in languages.items()}
def name(self):
return "language"
def placeholder(self):
return "Select Language"
def list_items(self):
return [x for x in self.languages.values()]
def preview(self, language):
return "Selected: {0}".format(language)
# Handler for Getting Name Input
class NameInputHandler(sublime_plugin.TextInputHandler):
def name(self):
return "name"
def placeholder(self):
return "Enter Name of Template"
def preview(self, name):
if name == "":
return "Enter Name of Template"
elif all(c.isalnum() or c == ' ' for c in name) and not name.isspace():
return "Name: {0}".format(name)
else:
return "Invalid Name"
#!/usr/bin/env python
import xml.etree.ElementTree as ET
class tailf_netconf_transactions(object):
"""Auto generated class.
"""
def __init__(self, **kwargs):
self._callback = kwargs.pop('callback')
def start_transaction_input_target_target_startup_startup(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
start_transaction = ET.Element("start_transaction")
config = start_transaction
input = ET.SubElement(start_transaction, "input")
target = ET.SubElement(input, "target")
target = ET.SubElement(target, "target")
startup = ET.SubElement(target, "startup")
startup = ET.SubElement(startup, "startup")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def start_transaction_input_target_target_running_running(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
start_transaction = ET.Element("start_transaction")
config = start_transaction
input = ET.SubElement(start_transaction, "input")
target = ET.SubElement(input, "target")
target = ET.SubElement(target, "target")
running = ET.SubElement(target, "running")
running = ET.SubElement(running, "running")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def start_transaction_input_target_target_candidate_candidate(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
start_transaction = ET.Element("start_transaction")
config = start_transaction
input = ET.SubElement(start_transaction, "input")
target = ET.SubElement(input, "target")
target = ET.SubElement(target, "target")
candidate = ET.SubElement(target, "candidate")
candidate = ET.SubElement(candidate, "candidate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def start_transaction_input_with_inactive(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
start_transaction = ET.Element("start_transaction")
config = start_transaction
input = ET.SubElement(start_transaction, "input")
with_inactive = ET.SubElement(input, "with-inactive", xmlns="http://tail-f.com/ns/netconf/inactive/1.0")
callback = kwargs.pop('callback', self._callback)
return callback(config)
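A minimal, standard-library-only sketch of how the callback these generated methods expect might consume the element they build; `serialize_callback` is an illustrative name, not part of the generated API:

```python
import xml.etree.ElementTree as ET


def serialize_callback(config):
    # Render the element tree built by a generated method as an XML string.
    return ET.tostring(config, encoding="unicode")


# Reproduce the structure built by start_transaction_input_target_target_running_running.
start_transaction = ET.Element("start_transaction")
input_ = ET.SubElement(start_transaction, "input")
target = ET.SubElement(ET.SubElement(input_, "target"), "target")
running = ET.SubElement(target, "running")
ET.SubElement(running, "running")

xml_text = serialize_callback(start_transaction)
```

A real callback would typically send the serialized payload over a NETCONF session rather than just return the string.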
# Source: tests/dhcpv4/kea_only/test_kea_logging.py (isc-projects/forge, BSD Zero Clause License)
"""Logging in Kea"""
# pylint: disable=invalid-name,line-too-long
import pytest
import srv_control
import srv_msg
import misc
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_options_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.options', 'DEBUG', 99)
srv_control.config_srv_opt('time-offset', '50')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.options')
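For reference, a call like `srv_control.configure_loggers('kea-dhcp4.options', 'DEBUG', 99)` corresponds to one entry in Kea's `loggers` configuration list. The helper below is a sketch of that shape only; forge assembles the real configuration file itself, and `kea_logger_entry` is an illustrative name:

```python
import json


def kea_logger_entry(name, severity, debuglevel=None):
    # One entry in Kea's "loggers" list (sketch; forge builds the real config).
    entry = {"name": name, "severity": severity}
    if debuglevel is not None:
        # "debuglevel" (0-99) is only meaningful at DEBUG severity.
        entry["debuglevel"] = debuglevel
    return entry


entry = kea_logger_entry("kea-dhcp4.options", "DEBUG", 99)
config = {"Dhcp4": {"loggers": [entry]}}
print(json.dumps(config, indent=2))
```

The `'None'` string that the INFO-level tests pass as the third argument is forge's way of omitting `debuglevel` entirely.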
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_options_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.options', 'INFO', 'None')
srv_control.config_srv_opt('time-offset', '50')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.options')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_bad_packets_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.bad-packets', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.100')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'NAK')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.bad-packets')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_bad_packets_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.bad-packets', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.100')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'NAK')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.bad-packets')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_dhcp4():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.dhcp4', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.dhcp4')
srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcp4')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_dhcp4_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.dhcp4', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcp4')
srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcp4')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_alloc_engine():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.alloc-engine', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.alloc-engine')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_dhcpsrv_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.dhcpsrv', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.dhcpsrv')
srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcpsrv')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_dhcpsrv_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.dhcpsrv', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcpsrv')
srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcpsrv')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_leases_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.leases', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:22')
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:22')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:21')
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
srv_msg.log_contains(r'INFO \[kea-dhcp4.leases')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.leases')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_leases_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.leases', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.leases')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_packets_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.packets', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.packets')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_packets_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.packets', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.packets')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_hosts_debug():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.hosts', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_contains(r'DEBUG \[kea-dhcp4.hosts')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_hosts_info():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4.hosts', 'INFO', 'None')
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
srv_msg.client_does_include_with_value('client_id', '00010203040111')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.hosts')
@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_all():
misc.test_setup()
srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
srv_control.configure_loggers('kea-dhcp4', 'DEBUG', 99)
srv_control.build_and_send_config_files()
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:33')
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:33')
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
srv_msg.client_requests_option(1)
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'ACK')
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:31')
srv_msg.client_save_option_count(1, 'server_id')
srv_msg.client_add_saved_option_count(1)
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:33')
srv_msg.client_add_saved_option_count(1)
srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_dont_wait_for_message()
misc.test_procedure()
srv_msg.client_requests_option(1)
srv_msg.client_requests_option(2)
srv_msg.client_requests_option(7)
srv_msg.client_send_msg('DISCOVER')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', 'OFFER')
srv_msg.response_check_include_option(1)
srv_msg.response_check_content('yiaddr', '192.168.50.1')
srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
misc.test_procedure()
srv_msg.client_copy_option('server_id')
srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ACK')
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
    srv_msg.client_send_msg('RELEASE')
    misc.pass_criteria()
    srv_msg.send_dont_wait_for_message()
    misc.test_procedure()
    srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
    srv_msg.client_does_include_with_value('client_id', '00010203040111')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.packets')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.dhcpsrv')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.alloc-engine')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.dhcp4')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.options')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.leases')
    srv_msg.log_contains(r'INFO \[kea-dhcp4.leases')


@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_all_different_levels_same_file():
    misc.test_setup()
    srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
    srv_control.configure_loggers('kea-dhcp4.dhcp4', 'INFO', 'None')
    srv_control.configure_loggers('kea-dhcp4.dhcpsrv', 'INFO', 'None')
    srv_control.configure_loggers('kea-dhcp4.options', 'DEBUG', 99)
    srv_control.configure_loggers('kea-dhcp4.packets', 'DEBUG', 99)
    srv_control.configure_loggers('kea-dhcp4.leases', 'WARN', 'None')
    srv_control.configure_loggers('kea-dhcp4.alloc-engine', 'DEBUG', 50)
    srv_control.configure_loggers('kea-dhcp4.bad-packets', 'DEBUG', 25)
    srv_control.configure_loggers('kea-dhcp4.options', 'INFO', 'None')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')
    misc.test_procedure()
    srv_msg.client_requests_option(1)
    srv_msg.client_requests_option(2)
    srv_msg.client_requests_option(7)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ACK')
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
    srv_msg.client_send_msg('RELEASE')
    misc.pass_criteria()
    srv_msg.send_dont_wait_for_message()
    misc.test_procedure()
    srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
    srv_msg.client_does_include_with_value('client_id', '00010203040111')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    misc.test_procedure()
    srv_msg.client_requests_option(1)
    srv_msg.client_requests_option(2)
    srv_msg.client_requests_option(7)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_does_include_with_value('requested_addr', '192.168.50.100')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'NAK')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.packets')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.leases')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.alloc-engine')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcp4')
    srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcp4')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcpsrv')
    srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcpsrv')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.options')


@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_v4_loggers_all_different_levels_different_file():
    # https://gitlab.isc.org/isc-projects/kea/issues/592
    # bug: #592
    misc.test_setup()
    srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.1-192.168.50.1')
    srv_control.config_srv_opt('log-servers', '199.199.199.1,100.100.100.1')
    srv_control.configure_loggers('kea-dhcp4.dhcp4', 'INFO', 'None', 'kea.log1')
    srv_control.configure_loggers('kea-dhcp4.dhcpsrv', 'INFO', 'None', 'kea.log2')
    srv_control.configure_loggers('kea-dhcp4.options', 'DEBUG', 99, 'kea.log3')
    srv_control.configure_loggers('kea-dhcp4.packets', 'DEBUG', 99, 'kea.log4')
    srv_control.configure_loggers('kea-dhcp4.leases', 'WARN', 'None', 'kea.log5')
    srv_control.configure_loggers('kea-dhcp4.alloc-engine', 'DEBUG', 50, 'kea.log6')
    srv_control.configure_loggers('kea-dhcp4.bad-packets', 'DEBUG', 25, 'kea.log7')
    srv_control.configure_loggers('kea-dhcp4.dhcpsrv', 'INFO', 'None', 'kea.log8')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')
    misc.test_procedure()
    srv_msg.client_requests_option(1)
    srv_msg.client_requests_option(2)
    srv_msg.client_requests_option(7)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_does_include_with_value('requested_addr', '192.168.50.1')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ACK')
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.1')
    srv_msg.client_send_msg('RELEASE')
    misc.pass_criteria()
    srv_msg.send_dont_wait_for_message()
    misc.test_procedure()
    srv_msg.client_sets_value('Client', 'chaddr', '00:00:00:00:00:11')
    srv_msg.client_does_include_with_value('client_id', '00010203040111')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    misc.test_procedure()
    srv_msg.client_requests_option(1)
    srv_msg.client_requests_option(2)
    srv_msg.client_requests_option(7)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_content('yiaddr', '192.168.50.1')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_copy_option('server_id')
    srv_msg.client_does_include_with_value('requested_addr', '192.168.50.100')
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'NAK')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.packets', 'kea.log4')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.leases', 'kea.log5')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp4.alloc-engine', 'kea.log6')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcp4', 'kea.log1')
    srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcp4', 'kea.log1')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcpsrv', 'kea.log2')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.dhcpsrv', 'kea.log8')
    # bug: #592
    srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcpsrv', 'kea.log8')
    srv_msg.log_contains(r'INFO \[kea-dhcp4.dhcpsrv', 'kea.log2')
    srv_msg.log_doesnt_contain(r'DEBUG \[kea-dhcp4.options', 'kea.log3')


@pytest.mark.v4
@pytest.mark.kea_only
@pytest.mark.logging
def test_ddns4_logging_all_types_debug():
    misc.test_setup()
    srv_control.config_srv_subnet('192.168.50.0/24', '192.168.50.10-192.168.50.10')
    srv_control.add_ddns_server('127.0.0.1', '53001')
    srv_control.add_ddns_server_options('enable-updates', True)
    srv_control.add_ddns_server_options('qualifying-suffix', 'abc.com')
    srv_control.add_forward_ddns('four.example.com.', 'forge.sha1.key')
    srv_control.add_reverse_ddns('50.168.192.in-addr.arpa.', 'forge.sha1.key')
    srv_control.add_keys('forge.sha1.key', 'HMAC-SHA1', 'PN4xKZ/jDobCMlo4rpr70w==')
    srv_control.configure_loggers('kea-dhcp-ddns', 'DEBUG', 99)
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')
    misc.test_procedure()
    srv_msg.client_requests_option(1)
    srv_msg.client_send_msg('DISCOVER')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'OFFER')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_content('yiaddr', '192.168.50.10')
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    misc.test_procedure()
    srv_msg.client_save_option_count(1, 'server_id')
    srv_msg.client_add_saved_option_count(1)
    srv_msg.client_does_include_with_value('requested_addr', '192.168.50.10')
    srv_msg.client_requests_option(1)
    srv_msg.client_sets_value('Client', 'FQDN_domain_name', 'aa.four.example.com.')
    srv_msg.client_sets_value('Client', 'FQDN_flags', 'S')
    srv_msg.client_does_include('Client', 'fqdn')
    srv_msg.client_send_msg('REQUEST')
    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ACK')
    srv_msg.response_check_content('yiaddr', '192.168.50.10')
    srv_msg.response_check_include_option(1)
    srv_msg.response_check_option_content(1, 'value', '255.255.255.0')
    srv_msg.response_check_include_option(81)
    srv_msg.response_check_option_content(81, 'flags', 1)
    srv_msg.response_check_option_content(81, 'fqdn', 'aa.four.example.com.')
    misc.test_procedure()
    srv_msg.client_add_saved_option_count(1)
    srv_msg.client_sets_value('Client', 'ciaddr', '192.168.50.10')
    srv_msg.client_send_msg('RELEASE')
    misc.pass_criteria()
    srv_msg.send_dont_wait_for_message()
    srv_msg.log_contains(r'INFO \[kea-dhcp-ddns.dhcpddns', 'kea.log_ddns')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp-ddns.dhcpddns', 'kea.log_ddns')
    # srv_msg.log_contains(r'DEBUG \[kea-dhcp-ddns.libdhcp-ddns', 'kea.log_ddns') # TODO: it is not present in the log
    srv_msg.log_contains(r'DEBUG \[kea-dhcp-ddns.d2-to-dns', 'kea.log_ddns')
    srv_msg.log_contains(r'ERROR \[kea-dhcp-ddns.d2-to-dns', 'kea.log_ddns')
    srv_msg.log_contains(r'DEBUG \[kea-dhcp-ddns.dhcp-to-d2', 'kea.log_ddns')
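# The log assertions above boil down to a regex search over the captured Kea
# log file. A minimal sketch of that matching idea (an assumption for
# illustration, not forge's actual srv_msg.log_contains() implementation):

```python
import re

def log_contains(log_text, pattern):
    # Each check is a raw-string regex searched against the log; the '['
    # in a logger name like '[kea-dhcp4.packets]' must be escaped as '\['.
    return re.search(pattern, log_text) is not None

sample = "2019-01-01 12:00:00 DEBUG [kea-dhcp4.packets/7] DHCP4_BUFFER_RECEIVED"
assert log_contains(sample, r"DEBUG \[kea-dhcp4.packets")
assert not log_contains(sample, r"DEBUG \[kea-dhcp4.leases")
```

# Note the severity is part of the pattern, which is how the tests distinguish
# a logger capped at INFO (no DEBUG lines) from one running at DEBUG level.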

# ports/devel/meson/dragonfly/patch-mesonbuild_environment.py (danielfojt/DeltaPorts, BSD-2-Clause-FreeBSD)
--- mesonbuild/environment.py.orig	2019-11-28 17:37:44 UTC
+++ mesonbuild/environment.py
@@ -312,7 +312,7 @@ def detect_cpu_family(compilers: Compile
"""
if mesonlib.is_windows():
trial = detect_windows_arch(compilers)
- elif mesonlib.is_freebsd() or mesonlib.is_netbsd() or mesonlib.is_openbsd():
+ elif mesonlib.is_freebsd() or mesonlib.is_netbsd() or mesonlib.is_openbsd() or mesonlib.is_dragonflybsd():
trial = platform.processor().lower()
else:
trial = platform.machine().lower()
@@ -357,7 +357,7 @@ def detect_cpu_family(compilers: Compile
def detect_cpu(compilers: CompilersDict):
if mesonlib.is_windows():
trial = detect_windows_arch(compilers)
- elif mesonlib.is_freebsd() or mesonlib.is_netbsd() or mesonlib.is_openbsd():
+ elif mesonlib.is_freebsd() or mesonlib.is_netbsd() or mesonlib.is_openbsd() or mesonlib.is_dragonflybsd():
trial = platform.processor().lower()
else:
trial = platform.machine().lower()
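# A sketch of the conditional the patch extends (simplified: the real Meson
# code dispatches on mesonlib.is_windows()/is_freebsd()/... and a compiler
# dict). On the BSD family, now including DragonFly, Meson reads
# platform.processor(); everywhere else it reads platform.machine().

```python
import platform

def detect_trial(is_windows, is_bsd_family):
    if is_windows:
        return "windows-arch"  # stand-in for detect_windows_arch(compilers)
    elif is_bsd_family:
        # BSDs report the CPU via processor(); DragonFly was missing
        # from this branch before the patch and fell through to machine().
        return platform.processor().lower()
    else:
        return platform.machine().lower()
```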

# auth0_client/menu/view/i18n/__init__.py (rubelw/auth0_client, MIT)
import auth0_client.menu.view.i18n.messages_en
import auth0_client.menu.view.i18n.messages_ja

# tests/active/test_receive_get.py (jonyboi396825/COM-Server, MIT)
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import requests
import json
import pytest
import time
SERVER = "http://127.0.0.1:8080/v1"
class Test_Receive:
"""
tests the /receive and /receive/x resources
"""
@pytest.fixture
def example_data(self):
return {"data": [1, 2, 3, 4, time.time()], "ending": "\n", "concatenate": ";"}
res = f"{SERVER}/receive"
@pytest.mark.parametrize(
"http_method",
[
requests.post,
requests.put,
requests.patch,
],
)
def test_http_method_with_data(self, example_data, http_method):
"""Tests that only given request works (also tests that HTTP methods are working)"""
r = http_method(self.res, example_data)
assert r.status_code == 405
r = http_method(f"{self.res}/23", example_data)
assert r.status_code == 405
@pytest.mark.parametrize("http_method", [requests.delete])
def test_http_method_no_data(self, http_method):
"""Tests that only given request works (but with options, head, delete requests)"""
r = http_method(self.res)
assert r.status_code == 405
r = http_method(f"{self.res}/23")
assert r.status_code == 405
def test_receive_first_good(self, example_data):
"""Tests that the first thing sent is received correctly"""
curt = example_data["data"][4]
requests.post(f"{SERVER}/send", data=example_data)
time.sleep(1)
r = requests.get(f"{SERVER}/receive/0")
loaded = json.loads(r.text)
assert r.status_code == 200 and loaded["message"] == "OK"
assert loaded["data"] == f'Got: "1;2;3;4;{curt}"'
def test_receive_second_good(self, example_data):
"""Tests that the 2nd most recent received object is correct"""
curt = example_data["data"][4]
requests.post(f"{SERVER}/send", data=example_data)
time.sleep(1)
requests.post(f"{SERVER}/send", data=example_data)
time.sleep(1)
r = requests.get(f"{SERVER}/receive/1")
loaded = json.loads(r.text)
assert r.status_code == 200 and loaded["message"] == "OK"
assert loaded["data"] == f'Got: "1;2;3;4;{curt}"'
def test_receive_all(self, example_data):
"""Tests that things are being received in the order they should be"""
curt = example_data["data"][4]
requests.post(f"{SERVER}/send", data=example_data)
time.sleep(1)
r = requests.get(f"{SERVER}/receive")
loaded = json.loads(r.text)
assert r.status_code == 200 and loaded["message"] == "OK"
assert loaded["data"][-1] == f'Got: "1;2;3;4;{curt}"'
class Test_Get:
"""
tests the /get resource
"""
@pytest.fixture
def example_data(self):
return {"data": [1, 2, 3, 4, time.time()], "ending": "\n", "concatenate": ";"}
res = f"{SERVER}/get"
@pytest.mark.parametrize(
"http_method",
[
requests.post,
requests.put,
requests.patch,
],
)
def test_http_method_with_data(self, example_data, http_method):
"""Tests that only given request works (also tests that HTTP methods are working)"""
r = http_method(self.res, example_data)
assert r.status_code == 405
@pytest.mark.parametrize("http_method", [requests.delete])
def test_http_method_no_data(self, http_method):
"""Tests that only given request works (but with options, head, delete requests)"""
r = http_method(self.res)
assert r.status_code == 405
def test_get_working(self, example_data):
"""tests that get works with example data"""
curt = example_data["data"][4]
requests.post(f"{SERVER}/send", data=example_data)
r = requests.get(f"{SERVER}/get")
loaded = json.loads(r.text)
assert r.status_code == 200 and loaded["message"] == "OK"
assert loaded["data"] == f'Got: "1;2;3;4;{curt}"'
time.sleep(1)
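# The literal strings these tests compare against all follow one pattern; a
# sketch of how the expected reply could be built (inferred from the
# assertions above, not taken from the COM-Server demo firmware):

```python
def expected_echo(data, concatenate):
    # The demo device appears to echo back the posted fields joined by the
    # 'concatenate' character, wrapped as 'Got: "..."'.
    joined = concatenate.join(str(x) for x in data)
    return f'Got: "{joined}"'

assert expected_echo([1, 2, 3, 4], ";") == 'Got: "1;2;3;4"'
```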

# python_code/hash_chapter3_class_impl_test.py (possnfiffer/inside_python_dict, MIT)
import unittest
from common import DUMMY, EMPTY
from hash_chapter3_class_impl import AlmostPythonDictImplementationRecycling, AlmostPythonDictImplementationNoRecycling
class HashDictImplementationTest(unittest.TestCase):
def test_handcrafted(self):
d = AlmostPythonDictImplementationRecycling()
self.assertEqual(len(d.slots), 8)
def assert_contains(i, h, k, v):
self.assertEqual(d.slots[i].hash_code, h)
self.assertEqual(d.slots[i].key, k)
self.assertEqual(d.slots[i].value, v)
d[""] = 1
d[17] = 2
d[18] = 3
self.assertEqual(d[""], 1)
self.assertEqual(d[17], 2)
self.assertEqual(d[18], 3)
assert_contains(0, 0, "", 1)
assert_contains(1, 17, 17, 2)
assert_contains(2, 18, 18, 3)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 3)
with self.assertRaises(KeyError):
del d[1]
del d[17]
assert_contains(1, 17, DUMMY, EMPTY)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 2)
# hash("abcd") % 8 == 0
# py 3.2 hash()
d["abcd"] = 4
self.assertEqual(d["abcd"], 4)
assert_contains(1, -2835746963027601024, "abcd", 4)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 3)
d["abcd"] = 5
self.assertEqual(d["abcd"], 5)
assert_contains(1, -2835746963027601024, "abcd", 5)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 3)
del d["abcd"]
with self.assertRaises(KeyError):
d["abcd"]
d[15] = 6
d[14] = 7
assert_contains(7, 15, 15, 6)
assert_contains(6, 14, 14, 7)
self.assertEqual(len(d.slots), 8)
self.assertEqual(d.fill, 5)
self.assertEqual(d.used, 4)
d[13] = 8
self.assertEqual(len(d.slots), 16)
self.assertEqual(d.fill, 5)
self.assertEqual(d.used, 5)
assert_contains(0, 0, "", 1)
assert_contains(2, 18, 18, 3)
assert_contains(13, 13, 13, 8)
assert_contains(14, 14, 14, 7)
assert_contains(15, 15, 15, 6)
def test_handcrafted_simple_setitem(self):
d = AlmostPythonDictImplementationNoRecycling()
self.assertEqual(len(d.slots), 8)
def assert_contains(i, h, k, v):
self.assertEqual(d.slots[i].hash_code, h)
self.assertEqual(d.slots[i].key, k)
self.assertEqual(d.slots[i].value, v)
d[""] = 1
d[17] = 2
d[18] = 3
self.assertEqual(d[""], 1)
self.assertEqual(d[17], 2)
self.assertEqual(d[18], 3)
assert_contains(0, 0, "", 1)
assert_contains(1, 17, 17, 2)
assert_contains(2, 18, 18, 3)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 3)
with self.assertRaises(KeyError):
del d[1]
del d[17]
assert_contains(1, 17, DUMMY, EMPTY)
self.assertEqual(d.fill, 3)
self.assertEqual(d.used, 2)
# hash("abcd") % 8 == 0
# py 3.2 hash()
d["abcd"] = 4
self.assertEqual(d["abcd"], 4)
assert_contains(3, -2835746963027601024, "abcd", 4)
self.assertEqual(d.fill, 4)
self.assertEqual(d.used, 3)
d["abcd"] = 5
self.assertEqual(d["abcd"], 5)
assert_contains(3, -2835746963027601024, "abcd", 5)
self.assertEqual(d.fill, 4)
self.assertEqual(d.used, 3)
del d["abcd"]
with self.assertRaises(KeyError):
d["abcd"]
self.assertEqual(len(d.slots), 8)
self.assertEqual(d.fill, 4)
self.assertEqual(d.used, 2)
d[15] = 6
self.assertEqual(len(d.slots), 8)
self.assertEqual(d.fill, 5)
self.assertEqual(d.used, 3)
assert_contains(7, 15, 15, 6)
d[13] = 8
self.assertEqual(len(d.slots), 16)
self.assertEqual(d.fill, 4)
self.assertEqual(d.used, 4)
assert_contains(0, 0, "", 1)
assert_contains(2, 18, 18, 3)
assert_contains(13, 13, 13, 8)
assert_contains(15, 15, 15, 6)
def main():
unittest.main()
if __name__ == "__main__":
main()
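# For orientation, a heavily stripped-down sketch of the open-addressing idea
# the AlmostPythonDict* classes implement (an assumption for illustration:
# only the probing loop is shown; the real classes also track fill/used,
# DUMMY tombstones, and resize at a threshold, as the assertions exercise):

```python
class MiniDict:
    """Linear-probing toy: no deletion, no tombstones, no resizing."""

    def __init__(self, size=8):
        self.slots = [None] * size  # None marks an empty slot

    def _probe(self, key):
        # Start at hash(key) % table size and walk forward until we hit
        # either an empty slot or the slot already holding this key.
        i = hash(key) % len(self.slots)
        while self.slots[i] is not None and self.slots[i][0] != key:
            i = (i + 1) % len(self.slots)
        return i

    def __setitem__(self, key, value):
        self.slots[self._probe(key)] = (key, value)

    def __getitem__(self, key):
        slot = self.slots[self._probe(key)]
        if slot is None:
            raise KeyError(key)
        return slot[1]
```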

# app/baremetal_service/repository/service_model.py (21vcloud/Controller, Apache-2.0)
# This is an auto-generated Django model module.
# You'll have to do the following manually to clean this up:
# * Rearrange models' order
# * Make sure each model has one field with primary_key=True
# * Make sure each ForeignKey has `on_delete` set to the desired behavior.
# * Remove `managed = False` lines if you wish to allow Django to create, modify, and delete the table
# Feel free to rename the models, but don't rename db_table values or field names.
import uuid
from django.db import models
from django.db.models.fields.related import ManyToManyField
from common.lark_common import random_object_id
from account.repository.auth_models import BmContract, BmProject
ORDER_TYPE_INCREASE = "increase"
ORDER_TYPE_ALTERATION = "alteration"
ORDER_STATUS_DELIVERED = "delivered"
ORDER_STATUS_DELIVERING = "delivering"
ORDER_TYPE_DICT = {"increase": "新购", "alteration": "变更"}
FLAVOR_ACTIVE = "active"
VOLUME_ACTIVE = "active"
VOLUME_BACKUP_ACTIVE = "active"
FLOATINGIP_ACTIVE = "active"
LB_ERROR = "ERROR"
resource_type_volume = "volume"
resource_type_volume_backup = "volume_backup"
resource_type_flaoting_ip = "flaoting_ip"
resource_type_ecbm = "ECBM"
resource_contract_info = "contract_info"
FLOATINGIP_AATTACH_ECBM = "ECBM"
FLOATINGIP_AATTACH_SLB = "LoadBalance"
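# A hedged usage sketch of the ORDER_TYPE_DICT mapping above (the helper name
# is illustrative, not part of this module): translating an order-type code
# into its display label, falling back to the raw code for unknown values.

```python
ORDER_TYPE_DICT = {"increase": "新购", "alteration": "变更"}

def order_type_label(order_type):
    # Unknown codes pass through unchanged rather than raising KeyError.
    return ORDER_TYPE_DICT.get(order_type, order_type)

assert order_type_label("increase") == "新购"
assert order_type_label("refund") == "refund"
```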
class BmServiceOrder(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_bm_service_order_code)
account_id = models.CharField(max_length=64, blank=True, null=True,
help_text="客户id 示例:3f4cb35aeec544d3af33150f38b55286")
contract_number = models.CharField(max_length=64, blank=True, null=True, help_text="合同编码 示例:ding11")
project = models.ForeignKey(BmProject, models.DO_NOTHING, blank=True, null=True, help_text="项目 示例:test")
# contract_number = models.CharField(max_length=64, blank=True, null=True)
# project_id = models.CharField(max_length=64, blank=True, null=True)
# project_name = models.CharField(max_length=64, blank=True, null=True)
region = models.CharField(max_length=64, blank=True, null=True, help_text="可用区 示例:华北区")
order_type = models.CharField(max_length=64, blank=True, null=True, default=ORDER_TYPE_INCREASE,
help_text="订单类型 示例:increase")
order_price = models.FloatField(blank=True, null=True, help_text="订单价格 示例:1300")
product_type = models.CharField(max_length=64, blank=True, null=True, help_text="产品类型 示例:ECBM")
product_info = models.TextField(blank=True, null=True, help_text="产品详细信息")
billing_model = models.CharField(max_length=64, blank=True, null=True, help_text="绑定的合同类型 示例:框架合同 ")
service_count = models.IntegerField(blank=True, null=True, help_text="所购买的服务数量 示例:3")
delivery_status = models.CharField(max_length=64, blank=True, null=True, help_text="")
create_at = models.DateTimeField(blank=True, null=True, help_text="")
update_at = models.DateTimeField(blank=True, null=True, help_text="")
deleted = models.CharField(max_length=11, blank=True, null=True, help_text="")
class Meta:
managed = False
db_table = 'bm_service_order'
class BmServiceMachine(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
order = models.ForeignKey('BmServiceOrder', models.DO_NOTHING, blank=True, null=True)
uuid = models.CharField(max_length=64, blank=True, null=True)
flavor_name = models.CharField(max_length=255, blank=True, null=True, help_text="机器名称 示例:戴尔")
flavor_id = models.CharField(max_length=64, blank=True, null=True,
help_text="机器id 示例:11a2c533-73cc-4f95-8e7b-0055b7ec18a7")
image_name = models.CharField(max_length=255, blank=True, null=True, help_text="镜像名称 示例:Windows2016")
image_id = models.CharField(max_length=64, blank=True, null=True,
help_text="镜像id 示例:198e0048-c8b2-4db9-9f08-395ea005af21")
monitoring = models.CharField(max_length=11, blank=True, null=True, help_text="是否携带监控 示例:True/False")
vulnerability_scanning = models.CharField(max_length=11, blank=True, null=True, help_text="是否携带扫描 示例:True/False")
disk_info = models.TextField(blank=True, null=True, help_text="磁盘信息")
network_path_type = models.CharField(max_length=64, blank=True, null=True, help_text="网络类型")
network = models.CharField(max_length=64, blank=True, null=True, help_text="网络名称")
network_id = models.CharField(max_length=64, blank=True, null=True, help_text="网络id")
floating_ip_info = models.TextField(blank=True, null=True, help_text="弹性公网IP信息")
floating_ip_allocation = models.BooleanField(max_length=64, blank=True, null=True, help_text="弹性公网IP可用量")
floating_ip_bandwidth = models.CharField(max_length=64, blank=True, null=True, help_text="弹性公网IP带宽")
floating_ip_line = models.CharField(max_length=64, blank=True, null=True, help_text="弹性公网IP类型")
firewall_id = models.CharField(max_length=64, blank=True, null=True, help_text="防火墙id")
firewall_name = models.CharField(max_length=64, blank=True, null=True, help_text="防火墙名称")
service_name = models.CharField(max_length=64, blank=True, null=True, help_text="所生成的服务器名称")
login_method = models.CharField(max_length=64, blank=True, null=True,
help_text="登陆方式(user_password-密码登陆)(keypair-密钥登陆)")
service_username = models.CharField(max_length=64, blank=True, null=True, help_text="所生成服务器的登录名")
service_password = models.CharField(max_length=64, blank=True, null=True, help_text="所生成服务器的密码")
public_key = models.TextField(blank=True, null=True, help_text="公钥信息")
create_at = models.DateTimeField(blank=True, null=True, help_text="服务创建时间")
update_at = models.DateTimeField(blank=True, null=True, help_text="服务更新时间")
deleted = models.CharField(max_length=11, blank=True, null=True, help_text="服务删除时间")
status = models.CharField(max_length=64, blank=True, null=True, help_text="状态(active-激活)(DELETED-删除)")
job_model = models.TextField(blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_machine'
class BmRequestLog(models.Model):
account_id = models.CharField(max_length=64, blank=True, null=True)
account_name = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
object_id = models.CharField(max_length=64, blank=True, null=True)
object_name = models.CharField(max_length=255, blank=True, null=True)
object_type = models.CharField(max_length=255, blank=True, null=True)
action = models.CharField(max_length=255, blank=True, null=True)
uri = models.CharField(max_length=255, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
request_info = models.TextField(blank=True, null=True)
extra = models.TextField(blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_request_log'
class BmServiceInstance(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=64, blank=True, null=True,
help_text="客户id 示例:3f4cb35aeec544d3af33150f38b55286")
project_id = models.CharField(max_length=64, blank=True, null=True,
help_text="项目id 示例:7ae5a60714014778baddea703b85cd93")
region = models.CharField(max_length=64, blank=True, null=True, help_text="区域 示例:regionOne")
    monitoring = models.CharField(max_length=11, blank=True, null=True, help_text="是否带监控 示例:True")
contract_number = models.CharField(max_length=64, blank=True, null=True, help_text="合同编号 示例:ding111")
product_type = models.CharField(max_length=64, blank=True, null=True, help_text="产品类型 示例:ECBM")
billing_model = models.CharField(max_length=64, blank=True, null=True, help_text="绑定合同类型 示例:标准合同")
uuid = models.CharField(unique=True, max_length=64, help_text="对应机器id 示例:c4130c54-bc4b-4249-928d-c014827653db")
name = models.CharField(max_length=255, help_text="机器名称", blank=True, null=True)
status = models.CharField(max_length=64, blank=True, null=True, help_text="状态 示例:active/DELETED")
task = models.CharField(max_length=64, blank=True, null=True, help_text="实例创建结果 示例:success/instance_build")
create_at = models.DateTimeField(blank=True, null=True, help_text="实例创建时间")
update_at = models.DateTimeField(blank=True, null=True, help_text="实例更新时间")
deleted = models.NullBooleanField(blank=True, null=True, default=False, help_text="实例删除时间")
class Meta:
managed = False
db_table = 'bm_service_instance'
class BmServiceFloatingIp(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id, help_text="")
order_id = models.CharField(max_length=64, blank=True, null=True)
account_id = models.CharField(max_length=64, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
contract_number = models.CharField(max_length=64, blank=True, null=True)
external_line_type = models.CharField(max_length=64, blank=True, null=True, help_text="进出网络类型 "
"示例:three_line_ip")
external_name = models.CharField(max_length=255, blank=True, null=True)
external_name_id = models.CharField(max_length=255, blank=True, null=True)
floating_ip = models.CharField(max_length=64, blank=True, null=True)
floating_ip_id = models.CharField(max_length=64, blank=True, null=True)
attached = models.NullBooleanField(blank=True, null=True, default=False)
instance_uuid = models.CharField(max_length=64, blank=True, null=True)
instance_name = models.CharField(max_length=255, blank=True, null=True)
fixed_address = models.CharField(max_length=255, blank=True, null=True)
shared_qos_policy_type = models.BooleanField(default=False)
    qos_policy_name = models.CharField(max_length=64, blank=True, null=True, help_text="Bandwidth size, e.g. 100M")
qos_policy_id = models.CharField(max_length=64, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
first_create_at = models.DateTimeField(blank=True, null=True)
status = models.CharField(max_length=64, blank=True, null=True)
is_measure_end = models.NullBooleanField(blank=True, null=True, default=False)
attached_type = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_floating_ip'
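Several models above take `random_object_id.gen_random_object_id` as their primary-key default. That helper is defined outside this file; as a purely hypothetical stand-in (the real implementation may differ), a MongoDB-ObjectId-style generator that yields a 24-character hex string, well inside `max_length=64`, could look like:

```python
import binascii
import os
import time


def gen_random_object_id():
    """Return a 24-character hex id in MongoDB ObjectId style:
    a 4-byte big-endian timestamp followed by 8 random bytes.
    Hypothetical sketch of random_object_id.gen_random_object_id."""
    timestamp = int(time.time()).to_bytes(4, "big")
    rand = os.urandom(8)
    return binascii.hexlify(timestamp + rand).decode("ascii")
```

The timestamp prefix keeps ids roughly sortable by creation time, while the random suffix makes collisions between concurrent callers vanishingly unlikely.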
class BmContractFloatingIpMaterial(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
material_number = models.CharField(max_length=64, blank=True, null=True)
floating_ip_type_name = models.CharField(max_length=255, blank=True, null=True)
floating_ip_type = models.CharField(max_length=255, blank=True, null=True)
external_line_type = models.CharField(max_length=255, blank=True, null=True)
charge_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.DecimalField(max_digits=10, decimal_places=2, blank=True, null=True)
floating_ip_available = models.IntegerField(blank=True, null=True)
floating_ip_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_floating_ip'
class BmContractBandWidthMaterial(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
material_number = models.CharField(max_length=64, blank=True, null=True)
band_width_type_name = models.CharField(max_length=255, blank=True, null=True)
band_width_type = models.CharField(max_length=255, blank=True, null=True)
external_line_type = models.CharField(max_length=255, blank=True, null=True)
charge_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.DecimalField(max_digits=10, decimal_places=2, blank=True, null=True)
band_width_available = models.IntegerField(blank=True, null=True)
band_width_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_band_width'
class BmServiceLoadbalanceFlavor(models.Model):
id = models.CharField(primary_key=True, max_length=64)
type = models.CharField(max_length=255, blank=True, null=True)
max_connect = models.CharField(max_length=255, blank=True, null=True)
new_connetc = models.CharField(max_length=255, blank=True, null=True)
second_query = models.CharField(max_length=255, blank=True, null=True)
openstack_name = models.CharField(max_length=255, blank=True, null=True)
memory = models.CharField(max_length=255, blank=True, null=True)
disk = models.CharField(max_length=255, blank=True, null=True)
vcpus = models.CharField(max_length=255, blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_loadbalance_flavor'
class BmServiceMaterialVolume(models.Model):
id = models.CharField(primary_key=True, max_length=64)
material_number = models.CharField(max_length=64, blank=True, null=True)
volume_type_name = models.CharField(max_length=255, blank=True, null=True)
volume_type = models.CharField(max_length=255, blank=True, null=True)
openstack_volume_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.FloatField(blank=True, null=True)
volume_available = models.IntegerField(blank=True, null=True)
volume_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_volume'
class BmServiceMaterialNat(models.Model):
id = models.CharField(primary_key=True, max_length=64)
material_number = models.CharField(max_length=64, blank=True, null=True)
nat_getway_type_name = models.CharField(max_length=255, blank=True, null=True)
charge_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.FloatField(blank=True, null=True)
nat_getway_available = models.IntegerField(blank=True, null=True)
nat_getway_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
nat_getway_type = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_nat'
class BmServiceMaterialLb(models.Model):
id = models.CharField(primary_key=True, max_length=64)
material_number = models.CharField(max_length=64, blank=True, null=True)
lb_type_name = models.CharField(max_length=255, blank=True, null=True)
charge_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.FloatField(blank=True, null=True)
lb_available = models.IntegerField(blank=True, null=True)
lb_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
lb_type = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_lb'
class BmServiceMaterialBandWidth(models.Model):
id = models.CharField(primary_key=True, max_length=64)
material_number = models.CharField(max_length=64, blank=True, null=True)
band_width_type_name = models.CharField(max_length=255, blank=True, null=True)
band_width_type = models.CharField(max_length=255, blank=True, null=True)
external_line_type = models.CharField(max_length=255, blank=True, null=True)
charge_type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.FloatField(blank=True, null=True)
band_width_available = models.IntegerField(blank=True, null=True)
band_width_capacity = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_material_band_width'
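Note a schema inconsistency in the material tables above: `base_price` is a `DecimalField(max_digits=10, decimal_places=2)` on the contract material models but a `FloatField` on the `BmServiceMaterial*` models. For currency values the decimal form is the safer choice, because binary floats cannot represent most decimal fractions exactly; a minimal stdlib illustration:

```python
from decimal import Decimal

# Summing prices with binary floats accumulates rounding error, which is
# why DecimalField(max_digits=10, decimal_places=2) is preferable to
# FloatField for a base_price column.
float_total = 0.1 + 0.2                        # 0.30000000000000004
exact_total = Decimal("0.1") + Decimal("0.2")  # Decimal('0.3')
```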
class BmMaterialFloatingIpSeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
date = models.DateTimeField(blank=True, null=True)
band_material_number = models.CharField(max_length=255, blank=True, null=True)
band_width_type_name = models.CharField(max_length=255, blank=True, null=True)
material_band_base_price = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
floating_ip_type_name = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
service_expired_time = models.CharField(max_length=255, blank=True, null=True)
service_start_time = models.CharField(max_length=255, blank=True, null=True)
contract_start_date = models.CharField(max_length=255, blank=True, null=True)
contract_expire_date = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer_account = models.CharField(max_length=255, blank=True, null=True)
external_line_type = models.CharField(max_length=255, blank=True, null=True)
external_name = models.CharField(max_length=255, blank=True, null=True)
external_name_id = models.CharField(max_length=255, blank=True, null=True)
floating_ip = models.CharField(max_length=255, blank=True, null=True)
floating_ip_id = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
status = models.CharField(max_length=255, blank=True, null=True)
max_band = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_floating_ip_seasonal'
class BmMaterialVolumeSeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
create_at = models.CharField(max_length=255, blank=True, null=True)
name = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
size = models.CharField(max_length=255, blank=True, null=True)
end_at = models.CharField(max_length=255, blank=True, null=True)
volume_id = models.CharField(max_length=255, blank=True, null=True)
volume_type = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
account = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
volume_type_name = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
date = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_volume_seasonal'
class BmMaterialLbSeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
create_at = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
delete_at = models.CharField(max_length=255, blank=True, null=True)
ip_adress = models.CharField(max_length=255, blank=True, null=True)
loadbalance_id = models.CharField(max_length=255, blank=True, null=True)
loadbalance_name = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
account = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
lb_type_name = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
date = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_lb_seasonal'
class BmMaterialNetGetwaySeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
create_at = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
delete_at = models.CharField(max_length=255, blank=True, null=True)
net_getway_id = models.CharField(max_length=255, blank=True, null=True)
net_getway_name = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
account = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
date = models.CharField(max_length=255, blank=True, null=True)
nat_getway_type_name = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_net_getway_seasonal'
class BmMaterialMachineSeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
create_at = models.CharField(max_length=255, blank=True, null=True)
flavor_name = models.CharField(max_length=255, blank=True, null=True)
image_name = models.CharField(max_length=255, blank=True, null=True)
monitoring = models.CharField(max_length=255, blank=True, null=True)
vulnerability_scanning = models.CharField(max_length=255, blank=True, null=True)
network = models.CharField(max_length=255, blank=True, null=True)
network_path_type = models.CharField(max_length=255, blank=True, null=True)
service_name = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
product_type = models.CharField(max_length=255, blank=True, null=True)
service_count = models.CharField(max_length=255, blank=True, null=True)
delete_at = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
account = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
type = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
flavor_info = models.CharField(max_length=255, blank=True, null=True)
cpu_model = models.CharField(max_length=255, blank=True, null=True)
cpu_core = models.CharField(max_length=255, blank=True, null=True)
cpu_hz = models.CharField(max_length=255, blank=True, null=True)
ram = models.CharField(max_length=255, blank=True, null=True)
disk = models.CharField(max_length=255, blank=True, null=True)
date = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_machine_seasonal'
class BmMaterialVolumeBakSeasonal(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
account_id = models.CharField(max_length=255, blank=True, null=True)
user_account = models.CharField(max_length=255, blank=True, null=True)
identity_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
create_at = models.CharField(max_length=255, blank=True, null=True)
backup_name = models.CharField(max_length=255, blank=True, null=True)
order_id = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=255, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
service_count = models.CharField(max_length=255, blank=True, null=True)
delete_time = models.CharField(max_length=255, blank=True, null=True)
volume_id = models.CharField(max_length=255, blank=True, null=True)
contract_customer_name = models.CharField(max_length=255, blank=True, null=True)
contract_authorizer = models.CharField(max_length=255, blank=True, null=True)
contract_type = models.CharField(max_length=255, blank=True, null=True)
sales = models.CharField(max_length=255, blank=True, null=True)
enterprise_number = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=255, blank=True, null=True)
customer_service_name = models.CharField(max_length=255, blank=True, null=True)
account = models.CharField(max_length=255, blank=True, null=True)
material_number = models.CharField(max_length=255, blank=True, null=True)
volume_type_name = models.CharField(max_length=255, blank=True, null=True)
base_price = models.CharField(max_length=255, blank=True, null=True)
date = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_material_volume_bak_seasonal'
class BmContractFloatingIp(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
order_id = models.CharField(max_length=64, blank=True, null=True)
account_id = models.CharField(max_length=64, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
contract_number = models.CharField(max_length=64, blank=True, null=True)
external_line_type = models.CharField(max_length=64, blank=True, null=True)
external_name = models.CharField(max_length=255, blank=True, null=True)
external_name_id = models.CharField(max_length=255, blank=True, null=True)
floating_ip = models.CharField(max_length=64, blank=True, null=True)
floating_ip_id = models.CharField(max_length=64, blank=True, null=True)
attached = models.NullBooleanField(blank=True, null=True, default=False)
instance_uuid = models.CharField(max_length=64, blank=True, null=True)
instance_name = models.CharField(max_length=255, blank=True, null=True)
fixed_address = models.CharField(max_length=255, blank=True, null=True)
qos_policy_name = models.CharField(max_length=64, blank=True, null=True)
qos_policy_id = models.CharField(max_length=64, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
status = models.CharField(max_length=64, blank=True, null=True)
is_measure_end = models.NullBooleanField(blank=True, null=True, default=False)
class Meta:
managed = False
db_table = 'bm_contract_floating_ip'
class BmServiceFlavor(models.Model):
id = models.CharField(primary_key=True, max_length=64)
material_number = models.CharField(max_length=64, blank=True, null=True)
name = models.CharField(max_length=64, blank=True, null=True)
openstack_flavor_name = models.CharField(max_length=64, blank=True, null=True)
type = models.CharField(max_length=64, blank=True, null=True)
type_name = models.CharField(max_length=64, blank=True, null=True)
resource_class = models.CharField(max_length=255, blank=True, null=True)
cpu_model = models.CharField(max_length=64, blank=True, null=True)
base_price = models.FloatField(blank=True, null=True)
cpu_core = models.CharField(max_length=64, blank=True, null=True)
cpu_hz = models.CharField(max_length=64, blank=True, null=True)
gpu = models.CharField(max_length=255, blank=True, null=True)
ram = models.CharField(max_length=64, blank=True, null=True)
disk = models.CharField(max_length=255, blank=True, null=True)
flavor_info = models.CharField(max_length=255, blank=True, null=True)
count = models.IntegerField(blank=True, null=True)
status = models.CharField(max_length=64, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_flavor'
class BmServiceImages(models.Model):
id = models.CharField(primary_key=True, max_length=64)
type = models.CharField(max_length=64, blank=True, null=True)
image_name = models.CharField(max_length=64, blank=True, null=True)
openstack_image_name = models.CharField(max_length=64, blank=True, null=True)
status = models.CharField(max_length=64, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_images'
class BmServiceQosPolicy(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
qos_policy_name = models.CharField(max_length=64, blank=True, null=True)
qos_policy_count = models.IntegerField(blank=True, null=True)
shared = models.IntegerField(blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
project_name = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_qos_policy'
class Networks(models.Model):
network_id = models.CharField(primary_key=True, max_length=64)
name = models.CharField(max_length=255, blank=True, null=True)
cidr = models.CharField(max_length=64, blank=True, null=True)
enable_dhcp = models.NullBooleanField(blank=True, null=True, default=True)
gateway_ip = models.CharField(max_length=64, blank=True, null=True)
    dns = models.TextField(blank=True, null=True)  # max_length is ignored by TextField at the DB level
vpc = models.ForeignKey('Vpc', on_delete=models.CASCADE)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
def to_dict(self):
opts = self._meta
data = {}
for f in opts.concrete_fields + opts.many_to_many:
if isinstance(f, ManyToManyField):
if self.pk is None:
data[f.name] = []
else:
data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
else:
data[f.name] = f.value_from_object(self)
return data
class Meta:
managed = False
db_table = 'bm_networks'
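The same `to_dict` body is repeated verbatim on `Networks`, `Vpc`, `FirewallRule`, `Firewalls` and `BmServiceVolumeBackup`. As a possible refactor (a sketch only, not wired into the models above), the logic could live in one mixin; it checks the field's `many_to_many` attribute rather than `isinstance(f, ManyToManyField)`, so it needs no extra import and duck-types cleanly:

```python
class ToDictMixin:
    """Serialize a model instance to a plain dict.

    Sketch of a shared replacement for the per-model to_dict() copies;
    it relies only on the field API used by those copies
    (concrete_fields, many_to_many, value_from_object).
    """

    def to_dict(self):
        opts = self._meta
        data = {}
        for f in opts.concrete_fields + opts.many_to_many:
            if getattr(f, "many_to_many", False):
                # M2M values need a saved row; fall back to [] before save.
                data[f.name] = [] if self.pk is None else list(
                    f.value_from_object(self).values_list("pk", flat=True)
                )
            else:
                data[f.name] = f.value_from_object(self)
        return data
```

Each model would then declare `class Networks(ToDictMixin, models.Model): ...` and drop its own copy of the method.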
class Vpc(models.Model):
vpc_name = models.CharField(max_length=255)
region = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
status = models.CharField(max_length=32, blank=True, null=True)
deleted = models.IntegerField(blank=True, null=True)
router_id = models.CharField(max_length=64, blank=True, null=True)
def to_dict(self):
opts = self._meta
data = {}
for f in opts.concrete_fields + opts.many_to_many:
if isinstance(f, ManyToManyField):
if self.pk is None:
data[f.name] = []
else:
data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
else:
data[f.name] = f.value_from_object(self)
return data
class Meta:
managed = False
db_table = 'bm_vpc'
unique_together = (('vpc_name', 'project_id'),)
class FirewallRule(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
firewall = models.ForeignKey('Firewalls', models.CASCADE, blank=True, null=True)
direction = models.CharField(max_length=255, blank=True, null=True)
action = models.CharField(max_length=255, blank=True, null=True)
protocol = models.CharField(max_length=255, blank=True, null=True)
remote_ip = models.CharField(max_length=255, blank=True, null=True)
    remote_port = models.TextField(blank=True, null=True)  # max_length is ignored by TextField at the DB level
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
def to_dict(self):
opts = self._meta
data = {}
for f in opts.concrete_fields + opts.many_to_many:
if isinstance(f, ManyToManyField):
if self.pk is None:
data[f.name] = []
else:
data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
else:
data[f.name] = f.value_from_object(self)
return data
class Meta:
managed = False
db_table = 'bm_firewall_rule'
class Firewalls(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
name = models.CharField(max_length=255, blank=True, null=True)
region = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
enabled = models.IntegerField(blank=True, null=True)
description = models.CharField(max_length=255, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
def to_dict(self):
opts = self._meta
data = {}
for f in opts.concrete_fields + opts.many_to_many:
if isinstance(f, ManyToManyField):
if self.pk is None:
data[f.name] = []
else:
data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
else:
data[f.name] = f.value_from_object(self)
return data
class Meta:
managed = False
db_table = 'bm_firewalls'
class BmServiceLoadbalance(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
order_id = models.CharField(max_length=64, blank=True, null=True)
account_id = models.CharField(max_length=64, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
contract_number = models.CharField(max_length=64, blank=True, null=True)
loadbalance_id = models.CharField(max_length=64, blank=True, null=True)
loadbalance_name = models.CharField(max_length=255, blank=True, null=True)
location = models.CharField(max_length=64, blank=True, null=True)
region = models.CharField(max_length=64, blank=True, null=True)
flavor_id = models.CharField(max_length=64, blank=True, null=True)
is_public = models.IntegerField(blank=True, null=True)
network_name = models.CharField(max_length=255, blank=True, null=True)
vip_network_id = models.CharField(max_length=255, blank=True, null=True)
vpc_id = models.CharField(max_length=64, blank=True, null=True)
created_at = models.DateTimeField(blank=True, null=True)
first_create_at = models.DateTimeField(blank=True, null=True)
updated_at = models.DateTimeField(blank=True, null=True)
listener_count = models.IntegerField(blank=True, null=True)
pool_count = models.IntegerField(blank=True, null=True)
deleted = models.IntegerField(blank=True, null=True)
is_measure_end = models.IntegerField(blank=True, null=True)
is_new_ip = models.IntegerField(blank=True, null=True)
error_type = models.CharField(max_length=255, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_loadbalance'
class BmNetworks(models.Model):
network_id = models.CharField(primary_key=True, max_length=64)
name = models.CharField(max_length=255, blank=True, null=True)
cidr = models.CharField(max_length=64, blank=True, null=True)
enable_dhcp = models.IntegerField(blank=True, null=True)
gateway_ip = models.CharField(max_length=64, blank=True, null=True)
dns = models.TextField(blank=True, null=True)
vpc = models.ForeignKey('BmVpc', models.DO_NOTHING)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_networks'
class BmVpc(models.Model):
vpc_name = models.CharField(max_length=255)
region = models.CharField(max_length=255, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
create_at = models.DateTimeField(blank=True, null=True)
update_at = models.DateTimeField(blank=True, null=True)
deleted_at = models.DateTimeField(blank=True, null=True)
status = models.CharField(max_length=32, blank=True, null=True)
deleted = models.IntegerField(blank=True, null=True)
router_id = models.CharField(max_length=64, blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_vpc'
unique_together = (('vpc_name', 'project_id'),)
class BmServiceVolume(models.Model):
id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
order_id = models.CharField(max_length=64, blank=True, null=True)
account_id = models.CharField(max_length=64, blank=True, null=True)
project_id = models.CharField(max_length=64, blank=True, null=True)
volume_type = models.CharField(max_length=255, blank=True, null=True)
volume_id = models.CharField(max_length=255, blank=True, null=True)
size = models.CharField(max_length=255, blank=True, null=True)
name = models.CharField(max_length=255, blank=True, null=True)
create_at = models.DateTimeField(null=True)
update_at = models.DateTimeField(null=True)
is_measure_end = models.IntegerField(blank=True, null=True, default=0)
region = models.CharField(max_length=255, blank=True, null=True)
attached_type = models.CharField(max_length=255, blank=True, null=True)
instance_uuid = models.CharField(max_length=64, blank=True, null=True)
instance_name = models.CharField(max_length=255, blank=True, null=True)
contract_number = models.CharField(max_length=255, blank=True, null=True)
first_create_at = models.DateTimeField(blank=True, null=True)
class Meta:
managed = False
db_table = 'bm_service_volume'
class BmFloatingipfirewallMapping(models.Model):
    floating_ip_id = models.CharField(max_length=64, blank=True, null=True)
    firewall = models.ForeignKey(Firewalls, models.CASCADE, blank=True, null=True)
    floating_ip = models.CharField(max_length=255, blank=True, null=True)

    class Meta:
        managed = False
        db_table = 'bm_floatingipfirewall_mapping'

class BmInstanceMemberMapping(models.Model):
    id = models.CharField(primary_key=True, max_length=64, default=random_object_id.gen_random_object_id)
    instance_id = models.CharField(max_length=255, blank=True, null=True)
    pool_id = models.CharField(max_length=255, blank=True, null=True)
    loadbancer_id = models.CharField(max_length=255, blank=True, null=True)
    member_id = models.CharField(max_length=255, blank=True, null=True)

    class Meta:
        managed = False
        db_table = 'bm_instance_member_mapping'

class BmServiceVolumeBackup(models.Model):
    order_id = models.CharField(max_length=64, blank=True, null=True)
    account_id = models.CharField(max_length=64, blank=True, null=True)
    project_id = models.CharField(max_length=64, blank=True, null=True)
    region = models.CharField(max_length=64, blank=True, null=True)
    backup_id = models.CharField(max_length=64, primary_key=True)
    volume_id = models.CharField(max_length=64, blank=True, null=True)
    volume_name = models.CharField(max_length=64, blank=True, null=True)
    is_incremental = models.BooleanField(default=False)
    create_at = models.DateTimeField(blank=True, null=True)
    update_at = models.DateTimeField(blank=True, null=True)
    deleted = models.NullBooleanField(blank=True, null=True, default=False)
    status = models.CharField(max_length=255, blank=True, null=True)
    backup_name = models.CharField(max_length=255, blank=True, null=True)
    contract_number = models.CharField(max_length=255, blank=True, null=True)
    is_measure_end = models.IntegerField(blank=True, null=True, default=0)

    def to_dict(self):
        """Serialize this instance to a plain dict; many-to-many fields
        become lists of related primary keys."""
        opts = self._meta
        data = {}
        for f in opts.concrete_fields + opts.many_to_many:
            if isinstance(f, ManyToManyField):
                if self.pk is None:
                    data[f.name] = []
                else:
                    data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
            else:
                data[f.name] = f.value_from_object(self)
        return data

    class Meta:
        managed = False
        db_table = 'bm_service_volume_backup'

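The `to_dict` helper above (repeated verbatim on `BmServiceNatGateway` and `NatRule` below) flattens a model instance into a plain dict, one entry per declared field. A minimal standalone sketch of the same pattern, using a dataclass with made-up field names instead of a Django model (the many-to-many branch is omitted, since these models declare no such fields):

```python
from dataclasses import dataclass, fields

@dataclass
class BackupSketch:
    # Illustrative stand-in for a model instance; field names are made up.
    backup_id: str
    volume_id: str
    is_incremental: bool = False

def to_dict(obj):
    # Same shape as the model method: one dict entry per declared field.
    return {f.name: getattr(obj, f.name) for f in fields(obj)}

print(to_dict(BackupSketch("b-1", "v-1")))
# {'backup_id': 'b-1', 'volume_id': 'v-1', 'is_incremental': False}
```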
class UUIDTools(object):
    @staticmethod
    def uuid4_hex():
        # return a uuid4 hex string
        return uuid.uuid4().hex

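`uuid4_hex` is used below as the primary-key default for `BmServiceNatGateway` and `BmShareBandWidth`. It returns `uuid.uuid4().hex`: 32 lowercase hex characters (the hyphen-free form of a UUID), which fits comfortably in the `max_length=36` columns. A quick standalone check of that shape:

```python
import uuid

def uuid4_hex():
    # Mirrors UUIDTools.uuid4_hex: a random UUID4 as a hyphen-free hex string.
    return uuid.uuid4().hex

key = uuid4_hex()
print(len(key))  # 32
```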
class BmServiceNatGateway(models.Model):
    id = models.CharField(primary_key=True, max_length=36, default=UUIDTools.uuid4_hex)
    order_id = models.CharField(max_length=64, blank=True, null=True)
    account_id = models.CharField(max_length=64, blank=True, null=True)
    project_id = models.CharField(max_length=64, blank=True, null=True)
    name = models.CharField(max_length=255, blank=True, null=True)
    vpc = models.ForeignKey('Vpc', models.DO_NOTHING, blank=True, null=True)
    contract_number = models.CharField(max_length=255, blank=True, null=True)
    is_measure_end = models.BooleanField(blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    deleted = models.BooleanField(blank=True, null=True, default=False)
    create_at = models.DateTimeField(blank=True, null=True)
    update_at = models.DateTimeField(blank=True, null=True)
    delete_at = models.DateTimeField(blank=True, null=True)

    def to_dict(self):
        opts = self._meta
        data = {}
        for f in opts.concrete_fields + opts.many_to_many:
            if isinstance(f, ManyToManyField):
                if self.pk is None:
                    data[f.name] = []
                else:
                    data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
            else:
                data[f.name] = f.value_from_object(self)
        return data

    class Meta:
        managed = False
        db_table = 'bm_nat_gateway'

class NatRule(models.Model):
    nat_rule_id = models.CharField(primary_key=True, max_length=36)
    nat_gateway = models.ForeignKey('BmServiceNatGateway', on_delete=models.PROTECT, related_name='nat_rule')
    floatingip_id = models.CharField(max_length=36)
    floating_ip_address = models.CharField(max_length=64, null=True, blank=True)
    scenes = models.CharField(max_length=64, blank=True, null=True, default='vpc')
    # max_length is not a valid IntegerField option (Django ignores it and
    # raises a system-check warning), so the port fields omit it.
    external_port = models.IntegerField(blank=True, null=True)
    protocol = models.CharField(max_length=36, blank=True, null=True)
    internal_ip_address = models.CharField(max_length=64, blank=True, null=True)
    internal_port_id = models.CharField(max_length=64, blank=True, null=True)
    internal_port = models.IntegerField(blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    create_at = models.DateTimeField(blank=True, null=True)

    def to_dict(self):
        opts = self._meta
        data = {}
        for f in opts.concrete_fields + opts.many_to_many:
            if isinstance(f, ManyToManyField):
                if self.pk is None:
                    data[f.name] = []
                else:
                    data[f.name] = list(f.value_from_object(self).values_list('pk', flat=True))
            else:
                data[f.name] = f.value_from_object(self)
        return data

    class Meta:
        managed = False
        db_table = 'bm_nat_rule'

class BmShareBandWidth(models.Model):
    id = models.CharField(primary_key=True, max_length=64, default=UUIDTools.uuid4_hex)
    shared_bandwidth_id = models.CharField(max_length=64)
    billing_type = models.CharField(max_length=64, blank=True, null=True)
    order_id = models.CharField(max_length=64, blank=True, null=True)
    account_id = models.CharField(max_length=64, blank=True, null=True)
    contract_id = models.CharField(max_length=255, blank=True, null=True)
    contract_number = models.CharField(max_length=255, blank=True, null=True)
    project_id = models.CharField(max_length=64, blank=True, null=True)
    name = models.CharField(max_length=255, blank=True, null=True)
    max_kbps = models.IntegerField(blank=True, null=True)
    create_at = models.DateTimeField(blank=True, null=True)
    update_at = models.DateTimeField(blank=True, null=True)
    is_measure_end = models.IntegerField(blank=True, null=True, default=0)
    first_create_at = models.DateTimeField(blank=True, null=True)
    status = models.CharField(max_length=64, blank=True, null=True)
    deleted_at = models.DateTimeField(blank=True, null=True)

    class Meta:
        managed = False
        db_table = 'bm_share_bandwidth'

class ShareBandWidthQuota(models.Model):
    project_id = models.CharField(unique=True, max_length=64)
    share_bandwidth_count = models.IntegerField(default=5)
    floating_ip_count = models.IntegerField(default=20)

    class Meta:
        managed = False
        db_table = 'share_bandwidth_quota'
# File: pyVmomi/_typeinfo_sso.py (repo: xweichu/pyvmomi, license: Apache-2.0)
] | null | null | null | # ******* WARNING - AUTO GENERATED CODE - DO NOT EDIT *******
from .VmomiSupport import CreateDataType, CreateManagedType
from .VmomiSupport import CreateEnumType
from .VmomiSupport import AddVersion, AddVersionParent
from .VmomiSupport import AddBreakingChangesInfo
from .VmomiSupport import F_LINK, F_LINKABLE
from .VmomiSupport import F_OPTIONAL, F_SECRET
from .VmomiSupport import newestVersions, ltsVersions
from .VmomiSupport import dottedVersions, oldestVersions
AddVersion("sso.version.version3", "sso", "version3", 0, "")
AddVersion("vmodl.version.version0", "", "", 0, "vim25")
AddVersion("sso.version.version3_1", "sso", "version3_1", 0, "")
AddVersion("sso.version.version3_2", "sso", "version3_2", 0, "")
AddVersion("sso.version.version1_5", "sso", "version1_5", 0, "")
AddVersion("sso.version.version2_5", "sso", "version2_5", 0, "")
AddVersion("sso.version.version3_5", "sso", "version3_5", 0, "")
AddVersion("sso.version.version1", "sso", "version1", 0, "")
AddVersion("sso.version.version2", "sso", "version2", 0, "")
AddVersionParent("sso.version.version3", "sso.version.version3")
AddVersionParent("sso.version.version3", "vmodl.version.version0")
AddVersionParent("sso.version.version3", "sso.version.version1_5")
AddVersionParent("sso.version.version3", "sso.version.version2_5")
AddVersionParent("sso.version.version3", "sso.version.version1")
AddVersionParent("sso.version.version3", "sso.version.version2")
AddVersionParent("vmodl.version.version0", "vmodl.version.version0")
AddVersionParent("sso.version.version3_1", "sso.version.version3")
AddVersionParent("sso.version.version3_1", "vmodl.version.version0")
AddVersionParent("sso.version.version3_1", "sso.version.version3_1")
AddVersionParent("sso.version.version3_1", "sso.version.version1_5")
AddVersionParent("sso.version.version3_1", "sso.version.version2_5")
AddVersionParent("sso.version.version3_1", "sso.version.version1")
AddVersionParent("sso.version.version3_1", "sso.version.version2")
AddVersionParent("sso.version.version3_2", "sso.version.version3")
AddVersionParent("sso.version.version3_2", "vmodl.version.version0")
AddVersionParent("sso.version.version3_2", "sso.version.version3_1")
AddVersionParent("sso.version.version3_2", "sso.version.version3_2")
AddVersionParent("sso.version.version3_2", "sso.version.version1_5")
AddVersionParent("sso.version.version3_2", "sso.version.version2_5")
AddVersionParent("sso.version.version3_2", "sso.version.version1")
AddVersionParent("sso.version.version3_2", "sso.version.version2")
AddVersionParent("sso.version.version1_5", "vmodl.version.version0")
AddVersionParent("sso.version.version1_5", "sso.version.version1_5")
AddVersionParent("sso.version.version1_5", "sso.version.version1")
AddVersionParent("sso.version.version2_5", "vmodl.version.version0")
AddVersionParent("sso.version.version2_5", "sso.version.version1_5")
AddVersionParent("sso.version.version2_5", "sso.version.version2_5")
AddVersionParent("sso.version.version2_5", "sso.version.version1")
AddVersionParent("sso.version.version2_5", "sso.version.version2")
AddVersionParent("sso.version.version3_5", "sso.version.version3")
AddVersionParent("sso.version.version3_5", "vmodl.version.version0")
AddVersionParent("sso.version.version3_5", "sso.version.version3_1")
AddVersionParent("sso.version.version3_5", "sso.version.version3_2")
AddVersionParent("sso.version.version3_5", "sso.version.version1_5")
AddVersionParent("sso.version.version3_5", "sso.version.version2_5")
AddVersionParent("sso.version.version3_5", "sso.version.version3_5")
AddVersionParent("sso.version.version3_5", "sso.version.version1")
AddVersionParent("sso.version.version3_5", "sso.version.version2")
AddVersionParent("sso.version.version1", "vmodl.version.version0")
AddVersionParent("sso.version.version1", "sso.version.version1")
AddVersionParent("sso.version.version2", "vmodl.version.version0")
AddVersionParent("sso.version.version2", "sso.version.version1_5")
AddVersionParent("sso.version.version2", "sso.version.version1")
AddVersionParent("sso.version.version2", "sso.version.version2")
newestVersions.Add("sso.version.version3_5")
ltsVersions.Add("sso.version.version3_5")
dottedVersions.Add("sso.version.version3_5")
oldestVersions.Add("sso.version.version1")
CreateDataType("sso.AboutInfo", "SsoAboutInfo", "vmodl.DynamicData", "sso.version.version1", [("version", "string", "sso.version.version1", 0), ("build", "string", "sso.version.version1", 0), ("apiRevision", "string", "sso.version.version1", 0), ("clusterId", "string", "sso.version.version1", 0), ("deploymentId", "string", "sso.version.version1", 0), ("ssoProductInfo", "string", "sso.version.version3_1", F_OPTIONAL), ("ssoProductVersionMajor", "string", "sso.version.version3_1", F_OPTIONAL), ("ssoProductVersionMinor", "string", "sso.version.version3_1", F_OPTIONAL), ("ssoProductVersionMaint", "string", "sso.version.version3_1", F_OPTIONAL)])
CreateDataType("sso.PrincipalId", "SsoPrincipalId", "vmodl.DynamicData", "sso.version.version1", [("name", "string", "sso.version.version1", 0), ("domain", "string", "sso.version.version1", 0)])
CreateManagedType("sso.SessionManager", "SsoSessionManager", "vmodl.ManagedObject", "sso.version.version1", [("defaultLocale", "string", "sso.version.version1", 0, "System.Anonymous"), ("supportedLocales", "string[]", "sso.version.version1", 0, "System.Anonymous")], [("login", "Login", "sso.version.version1", (), (0, "void", "void"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("logout", "Logout", "sso.version.version1", (), (0, "void", "void"), "System.Anonymous", ["vmodl.fault.InvalidRequest", ]), ("setLocale", "SetLocale", "sso.version.version1", (("locale", "string", "sso.version.version1", 0, None),), (0, "string", "string"), "System.Anonymous", None), ("getLocale", "GetLocale", "sso.version.version1", (), (0, "string", "string"), "System.Anonymous", None)])
CreateDataType("sso.admin.ActiveDirectoryJoinInfo", "SsoAdminActiveDirectoryJoinInfo", "vmodl.DynamicData", "sso.version.version1_5", [("joinStatus", "string", "sso.version.version1_5", 0), ("name", "string", "sso.version.version1_5", F_OPTIONAL), ("alias", "string", "sso.version.version1_5", F_OPTIONAL), ("dn", "string", "sso.version.version3_1", F_OPTIONAL)])
CreateEnumType("sso.admin.ActiveDirectoryJoinInfo.JoinStatus", "SsoAdminActiveDirectoryJoinInfoJoinStatus", "sso.version.version1_5", ["ACTIVE_DIRECTORY_JOIN_STATUS_UNKNOWN", "ACTIVE_DIRECTORY_JOIN_STATUS_WORKGROUP", "ACTIVE_DIRECTORY_JOIN_STATUS_DOMAIN"])
CreateDataType("sso.admin.AuthenticationAccountInfo", "SsoAdminAuthenticationAccountInfo", "vmodl.DynamicData", "sso.version.version1_5", [("userName", "string", "sso.version.version1_5", F_OPTIONAL), ("spn", "string", "sso.version.version1_5", F_OPTIONAL), ("useMachineAccount", "boolean", "sso.version.version1_5", 0)])
CreateDataType("sso.admin.AuthnPolicy", "SsoAdminAuthnPolicy", "vmodl.DynamicData", "sso.version.version3_2", [("PasswordAuthnEnabled", "boolean", "sso.version.version3_2", 0), ("WindowsAuthEnabled", "boolean", "sso.version.version3_2", 0), ("CertAuthEnabled", "boolean", "sso.version.version3_2", 0), ("clientCertPolicy", "sso.admin.ClientCertPolicy", "sso.version.version3_2", F_OPTIONAL)])
CreateManagedType("sso.admin.CertificateManager", "SsoAdminCertificateManager", "vmodl.ManagedObject", "sso.version.version1", None, [("addCertificate", "AddCertificate", "sso.version.version1", (("certificate", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getAllCertificates", "GetAllCertificates", "sso.version.version1", (), (F_OPTIONAL, "string[]", "string[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findCertificate", "FindCertificate", "sso.version.version1", (("fingerprint", "string", "sso.version.version1", 0, None),), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("deleteCertificate", "DeleteCertificate", "sso.version.version1", (("fingerprint", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.ClientCertPolicy", "SsoAdminClientCertPolicy", "vmodl.DynamicData", "sso.version.version3_2", [("enabled", "boolean", "sso.version.version3_2", 0), ("ocspEnabled", "boolean", "sso.version.version3_2", 0), ("useCRLAsFailOver", "boolean", "sso.version.version3_2", 0), ("sendOCSPNonce", "boolean", "sso.version.version3_2", 0), ("ocspUrl", "string", "sso.version.version3_2", F_OPTIONAL), ("ocspResponderSigningCert", "string", "sso.version.version3_2", F_OPTIONAL), ("useInCertCRL", "boolean", "sso.version.version3_2", 0), ("crlUrl", "string", "sso.version.version3_2", F_OPTIONAL), ("crlCacheSize", "int", "sso.version.version3_2", 0), ("oids", "string[]", "sso.version.version3_2", F_OPTIONAL), ("trustedCAs", "string[]", "sso.version.version3_2", F_OPTIONAL)])
CreateManagedType("sso.admin.ComputerManagementService", "SsoAdminComputerManagementService", "vmodl.ManagedObject", "sso.version.version3_1", None, [("getComputers", "GetComputers", "sso.version.version3_1", (("getDCOnly", "boolean", "sso.version.version3_1", 0, None),), (0, "sso.admin.VmHost[]", "sso.admin.VmHost[]"), "SystemConfiguration.Administrators", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateManagedType("sso.admin.ConfigurationManagementService", "SsoAdminConfigurationManagementService", "vmodl.ManagedObject", "sso.version.version1", None, [("getKnownCertificateChains", "GetKnownCertificateChains", "sso.version.version1", (), (0, "sso.admin.ConfigurationManagementService.CertificateChain[]", "sso.admin.ConfigurationManagementService.CertificateChain[]"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("getTrustedCertificates", "GetTrustedCertificates", "sso.version.version1", (), (0, "string[]", "string[]"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("getIssuersCertificates", "GetIssuersCertificates", "sso.version.version1_5", (), (0, "string[]", "string[]"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("importTrustedSTSConfiguration", "ImportTrustedSTSConfiguration", "sso.version.version2", (("stsConfig", "sso.admin.TrustedSTSConfig", "sso.version.version2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.ExternalSTSCertChainInvalidTrustedPathFault", "sso.admin.fault.ExternalSTSExtraneousCertsInCertChainFault", ]), ("removeTrustedSTSConfiguration", "RemoveTrustedSTSConfiguration", "sso.version.version2", (("issuerName", "string", "sso.version.version2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.ExternalSTSCertChainInvalidTrustedPathFault", "sso.admin.fault.ExternalSTSExtraneousCertsInCertChainFault", ]), ("importExternalIDPConfiguration", "ImportExternalIDPConfiguration", "sso.version.version2", (("externalIDPConfigDoc", "string", "sso.version.version2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.ExternalSTSCertChainInvalidTrustedPathFault", 
"sso.admin.fault.ExternalSTSExtraneousCertsInCertChainFault", ]), ("createExternalIDPConfiguration", "CreateExternalIDPConfiguration", "sso.version.version3", (("config", "sso.admin.IDPConfiguration", "sso.version.version3", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.admin.fault.CertChainInvalidTrustedPathFault", "sso.admin.fault.ExtraneousCertsInCertChainFault", ]), ("getExternalIDPConfiguration", "GetExternalIDPConfiguration", "sso.version.version3", (("entityID", "string", "sso.version.version3", 0, None),), (0, "sso.admin.IDPConfiguration", "sso.admin.IDPConfiguration"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.admin.fault.NoSuchConfigFault", ]), ("setExternalIDPConfiguration", "SetExternalIDPConfiguration", "sso.version.version3", (("config", "sso.admin.IDPConfiguration", "sso.version.version3", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.admin.fault.CertChainInvalidTrustedPathFault", "sso.admin.fault.ExtraneousCertsInCertChainFault", ]), ("deleteExternalIDPConfiguration", "DeleteExternalIDPConfiguration", "sso.version.version3", (("entityID", "string", "sso.version.version3", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.admin.fault.NoSuchConfigFault", ]), ("deleteExternalIDPConfigurationAndUsers", "DeleteExternalIDPConfigurationAndUsers", "sso.version.version3_5", (("entityID", "string", "sso.version.version3_5", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.admin.fault.NoSuchConfigFault", ]), ("enumerateExternalIDPEntityIDs", "EnumerateExternalIDPEntityIDs", "sso.version.version3", (), (F_OPTIONAL, "string[]", "string[]"), "Sso.AdminServer.Administer", None), ("getExternalIdpTrustedCertificateChains", 
"GetExternalIdpTrustedCertificateChains", "sso.version.version2", (), (0, "sso.admin.ConfigurationManagementService.CertificateChain[]", "sso.admin.ConfigurationManagementService.CertificateChain[]"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("getExternalIdpTrustedCertificateChain", "GetExternalIdpTrustedCertificateChain", "sso.version.version2", (("entityId", "string", "sso.version.version2", 0, None),), (0, "sso.admin.ConfigurationManagementService.CertificateChain", "sso.admin.ConfigurationManagementService.CertificateChain"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("deleteTrustedCertificate", "DeleteTrustedCertificate", "sso.version.version1_5", (("fingerprint", "string", "sso.version.version1_5", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.admin.fault.CertificateDeletionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setNewSignerIdentity", "SetNewSignerIdentity", "sso.version.version1", (("signingKey", "string", "sso.version.version1", F_SECRET, None),("signingCertificateChain", "sso.admin.ConfigurationManagementService.CertificateChain", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setSignerIdentity", "SetSignerIdentity", "sso.version.version1", (("adminUser", "sso.PrincipalId", "sso.version.version1", 0, None),("adminPass", "string", "sso.version.version1", F_SECRET, None),("signingKey", "string", "sso.version.version1", F_SECRET, None),("signingCertificateChain", "sso.admin.ConfigurationManagementService.CertificateChain", "sso.version.version1", 0, None),), (0, "void", "void"), None, ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getClockTolerance", "GetClockTolerance", "sso.version.version1", (), (0, "long", "long"), "System.Anonymous", 
["sso.fault.InvalidCredentials", ]), ("setClockTolerance", "SetClockTolerance", "sso.version.version1", (("milliseconds", "long", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getDelegationCount", "GetDelegationCount", "sso.version.version1", (), (0, "int", "int"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setDelegationCount", "SetDelegationCount", "sso.version.version1", (("delegationCount", "int", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("getRenewCount", "GetRenewCount", "sso.version.version1", (), (0, "int", "int"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setRenewCount", "SetRenewCount", "sso.version.version1", (("renewCount", "int", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getMaximumBearerTokenLifetime", "GetMaximumBearerTokenLifetime", "sso.version.version1", (), (0, "long", "long"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setMaximumBearerTokenLifetime", "SetMaximumBearerTokenLifetime", "sso.version.version1", (("maxLifetime", "long", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getMaximumHoKTokenLifetime", "GetMaximumHoKTokenLifetime", "sso.version.version1", (), (0, "long", "long"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setMaximumHoKTokenLifetime", 
"SetMaximumHoKTokenLifetime", "sso.version.version1", (("maxLifetime", "long", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getPasswordExpirationConfiguration", "GetPasswordExpirationConfiguration", "sso.version.version1", (), (0, "sso.admin.PasswordExpirationConfig", "sso.admin.PasswordExpirationConfig"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updatePasswordExpirationConfiguration", "UpdatePasswordExpirationConfiguration", "sso.version.version1", (("config", "sso.admin.PasswordExpirationConfig", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("importSAMLMetadata", "ImportSAMLMetadata", "sso.version.version2", (("samlConfigDoc", "string", "sso.version.version2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("deleteRelyingParty", "DeleteRelyingParty", "sso.version.version2", (("rpName", "string", "sso.version.version2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.NoSuchRelyingPartyFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getIssuerName", "GetIssuerName", "sso.version.version1_5", (), (0, "string", "string"), "System.Anonymous", None), ("isExternalIDPJitEnabled", "IsExternalIDPJitEnabled", "sso.version.version3_5", (("entityID", "string", "sso.version.version3_5", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.admin.fault.NoSuchConfigFault", ]), ("setExternalIDPJitAttribute", "SetExternalIDPJitAttribute", "sso.version.version3_5", (("entityID", "string", 
"sso.version.version3_5", 0, None),("enableJit", "boolean", "sso.version.version3_5", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", ]), ("setAuthnPolicy", "SetAuthnPolicy", "sso.version.version3_2", (("policy", "sso.admin.AuthnPolicy", "sso.version.version3_2", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", ]), ("getAuthnPolicy", "GetAuthnPolicy", "sso.version.version3_2", (), (F_OPTIONAL, "sso.admin.AuthnPolicy", "sso.admin.AuthnPolicy"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", ])])
CreateDataType("sso.admin.ConfigurationManagementService.CertificateChain", "SsoAdminConfigurationManagementServiceCertificateChain", "vmodl.DynamicData", "sso.version.version1", [("certificates", "string[]", "sso.version.version1", 0)])
CreateDataType("sso.admin.ConfigurationManagementService.AttributeConfig", "SsoAdminConfigurationManagementServiceAttributeConfig", "vmodl.DynamicData", "sso.version.version2", [("tokenAttribute", "string", "sso.version.version2", 0), ("storeAttribute", "string", "sso.version.version2", 0)])
CreateDataType("sso.admin.ConfigurationManagementService.TokenClaimAttribute", "SsoAdminConfigurationManagementServiceTokenClaimAttribute", "vmodl.DynamicData", "sso.version.version3_5", [("claimName", "string", "sso.version.version3_5", 0), ("claimValue", "string", "sso.version.version3_5", 0)])
CreateDataType("sso.admin.ConfigurationManagementService.TokenClaimGroupMapping", "SsoAdminConfigurationManagementServiceTokenClaimGroupMapping", "vmodl.DynamicData", "sso.version.version3_5", [("tokenClaim", "sso.admin.ConfigurationManagementService.TokenClaimAttribute", "sso.version.version3_5", 0), ("groups", "string[]", "sso.version.version3_5", 0)])
CreateManagedType("sso.admin.DeploymentInformationService", "SsoAdminDeploymentInformationService", "vmodl.ManagedObject", "sso.version.version1", None, [("isMultiSiteDeployment", "IsMultiSiteDeployment", "sso.version.version1", (), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.fault.InternalFault", ]), ("retrieveHaBackupConfigurationPackage", "RetrieveHaBackupConfigurationPackage", "sso.version.version1", (), (0, "vmodl.Binary", "vmodl.Binary"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.fault.InternalFault", ]), ("retrieveReplicaConfigurationPackage", "RetrieveReplicaConfigurationPackage", "sso.version.version1", (), (0, "vmodl.Binary", "vmodl.Binary"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.fault.InternalFault", "vmodl.fault.NotSupported", ])])
CreateDataType("sso.admin.Domain", "SsoAdminDomain", "vmodl.DynamicData", "sso.version.version1_5", [("name", "string", "sso.version.version1_5", 0), ("alias", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateManagedType("sso.admin.DomainManagementService", "SsoAdminDomainManagementService", "vmodl.ManagedObject", "sso.version.version1", None, [("probeConnectivity", "ProbeConnectivity", "sso.version.version1", (("serviceUri", "vmodl.URI", "sso.version.version1", 0, None),("authenticationType", "string", "sso.version.version1", 0, None),("authnCredentials", "sso.admin.DomainManagementService.AuthenticationCredentails", "sso.version.version1", F_OPTIONAL, None),("certificates", "string[]", "sso.version.version3_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("addExternalDomain", "AddExternalDomain", "sso.version.version1", (("serverType", "string", "sso.version.version1", 0, None),("domainName", "string", "sso.version.version1", 0, None),("domainAlias", "string", "sso.version.version1", F_OPTIONAL, None),("details", "sso.admin.ExternalDomainDetails", "sso.version.version1", 0, None),("authenticationType", "string", "sso.version.version1", 0, None),("authnCredentials", "sso.admin.DomainManagementService.AuthenticationCredentails", "sso.version.version1", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("registerLocalOSDomain", "RegisterLocalOSDomain", "sso.version.version1", (("domainName", "string", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.LocalOSDomainRegistrationFault", ]), ("getDomains", "GetDomains", "sso.version.version1", (), (0, "sso.admin.Domains", "sso.admin.Domains"), "System.Read", 
["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findExternalDomain", "FindExternalDomain", "sso.version.version1", (("name", "string", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.ExternalDomain", "sso.admin.ExternalDomain"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setBrandName", "SetBrandName", "sso.version.version2", (("brandName", "string", "sso.version.version2", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getBrandName", "GetBrandName", "sso.version.version2", (), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setLogonBanner", "SetLogonBanner", "sso.version.version3_1", (("logonBanner", "string", "sso.version.version3_1", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getLogonBanner", "GetLogonBanner", "sso.version.version3_1", (), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setLogonBannerTitle", "SetLogonBannerTitle", "sso.version.version3_2", (("logonBannerTitle", "string", "sso.version.version3_2", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getLogonBannerTitle", "GetLogonBannerTitle", "sso.version.version3_2", (), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setLogonBannerContent", "SetLogonBannerContent", "sso.version.version3_2", (("logonBannerContent", "string", 
"sso.version.version3_2", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getLogonBannerContent", "GetLogonBannerContent", "sso.version.version3_2", (), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setLogonBannerCheckboxFlag", "SetLogonBannerCheckboxFlag", "sso.version.version3_2", (("enableLogonBannerCheckbox", "boolean", "sso.version.version3_2", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getLogonBannerCheckboxFlag", "GetLogonBannerCheckboxFlag", "sso.version.version3_2", (), (F_OPTIONAL, "boolean", "boolean"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("disableLogonBanner", "DisableLogonBanner", "sso.version.version3_2", (), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("isLogonBannerDisabled", "IsLogonBannerDisabled", "sso.version.version3_2", (), (F_OPTIONAL, "boolean", "boolean"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSystemDomainName", "GetSystemDomainName", "sso.version.version1", (), (0, "string", "string"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("getSystemTenantName", "GetSystemTenantName", "sso.version.version2", (), (0, "string", "string"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("getLocalOSDomainName", "GetLocalOSDomainName", "sso.version.version1", (), (F_OPTIONAL, "string", "string"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateExternalDomainDetails", 
"UpdateExternalDomainDetails", "sso.version.version1", (("name", "string", "sso.version.version1", 0, None),("details", "sso.admin.ExternalDomainDetails", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateExternalDomainAuthnType", "UpdateExternalDomainAuthnType", "sso.version.version1", (("name", "string", "sso.version.version1", 0, None),("authnType", "string", "sso.version.version1", 0, None),("authnCredentials", "sso.admin.DomainManagementService.AuthenticationCredentails", "sso.version.version1", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("registerUpnSuffix", "RegisterUpnSuffix", "sso.version.version2_5", (("domainName", "string", "sso.version.version2_5", 0, None),("upnSuffix", "string", "sso.version.version2_5", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("unRegisterUpnSuffix", "UnRegisterUpnSuffix", "sso.version.version2_5", (("domainName", "string", "sso.version.version2_5", 0, None),("upnSuffix", "string", "sso.version.version2_5", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getUpnSuffixes", "GetUpnSuffixes", "sso.version.version2_5", (("domainName", "string", "sso.version.version2_5", 0, None),), (F_OPTIONAL, "string[]", "string[]"), "System.Read", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("deleteDomain", "DeleteDomain", 
"sso.version.version1", (("name", "string", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSslCertificateManager", "GetSslCertificateManager", "sso.version.version1", (), (0, "sso.admin.CertificateManager", "sso.admin.CertificateManager"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("setDefaultDomains", "SetDefaultDomains", "sso.version.version1", (("domainNames", "string[]", "sso.version.version1", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getDefaultDomains", "GetDefaultDomains", "sso.version.version1", (), (F_OPTIONAL, "string[]", "string[]"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSslIdentity", "GetSslIdentity", "sso.version.version1_5", (("host", "string", "sso.version.version1_5", 0, None),("ldapsPort", "int", "sso.version.version1_5", 0, None),), (F_OPTIONAL, "sso.admin.ConfigurationManagementService.CertificateChain", "sso.admin.ConfigurationManagementService.CertificateChain"), "System.Read", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.DomainManagementService.AuthenticationCredentails", "SsoAdminDomainManagementServiceAuthenticationCredentails", "vmodl.DynamicData", "sso.version.version1", [("username", "string", "sso.version.version1", 0), ("password", "string", "sso.version.version1", F_SECRET), ("useMachineAccount", "boolean", "sso.version.version1_5", F_OPTIONAL), ("spn", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.Domains", "SsoAdminDomains", "vmodl.DynamicData", "sso.version.version1", [("externalDomains", "sso.admin.ExternalDomain[]", "sso.version.version1", F_OPTIONAL), ("systemDomainName", "string", "sso.version.version1", 0), ("systemDomainUpnSuffixes", "string[]", "sso.version.version2_5", F_OPTIONAL), ("localOSDomainName", "string", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.ExternalDomain", "SsoAdminExternalDomain", "vmodl.DynamicData", "sso.version.version1", [("type", "string", "sso.version.version1", 0), ("name", "string", "sso.version.version1", 0), ("alias", "string", "sso.version.version1", F_OPTIONAL), ("details", "sso.admin.ExternalDomainDetails", "sso.version.version1", 0), ("authenticationDetails", "sso.admin.ExternalDomain.AuthenticationDetails", "sso.version.version1", 0)])
CreateEnumType("sso.admin.ExternalDomain.Type", "SsoAdminExternalDomainType", "sso.version.version1", ["ActiveDirectory", "OpenLdap", "NIS"])
CreateEnumType("sso.admin.ExternalDomain.AuthenticationType", "SsoAdminExternalDomainAuthenticationType", "sso.version.version1", ["anonymous", "password", "reuseSession"])
CreateDataType("sso.admin.ExternalDomain.AuthenticationDetails", "SsoAdminExternalDomainAuthenticationDetails", "vmodl.DynamicData", "sso.version.version1", [("authenticationType", "string", "sso.version.version1", 0), ("username", "string", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.ExternalDomainAttributeMapping", "SsoAdminExternalDomainAttributeMapping", "vmodl.DynamicData", "sso.version.version1_5", [("attributeId", "string", "sso.version.version1_5", 0), ("attributeName", "string", "sso.version.version1_5", 0)])
CreateEnumType("sso.admin.ExternalDomainAttributeMapping.AttributeIds", "SsoAdminExternalDomainAttributeMappingAttributeIds", "sso.version.version1_5", ["UserAttributeAccountName", "UserAttributeLastName", "UserAttributeFirstName", "UserAttributeDescription", "UserAttributeDisplayName", "UserAttributeEmail", "UserAttributeObjectId", "UserAttributePrincipalName", "UserAttributeAcountControl", "UserAttributeMemberOf", "UserAttributePrimaryGroupId", "UserAttributeLockoutTime", "UserAttributePasswordSettingsObject", "UserAttributePwdLastSet", "GroupAttributeAccountName", "GroupAttributeDescription", "GroupAttributeObjectId", "GroupAttributeMemberOf", "GroupAttributeMembersList", "PasswordSettingsAttributeMaximumPwdAge", "DomainAttributeMaxPwdAge"])
CreateDataType("sso.admin.ExternalDomainDetails", "SsoAdminExternalDomainDetails", "vmodl.DynamicData", "sso.version.version1", [("friendlyName", "string", "sso.version.version1", 0), ("userBaseDn", "string", "sso.version.version1", F_OPTIONAL), ("groupBaseDn", "string", "sso.version.version1", F_OPTIONAL), ("primaryUrl", "vmodl.URI", "sso.version.version1", 0), ("failoverUrl", "vmodl.URI", "sso.version.version1", F_OPTIONAL), ("searchTimeoutSeconds", "int", "sso.version.version1", 0), ("schemaDetails", "sso.admin.ExternalDomainSchemaDetails", "sso.version.version2", F_OPTIONAL), ("upnSuffixes", "string[]", "sso.version.version2_5", F_OPTIONAL)])
CreateDataType("sso.admin.ExternalDomainObjectMapping", "SsoAdminExternalDomainObjectMapping", "vmodl.DynamicData", "sso.version.version1_5", [("objectId", "string", "sso.version.version1_5", 0), ("objectClass", "string", "sso.version.version1_5", F_OPTIONAL), ("attributeMappings", "sso.admin.ExternalDomainAttributeMapping[]", "sso.version.version1_5", F_OPTIONAL)])
CreateEnumType("sso.admin.ExternalDomainObjectMapping.ObjectIds", "SsoAdminExternalDomainObjectMappingObjectIds", "sso.version.version1_5", ["ObjectIdUser", "ObjectIdGroup", "ObjectIdPasswordSettings", "ObjectIdDomain"])
CreateDataType("sso.admin.ExternalDomainSchemaDetails", "SsoAdminExternalDomainSchemaDetails", "vmodl.DynamicData", "sso.version.version1_5", [("objectMappings", "sso.admin.ExternalDomainObjectMapping[]", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.Group", "SsoAdminGroup", "vmodl.DynamicData", "sso.version.version1", [("id", "sso.PrincipalId", "sso.version.version1", 0), ("alias", "sso.PrincipalId", "sso.version.version1", F_OPTIONAL), ("details", "sso.admin.GroupDetails", "sso.version.version1", 0)])
CreateDataType("sso.admin.GroupDetails", "SsoAdminGroupDetails", "vmodl.DynamicData", "sso.version.version1", [("description", "string", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.IDPConfiguration", "SsoAdminIDPConfiguration", "vmodl.DynamicData", "sso.version.version3", [("entityID", "string", "sso.version.version3", 0), ("nameIDFormats", "string[]", "sso.version.version3", F_OPTIONAL), ("ssoServices", "sso.admin.ServiceEndpoint[]", "sso.version.version3", F_OPTIONAL), ("sloServices", "sso.admin.ServiceEndpoint[]", "sso.version.version3", F_OPTIONAL), ("signingCertificateChain", "string[]", "sso.version.version3", 0), ("subjectFormatMappings", "sso.admin.ConfigurationManagementService.AttributeConfig[]", "sso.version.version3", F_OPTIONAL), ("isJitEnabled", "boolean", "sso.version.version3_5", F_OPTIONAL), ("tokenClaimGroupMappings", "sso.admin.ConfigurationManagementService.TokenClaimGroupMapping[]", "sso.version.version3_5", F_OPTIONAL), ("upnSuffix", "string", "sso.version.version3_5", F_OPTIONAL)])
CreateDataType("sso.admin.IdentitySource", "SsoAdminIdentitySource", "vmodl.DynamicData", "sso.version.version1_5", [("name", "string", "sso.version.version1_5", 0), ("domains", "sso.admin.Domain[]", "sso.version.version1_5", 0)])
CreateManagedType("sso.admin.IdentitySourceManagementService", "SsoAdminIdentitySourceManagementService", "vmodl.ManagedObject", "sso.version.version1_5", None, [("registerLdap", "RegisterLdap", "sso.version.version1_5", (("serverType", "string", "sso.version.version1_5", 0, None),("domainName", "string", "sso.version.version1_5", 0, None),("domainAlias", "string", "sso.version.version1_5", F_OPTIONAL, None),("details", "sso.admin.LdapIdentitySourceDetails", "sso.version.version1_5", 0, None),("authenticationType", "string", "sso.version.version1_5", 0, None),("authnCredentials", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "sso.version.version1_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.DirectoryServiceConnectionFault", "sso.admin.fault.ADIDSAlreadyExistFault", "sso.admin.fault.InvalidProviderFault", ]), ("registerActiveDirectory", "RegisterActiveDirectory", "sso.version.version1_5", (("domainName", "string", "sso.version.version1_5", 0, None),("authnCredentials", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "sso.version.version1_5", 0, None),("schemaMapping", "sso.admin.ExternalDomainSchemaDetails", "sso.version.version1_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.ADIDSAlreadyExistFault", "sso.admin.fault.HostNotJoinedRequiredDomainFault", "sso.admin.fault.DomainManagerFault", "sso.fault.InvalidPrincipalFault", ]), ("registerLocalOS", "RegisterLocalOS", "sso.version.version1_5", (("name", "string", "sso.version.version1_5", 0, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DuplicateDomainNameFault", 
"sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.LocalOSDomainRegistrationFault", ]), ("get", "Get", "sso.version.version1_5", (), (0, "sso.admin.IdentitySources", "sso.admin.IdentitySources"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getActiveDirectoryAuthnAccountInfo", "GetActiveDirectoryAuthnAccountInfo", "sso.version.version1_5", (), (F_OPTIONAL, "sso.admin.AuthenticationAccountInfo", "sso.admin.AuthenticationAccountInfo"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSystemDomainName", "IdS_getSystemDomainName", "sso.version.version1_5", (), (0, "string", "string"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("updateLdap", "UpdateLdap", "sso.version.version1_5", (("name", "string", "sso.version.version1_5", 0, None),("details", "sso.admin.LdapIdentitySourceDetails", "sso.version.version1_5", 0, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.DirectoryServiceConnectionFault", "sso.admin.fault.InvalidProviderFault", ]), ("updateActiveDirectory", "UpdateActiveDirectory", "sso.version.version1_5", (("domainName", "string", "sso.version.version1_5", 0, None),("authnCredentials", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "sso.version.version1_5", 0, None),("schemaMapping", "sso.admin.ExternalDomainSchemaDetails", "sso.version.version1_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.admin.fault.DomainManagerFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.fault.InvalidPrincipalFault", "sso.admin.fault.DuplicateDomainNameFault", ]), 
("updateLdapAuthnType", "UpdateLdapAuthnType", "sso.version.version1_5", (("name", "string", "sso.version.version1_5", 0, None),("authnType", "string", "sso.version.version1_5", 0, None),("authnCredentials", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "sso.version.version1_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "sso.admin.fault.DirectoryServiceConnectionFault", ]), ("delete", "Delete", "sso.version.version1_5", (("name", "string", "sso.version.version1_5", 0, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("setDefaultDomains", "IdS_setDefaultDomains", "sso.version.version1_5", (("domainNames", "string[]", "sso.version.version1_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DomainNotFoundFault", "sso.admin.fault.DuplicateDomainNameFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getDefaultDomains", "IdS_getDefaultDomains", "sso.version.version1_5", (), (F_OPTIONAL, "string[]", "string[]"), "Sso.IdentitySource.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSslCertificateManager", "IdS_getSslCertificateManager", "sso.version.version1_5", (), (0, "sso.admin.CertificateManager", "sso.admin.CertificateManager"), "System.Anonymous", ["sso.fault.InvalidCredentials", ]), ("probeConnectivity", "IdS_probeConnectivity", "sso.version.version1_5", (("serviceUri", "vmodl.URI", "sso.version.version1_5", 0, None),("authenticationType", "string", "sso.version.version1_5", 0, None),("authnCredentials", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", 
"sso.version.version1_5", F_OPTIONAL, None),("certificates", "string[]", "sso.version.version3_5", F_OPTIONAL, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("probeLdapConnectivity", "IdS_probeLdapConnectivity", "sso.version.version3_5", (("domainName", "string", "sso.version.version3_5", 0, None),("authnCredential", "sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "sso.version.version3_5", 0, None),("identitySource", "sso.admin.LdapIdentitySource", "sso.version.version3_5", 0, None),), (0, "void", "void"), "Sso.IdentitySource.Administer", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getSslIdentity", "IdS_getSslIdentity", "sso.version.version1_5", (("host", "string", "sso.version.version1_5", 0, None),("ldapsPort", "int", "sso.version.version1_5", 0, None),), (F_OPTIONAL, "sso.admin.ConfigurationManagementService.CertificateChain", "sso.admin.ConfigurationManagementService.CertificateChain"), "System.Read", ["sso.admin.fault.DirectoryServiceConnectionFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.IdentitySourceManagementService.AuthenticationCredentials", "SsoAdminIdentitySourceManagementServiceAuthenticationCredentials", "vmodl.DynamicData", "sso.version.version1_5", [("username", "string", "sso.version.version1_5", 0), ("password", "string", "sso.version.version1_5", F_SECRET), ("useMachineAccount", "boolean", "sso.version.version1_5", F_OPTIONAL), ("spn", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.IdentitySources", "SsoAdminIdentitySources", "vmodl.DynamicData", "sso.version.version1_5", [("all", "sso.admin.IdentitySource[]", "sso.version.version1_5", 0), ("system", "sso.admin.IdentitySource", "sso.version.version1_5", 0), ("localOS", "sso.admin.IdentitySource", "sso.version.version1_5", F_OPTIONAL), ("ldaps", "sso.admin.LdapIdentitySource[]", "sso.version.version1_5", F_OPTIONAL), ("nativeAD", "sso.admin.IdentitySource", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.LdapIdentitySource", "SsoAdminLdapIdentitySource", "sso.admin.IdentitySource", "sso.version.version1_5", [("type", "string", "sso.version.version1_5", 0), ("details", "sso.admin.LdapIdentitySourceDetails", "sso.version.version1_5", 0), ("authenticationDetails", "sso.admin.LdapIdentitySource.AuthenticationDetails", "sso.version.version1_5", 0)])
CreateEnumType("sso.admin.LdapIdentitySource.Type", "SsoAdminLdapIdentitySourceType", "sso.version.version1_5", ["ActiveDirectory", "OpenLdap"])
CreateEnumType("sso.admin.LdapIdentitySource.AuthenticationType", "SsoAdminLdapIdentitySourceAuthenticationType", "sso.version.version1_5", ["anonymous", "password", "reuseSession"])
CreateDataType("sso.admin.LdapIdentitySource.AuthenticationDetails", "SsoAdminLdapIdentitySourceAuthenticationDetails", "vmodl.DynamicData", "sso.version.version1_5", [("authenticationType", "string", "sso.version.version1_5", 0), ("username", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.LdapIdentitySourceDetails", "SsoAdminLdapIdentitySourceDetails", "vmodl.DynamicData", "sso.version.version1_5", [("friendlyName", "string", "sso.version.version1_5", 0), ("userBaseDn", "string", "sso.version.version1_5", F_OPTIONAL), ("groupBaseDn", "string", "sso.version.version1_5", F_OPTIONAL), ("primaryUrl", "vmodl.URI", "sso.version.version1_5", 0), ("failoverUrl", "vmodl.URI", "sso.version.version1_5", F_OPTIONAL), ("searchTimeoutSeconds", "int", "sso.version.version1_5", 0), ("isSiteAffinityEnabled", "boolean", "sso.version.version3_5", F_OPTIONAL), ("certificates", "string[]", "sso.version.version3_5", F_OPTIONAL)])
CreateDataType("sso.admin.LockoutPolicy", "SsoAdminLockoutPolicy", "vmodl.DynamicData", "sso.version.version1", [("description", "string", "sso.version.version1", F_OPTIONAL), ("maxFailedAttempts", "int", "sso.version.version1", 0), ("failedAttempts", "int", "sso.version.version2", 0), ("failedAttemptIntervalSec", "long", "sso.version.version1", 0), ("autoUnlockIntervalSec", "long", "sso.version.version1", 0)])
CreateManagedType("sso.admin.LockoutPolicyService", "SsoAdminLockoutPolicyService", "vmodl.ManagedObject", "sso.version.version1", None, [("getLockoutPolicy", "GetLockoutPolicy", "sso.version.version1", (), (0, "sso.admin.LockoutPolicy", "sso.admin.LockoutPolicy"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateLockoutPolicy", "UpdateLockoutPolicy", "sso.version.version1", (("policy", "sso.admin.LockoutPolicy", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.MailContent", "SsoAdminMailContent", "vmodl.DynamicData", "sso.version.version1", [("from", "string", "sso.version.version1", 0), ("to", "string", "sso.version.version1", 0), ("subject", "string", "sso.version.version1", 0), ("content", "string", "sso.version.version1", 0)])
CreateDataType("sso.admin.PasswordExpirationConfig", "SsoAdminPasswordExpirationConfig", "vmodl.DynamicData", "sso.version.version1", [("emailNotificationEnabled", "boolean", "sso.version.version1", 0), ("emailFrom", "string", "sso.version.version1", F_OPTIONAL), ("emailSubject", "string", "sso.version.version1", F_OPTIONAL), ("notificationDays", "int[]", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.PasswordFormat", "SsoAdminPasswordFormat", "vmodl.DynamicData", "sso.version.version1", [("lengthRestriction", "sso.admin.PasswordFormat.LengthRestriction", "sso.version.version1", 0), ("alphabeticRestriction", "sso.admin.PasswordFormat.AlphabeticRestriction", "sso.version.version1", 0), ("minNumericCount", "int", "sso.version.version1", 0), ("minSpecialCharCount", "int", "sso.version.version1", 0), ("maxIdenticalAdjacentCharacters", "int", "sso.version.version1", 0)])
CreateDataType("sso.admin.PasswordFormat.LengthRestriction", "SsoAdminPasswordFormatLengthRestriction", "vmodl.DynamicData", "sso.version.version1", [("minLength", "int", "sso.version.version1", 0), ("maxLength", "int", "sso.version.version1", 0)])
CreateDataType("sso.admin.PasswordFormat.AlphabeticRestriction", "SsoAdminPasswordFormatAlphabeticRestriction", "vmodl.DynamicData", "sso.version.version1", [("minAlphabeticCount", "int", "sso.version.version1", 0), ("minUppercaseCount", "int", "sso.version.version1", 0), ("minLowercaseCount", "int", "sso.version.version1", 0)])
CreateDataType("sso.admin.PasswordPolicy", "SsoAdminPasswordPolicy", "vmodl.DynamicData", "sso.version.version1", [("description", "string", "sso.version.version1", F_OPTIONAL), ("prohibitedPreviousPasswordsCount", "int", "sso.version.version1", 0), ("passwordFormat", "sso.admin.PasswordFormat", "sso.version.version1", 0), ("passwordLifetimeDays", "int", "sso.version.version1", F_OPTIONAL)])
CreateManagedType("sso.admin.PasswordPolicyService", "SsoAdminPasswordPolicyService", "vmodl.ManagedObject", "sso.version.version1", None, [("updateLocalPasswordPolicy", "UpdateLocalPasswordPolicy", "sso.version.version1", (("policy", "sso.admin.PasswordPolicy", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.InvalidPasswordPolicyFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getLocalPasswordPolicy", "GetLocalPasswordPolicy", "sso.version.version1", (), (0, "sso.admin.PasswordPolicy", "sso.admin.PasswordPolicy"), "Sso.Self.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.PersonDetails", "SsoAdminPersonDetails", "vmodl.DynamicData", "sso.version.version1", [("description", "string", "sso.version.version1", F_OPTIONAL), ("emailAddress", "string", "sso.version.version1", F_OPTIONAL), ("firstName", "string", "sso.version.version1", F_OPTIONAL), ("lastName", "string", "sso.version.version1", F_OPTIONAL), ("userPrincipalName", "string", "sso.version.version2_5", F_OPTIONAL)])
CreateDataType("sso.admin.PersonUser", "SsoAdminPersonUser", "vmodl.DynamicData", "sso.version.version1", [("id", "sso.PrincipalId", "sso.version.version1", 0), ("alias", "sso.PrincipalId", "sso.version.version1", F_OPTIONAL), ("details", "sso.admin.PersonDetails", "sso.version.version1", 0), ("disabled", "boolean", "sso.version.version1", 0), ("locked", "boolean", "sso.version.version1", 0)])
CreateManagedType("sso.admin.PrincipalDiscoveryService", "SsoAdminPrincipalDiscoveryService", "vmodl.ManagedObject", "sso.version.version1", None, [("lookup", "Lookup", "sso.version.version1_5", (("id", "sso.PrincipalId", "sso.version.version1_5", 0, None),("isGroup", "boolean", "sso.version.version1_5", 0, None),), (F_OPTIONAL, "sso.PrincipalId", "sso.PrincipalId"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findPersonUser", "FindPersonUser", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser", "sso.admin.PersonUser"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findSelfPersonUser", "FindSelfPersonUser", "sso.version.version1_5", (), (0, "sso.admin.PersonUser", "sso.admin.PersonUser"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("findSolutionUser", "FindSolutionUser", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.SolutionUser", "sso.admin.SolutionUser"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findSolutionUserByCertDN", "FindSolutionUserByCertDN", "sso.version.version1", (("certDN", "string", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.SolutionUser", "sso.admin.SolutionUser"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findUser", "FindUser", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.User", "sso.admin.User"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroup", "FindGroup", "sso.version.version1", (("groupId", 
"sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.Group", "sso.admin.Group"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findPersonUsers", "FindPersonUsers", "sso.version.version1", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findPersonUsersByName", "FindPersonUsersByName", "sso.version.version3", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version3", 0, None),("limit", "int", "sso.version.version3", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findSolutionUsers", "FindSolutionUsers", "sso.version.version1", (("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.SolutionUser[]", "sso.admin.SolutionUser[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findUsers", "FindUsers", "sso.version.version1", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.User[]", "sso.admin.User[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findUserAccount", "FindUserAccount", "sso.version.version2", (("userName", "string", "sso.version.version2", 0, None),), (0, "sso.admin.PrincipalDiscoveryService.SearchResult", "sso.admin.PrincipalDiscoveryService.SearchResult"), "System.Read", 
["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroupAccount", "FindGroupAccount", "sso.version.version2", (("groupName", "string", "sso.version.version2", 0, None),), (F_OPTIONAL, "sso.admin.Group", "sso.admin.Group"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroups", "FindGroups", "sso.version.version1", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroupsByName", "FindGroupsByName", "sso.version.version3", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version3", 0, None),("limit", "int", "sso.version.version3", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("find", "Find", "sso.version.version1", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (0, "sso.admin.PrincipalDiscoveryService.SearchResult", "sso.admin.PrincipalDiscoveryService.SearchResult"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findByName", "FindByName", "sso.version.version3", (("criteria", "sso.admin.PrincipalDiscoveryService.SearchCriteria", "sso.version.version3", 0, None),("limit", "int", "sso.version.version3", 0, None),), (0, "sso.admin.PrincipalDiscoveryService.SearchResult", "sso.admin.PrincipalDiscoveryService.SearchResult"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findUsersInGroup", 
"FindUsersInGroup", "sso.version.version1", (("groupId", "sso.PrincipalId", "sso.version.version1", 0, None),("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.User[]", "sso.admin.User[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("findPersonUsersByNameInGroup", "FindPersonUsersByNameInGroup", "sso.version.version3", (("groupId", "sso.PrincipalId", "sso.version.version3", 0, None),("searchString", "string", "sso.version.version3", 0, None),("limit", "int", "sso.version.version3", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findPersonUsersInGroup", "FindPersonUsersInGroup", "sso.version.version1", (("groupId", "sso.PrincipalId", "sso.version.version1", 0, None),("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findSolutionUsersInGroup", "FindSolutionUsersInGroup", "sso.version.version1", (("groupName", "string", "sso.version.version1", 0, None),("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.SolutionUser[]", "sso.admin.SolutionUser[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroupsInGroup", "FindGroupsInGroup", "sso.version.version1", (("groupId", "sso.PrincipalId", "sso.version.version1", 0, None),("searchString", "string", "sso.version.version1", 0, None),("limit", 
"int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findGroupsByNameInGroup", "FindGroupsByNameInGroup", "sso.version.version3", (("groupId", "sso.PrincipalId", "sso.version.version3", 0, None),("searchString", "string", "sso.version.version3", 0, None),("limit", "int", "sso.version.version3", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findDirectParentGroups", "FindDirectParentGroups", "sso.version.version1", (("principalId", "sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findNestedParentGroups", "FindNestedParentGroups", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.Group[]", "sso.admin.Group[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findLockedUsers", "FindLockedUsers", "sso.version.version1", (("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findDisabledPersonUsers", "FindDisabledPersonUsers", "sso.version.version1", (("searchString", "string", "sso.version.version1", 0, None),("limit", "int", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser[]", "sso.admin.PersonUser[]"), 
"System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findDisabledSolutionUsers", "FindDisabledSolutionUsers", "sso.version.version1", (("searchString", "string", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.admin.SolutionUser[]", "sso.admin.SolutionUser[]"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findRegisteredExternalIDPUser", "FindRegisteredExternalIDPUser", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),), (F_OPTIONAL, "sso.admin.PersonUser", "sso.admin.PersonUser"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getImplicitGroupNames", "GetImplicitGroupNames", "sso.version.version2", (), (0, "string[]", "string[]"), "System.Anonymous", ["sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.PrincipalDiscoveryService.SearchResult", "SsoAdminPrincipalDiscoveryServiceSearchResult", "vmodl.DynamicData", "sso.version.version1", [("personUsers", "sso.admin.PersonUser[]", "sso.version.version1", F_OPTIONAL), ("solutionUsers", "sso.admin.SolutionUser[]", "sso.version.version1", F_OPTIONAL), ("groups", "sso.admin.Group[]", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.PrincipalDiscoveryService.SearchCriteria", "SsoAdminPrincipalDiscoveryServiceSearchCriteria", "vmodl.DynamicData", "sso.version.version1", [("searchString", "string", "sso.version.version1", 0), ("domain", "string", "sso.version.version1", 0)])
CreateManagedType("sso.admin.PrincipalManagementService", "SsoAdminPrincipalManagementService", "vmodl.ManagedObject", "sso.version.version1", None, [("createLocalPersonUser", "CreateLocalPersonUser", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),("userDetails", "sso.admin.PersonDetails", "sso.version.version1", 0, None),("password", "string", "sso.version.version1", F_SECRET, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.PasswordPolicyViolationFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("createLocalSolutionUser", "CreateLocalSolutionUser", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),("userDetails", "sso.admin.SolutionDetails", "sso.version.version1", 0, None),("external", "boolean", "sso.version.version2", F_OPTIONAL, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.DuplicateSolutionCertificateFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("createLocalGroup", "CreateLocalGroup", "sso.version.version1", (("groupName", "string", "sso.version.version1", 0, None),("groupDetails", "sso.admin.GroupDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("deleteLocalPrincipal", "DeleteLocalPrincipal", "sso.version.version1", (("principalName", "string", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("removeFromLocalGroup", "RemoveFromLocalGroup", 
"sso.version.version1", (("principalId", "sso.PrincipalId", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("removePrincipalsFromLocalGroup", "RemovePrincipalsFromLocalGroup", "sso.version.version1", (("principalsIds", "sso.PrincipalId[]", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean[]", "boolean[]"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("addUserToLocalGroup", "AddUserToLocalGroup", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("addUsersToLocalGroup", "AddUsersToLocalGroup", "sso.version.version1", (("userIds", "sso.PrincipalId[]", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean[]", "boolean[]"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("addGroupToLocalGroup", "AddGroupToLocalGroup", "sso.version.version1", (("groupId", "sso.PrincipalId", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.GroupCyclicDependencyFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("addGroupsToLocalGroup", 
"AddGroupsToLocalGroup", "sso.version.version1", (("groupIds", "sso.PrincipalId[]", "sso.version.version1", 0, None),("groupName", "string", "sso.version.version1", 0, None),), (0, "boolean[]", "boolean[]"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.GroupCyclicDependencyFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateLocalPersonUserDetails", "UpdateLocalPersonUserDetails", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),("userDetails", "sso.admin.PersonDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("resetLocalPersonUserPassword", "ResetLocalPersonUserPassword", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),("newPassword", "string", "sso.version.version1", F_SECRET, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.PasswordPolicyViolationFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateLocalSolutionUserDetails", "UpdateLocalSolutionUserDetails", "sso.version.version1", (("userName", "string", "sso.version.version1", 0, None),("userDetails", "sso.admin.SolutionDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateLocalGroupDetails", "UpdateLocalGroupDetails", "sso.version.version1", (("groupName", "string", "sso.version.version1", 0, None),("groupDetails", "sso.admin.GroupDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), 
"Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateSelfLocalPersonUserDetails", "UpdateSelfLocalPersonUserDetails", "sso.version.version1", (("userDetails", "sso.admin.PersonDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("deleteSelfSolutionUser", "DeleteSelfSolutionUser", "sso.version.version1", (), (0, "void", "void"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("resetSelfLocalPersonUserPassword", "ResetSelfLocalPersonUserPassword", "sso.version.version1", (("newPassword", "string", "sso.version.version1", F_SECRET, None),), (0, "void", "void"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.admin.fault.PasswordPolicyViolationFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("resetLocalUserPassword", "ResetLocalUserPassword", "sso.version.version1", (("username", "string", "sso.version.version1", 0, None),("currentPassword", "string", "sso.version.version1", F_SECRET, None),("newPassword", "string", "sso.version.version1", F_SECRET, None),), (0, "void", "void"), None, ["vmodl.fault.InvalidRequest", "sso.fault.InvalidPrincipalFault", "sso.admin.fault.PasswordPolicyViolationFault", ]), ("updateSelfLocalSolutionUserDetails", "UpdateSelfLocalSolutionUserDetails", "sso.version.version1", (("userDetails", "sso.admin.SolutionDetails", "sso.version.version1", 0, None),), (0, "sso.PrincipalId", "sso.PrincipalId"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("unlockUserAccount", "UnlockUserAccount", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, 
"boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("enableUserAccount", "EnableUserAccount", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("disableUserAccount", "DisableUserAccount", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getDaysRemainingUntilPasswordExpiration", "GetDaysRemainingUntilPasswordExpiration", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),), (0, "int", "int"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("getDaysRemainingUntilSelfPasswordExpiration", "GetDaysRemainingUntilSelfPasswordExpiration", "sso.version.version2", (), (0, "int", "int"), "Sso.Self.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.InvalidCredentials", ]), ("registerExternalUser", "RegisterExternalUser", "sso.version.version2", (("externalUserId", "sso.PrincipalId", "sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("removeExternalUser", "RemoveExternalUser", "sso.version.version2", (("externalUserId", "sso.PrincipalId", "sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", 
["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateManagedType("sso.admin.ReplicationService", "SsoAdminReplicationService", "vmodl.ManagedObject", "sso.version.version1", None, [("exportFullState", "ExportFullState", "sso.version.version1", (), (0, "vmodl.Binary", "vmodl.Binary"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "vmodl.fault.NotSupported", "sso.fault.InternalFault", ]), ("importFullState", "ImportFullState", "sso.version.version1", (("fullState", "vmodl.Binary", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", "vmodl.fault.NotSupported", "vmodl.fault.InvalidArgument", "sso.fault.InternalFault", ])])
CreateManagedType("sso.admin.RoleManagementService", "SsoAdminRoleManagementService", "vmodl.ManagedObject", "sso.version.version1", None, [("setRole", "SetRole", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),("role", "string", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("hasAdministratorRole", "HasAdministratorRole", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("hasConfigurationUserRole", "HasConfigurationUserRole", "sso.version.version3_1", (("userId", "sso.PrincipalId", "sso.version.version3_1", 0, None),), (0, "boolean", "boolean"), "SystemConfiguration.Administrators", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("hasRegularUserRole", "HasRegularUserRole", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("grantWSTrustRole", "GrantWSTrustRole", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),("role", "string", "sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("revokeWSTrustRole", "RevokeWSTrustRole", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),("role", "string", 
"sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("grantIDPProvisioningRole", "GrantIDPProvisioningRole", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),("role", "string", "sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("revokeIDPProvisioningRole", "RevokeIDPProvisioningRole", "sso.version.version2", (("userId", "sso.PrincipalId", "sso.version.version2", 0, None),("role", "string", "sso.version.version2", 0, None),), (0, "boolean", "boolean"), "Sso.AdminServer.Administer", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateEnumType("sso.admin.RoleManagementService.Role", "SsoAdminRoleManagementServiceRole", "sso.version.version1", ["GuestUser", "RegularUser", "ConfigurationUser", "IdentitySourceAdministrator", "Administrator"])
CreateEnumType("sso.admin.RoleManagementService.WSTrustRole", "SsoAdminRoleManagementServiceWSTrustRole", "sso.version.version2", ["ActAsUser"])
CreateEnumType("sso.admin.RoleManagementService.IDPProvisioningRole", "SsoAdminRoleManagementServiceIDPProvisioningRole", "sso.version.version2", ["IDPAdministrator"])
CreateDataType("sso.admin.ServiceContent", "SsoAdminServiceContent", "vmodl.DynamicData", "sso.version.version1", [("aboutInfo", "sso.AboutInfo", "sso.version.version1", 0), ("sessionManager", "sso.SessionManager", "sso.version.version1", 0), ("configurationManagementService", "sso.admin.ConfigurationManagementService", "sso.version.version1", 0), ("smtpManagementService", "sso.admin.SmtpManagementService", "sso.version.version1", 0), ("principalDiscoveryService", "sso.admin.PrincipalDiscoveryService", "sso.version.version1", 0), ("principalManagementService", "sso.admin.PrincipalManagementService", "sso.version.version1", 0), ("roleManagementService", "sso.admin.RoleManagementService", "sso.version.version1", 0), ("passwordPolicyService", "sso.admin.PasswordPolicyService", "sso.version.version1", 0), ("lockoutPolicyService", "sso.admin.LockoutPolicyService", "sso.version.version1", 0), ("domainManagementService", "sso.admin.DomainManagementService", "sso.version.version1", 0), ("identitySourceManagementService", "sso.admin.IdentitySourceManagementService", "sso.version.version1_5", F_OPTIONAL), ("systemManagementService", "sso.admin.SystemManagementService", "sso.version.version1_5", F_OPTIONAL), ("computerManagementService", "sso.admin.ComputerManagementService", "sso.version.version3_1", F_OPTIONAL), ("ssoHealthManagementService", "sso.admin.SsoHealthManagementService", "sso.version.version3_5", F_OPTIONAL), ("deploymentInformationService", "sso.admin.DeploymentInformationService", "sso.version.version1", 0), ("replicationService", "sso.admin.ReplicationService", "sso.version.version1", 0)])
CreateDataType("sso.admin.ServiceEndpoint", "SsoAdminServiceEndpoint", "vmodl.DynamicData", "sso.version.version3", [("name", "string", "sso.version.version3", 0), ("endpoint", "string", "sso.version.version3", 0), ("binding", "string", "sso.version.version3", 0)])
CreateManagedType("sso.admin.ServiceInstance", "SsoAdminServiceInstance", "vmodl.ManagedObject", "sso.version.version1", None, [("retrieveServiceContent", "SsoAdminServiceInstance", "sso.version.version1", (), (0, "sso.admin.ServiceContent", "sso.admin.ServiceContent"), "System.Anonymous", ["sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.SmtpConfig", "SsoAdminSmtpConfig", "vmodl.DynamicData", "sso.version.version1", [("host", "string", "sso.version.version1", F_OPTIONAL), ("port", "int", "sso.version.version1", F_OPTIONAL), ("authenticate", "boolean", "sso.version.version1", F_OPTIONAL), ("user", "string", "sso.version.version1", F_OPTIONAL), ("password", "string", "sso.version.version1", F_OPTIONAL)])
CreateManagedType("sso.admin.SmtpManagementService", "SsoAdminSmtpManagementService", "vmodl.ManagedObject", "sso.version.version1", None, [("getSmtpConfiguration", "GetSmtpConfiguration", "sso.version.version1", (), (0, "sso.admin.SmtpConfig", "sso.admin.SmtpConfig"), "Sso.AdminServer.Administer", ["sso.admin.fault.SmtpConfigNotSetFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("updateSmtpConfiguration", "UpdateSmtpConfiguration", "sso.version.version1", (("config", "sso.admin.SmtpConfig", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("sendMail", "SendMail", "sso.version.version1", (("content", "sso.admin.MailContent", "sso.version.version1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.SmtpConfigNotSetFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.SolutionDetails", "SsoAdminSolutionDetails", "vmodl.DynamicData", "sso.version.version1", [("description", "string", "sso.version.version1", F_OPTIONAL), ("certificate", "string", "sso.version.version1", 0)])
CreateDataType("sso.admin.SolutionUser", "SsoAdminSolutionUser", "vmodl.DynamicData", "sso.version.version1", [("id", "sso.PrincipalId", "sso.version.version1", 0), ("alias", "sso.PrincipalId", "sso.version.version1", F_OPTIONAL), ("details", "sso.admin.SolutionDetails", "sso.version.version1", 0), ("disabled", "boolean", "sso.version.version1", 0), ("external", "boolean", "sso.version.version2_5", 0)])
CreateManagedType("sso.admin.SsoHealthManagementService", "SsoAdminSsoHealthManagementService", "vmodl.ManagedObject", "sso.version.version3_5", None, [("getSsoStatistics", "GetSsoStatistics", "sso.version.version3_5", (), (0, "sso.admin.SsoHealthStats", "sso.admin.SsoHealthStats"), "SystemConfiguration.Administrators", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.SsoHealthStats", "SsoAdminSsoHealthStats", "vmodl.DynamicData", "sso.version.version3_5", [("tenant", "string", "sso.version.version3_5", 0), ("totalTokensGenerated", "int", "sso.version.version3_5", 0), ("totalTokensRenewed", "int", "sso.version.version3_5", 0), ("generatedTokensForTenant", "int", "sso.version.version3_5", 0), ("renewedTokensForTenant", "int", "sso.version.version3_5", 0), ("uptimeIDM", "long", "sso.version.version3_5", 0), ("uptimeSTS", "long", "sso.version.version3_5", 0)])
CreateManagedType("sso.admin.SystemManagementService", "SsoAdminSystemManagementService", "vmodl.ManagedObject", "sso.version.version1_5", None, [("getActiveDirectoryJoinStatus", "IdS_getActiveDirectoryJoinStatus", "sso.version.version1_5", (), (0, "sso.admin.ActiveDirectoryJoinInfo", "sso.admin.ActiveDirectoryJoinInfo"), "System.Read", ["sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("joinActiveDirectory", "JoinActiveDirectory", "sso.version.version3_1", (("username", "string", "sso.version.version3_1", 0, None),("password", "string", "sso.version.version3_1", 0, None),("domain", "string", "sso.version.version3_1", 0, None),("orgUnit", "string", "sso.version.version3_1", F_OPTIONAL, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.ADDomainAccessDeniedFault", "sso.admin.fault.ADDomainUnknownDomainFault", "sso.admin.fault.ADDomainAlreadyJoinedFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", ]), ("leaveActiveDirectory", "LeaveActiveDirectory", "sso.version.version3_1", (("username", "string", "sso.version.version3_1", 0, None),("password", "string", "sso.version.version3_1", 0, None),), (0, "void", "void"), "Sso.AdminServer.Administer", ["sso.admin.fault.ADIDSAlreadyExistFault", "sso.admin.fault.ADDomainAccessDeniedFault", "sso.admin.fault.ADDomainUnknownDomainFault", "sso.admin.fault.ADDomainNotJoinedFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", ])])
CreateDataType("sso.admin.TrustedSTSConfig", "SsoAdminTrustedSTSConfig", "vmodl.DynamicData", "sso.version.version2", [("issuer", "string", "sso.version.version2", 0), ("signingCertChain", "sso.admin.ConfigurationManagementService.CertificateChain", "sso.version.version2", 0), ("subjectFormatMappings", "sso.admin.ConfigurationManagementService.AttributeConfig[]", "sso.version.version2", F_OPTIONAL), ("tokenClaimGroupMappings", "sso.admin.ConfigurationManagementService.TokenClaimGroupMapping[]", "sso.version.version3_5", F_OPTIONAL)])
CreateDataType("sso.admin.User", "SsoAdminUser", "vmodl.DynamicData", "sso.version.version1", [("id", "sso.PrincipalId", "sso.version.version1", 0), ("alias", "sso.PrincipalId", "sso.version.version1", F_OPTIONAL), ("kind", "string", "sso.version.version1", 0), ("description", "string", "sso.version.version1", F_OPTIONAL)])
CreateEnumType("sso.admin.User.Kind", "SsoAdminUserKind", "sso.version.version1", ["person", "solution"])
CreateDataType("sso.admin.VmHost", "SsoAdminVmHost", "vmodl.DynamicData", "sso.version.version3_1", [("hostName", "string", "sso.version.version3_1", 0), ("domainController", "boolean", "sso.version.version3_1", 0)])
CreateDataType("sso.fault.InvalidCredentials", "SsoFaultInvalidCredentials", "vmodl.fault.SecurityError", "sso.version.version1", None)
CreateDataType("sso.fault.NoPermission", "SsoFaultNoPermission", "vmodl.fault.SecurityError", "sso.version.version1", None)
CreateDataType("sso.fault.NotAuthenticated", "SsoFaultNotAuthenticated", "vmodl.fault.SecurityError", "sso.version.version1", None)
CreateDataType("sso.fault.RuntimeServiceFault", "SsoFaultRuntimeServiceFault", "vmodl.RuntimeFault", "sso.version.version1", None)
CreateDataType("sso.fault.ServiceFault", "SsoFaultServiceFault", "vmodl.MethodFault", "sso.version.version1", None)
CreateManagedType("sso.groupcheck.GroupCheckService", "SsoGroupcheckGroupCheckService", "vmodl.ManagedObject", "sso.version.version1", None, [("isMemberOfGroup", "IsMemberOfGroup", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),("groupId", "sso.PrincipalId", "sso.version.version1", 0, None),), (0, "boolean", "boolean"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findParentGroups", "FindParentGroups", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),("groupList", "sso.PrincipalId[]", "sso.version.version1", F_OPTIONAL, None),), (F_OPTIONAL, "sso.PrincipalId[]", "sso.PrincipalId[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ]), ("findAllParentGroups", "FindAllParentGroups", "sso.version.version1", (("userId", "sso.PrincipalId", "sso.version.version1", 0, None),), (F_OPTIONAL, "sso.PrincipalId[]", "sso.PrincipalId[]"), "System.Read", ["sso.fault.InvalidPrincipalFault", "sso.fault.NotAuthenticated", "sso.fault.NoPermission", "sso.fault.InvalidCredentials", ])])
CreateDataType("sso.groupcheck.ServiceContent", "SsoGroupcheckServiceContent", "vmodl.DynamicData", "sso.version.version1", [("aboutInfo", "sso.AboutInfo", "sso.version.version1", 0), ("sessionManager", "sso.SessionManager", "sso.version.version1", 0), ("groupCheckService", "sso.groupcheck.GroupCheckService", "sso.version.version1", 0)])
CreateManagedType("sso.groupcheck.ServiceInstance", "SsoGroupcheckServiceInstance", "vmodl.ManagedObject", "sso.version.version1", None, [("retrieveServiceContent", "SsoGroupcheckServiceInstance", "sso.version.version1", (), (0, "sso.groupcheck.ServiceContent", "sso.groupcheck.ServiceContent"), "System.Anonymous", ["sso.fault.InvalidCredentials", ])])
CreateDataType("sso.admin.fault.ADDomainAccessDeniedFault", "SsoAdminFaultADDomainAccessDeniedFault", "sso.fault.ServiceFault", "sso.version.version3_1", [("domain", "string", "sso.version.version3_1", F_OPTIONAL), ("username", "string", "sso.version.version3_1", 0)])
CreateDataType("sso.admin.fault.ADDomainAlreadyJoinedFault", "SsoAdminFaultADDomainAlreadyJoinedFault", "sso.fault.ServiceFault", "sso.version.version3_1", None)
CreateDataType("sso.admin.fault.ADDomainNotJoinedFault", "SsoAdminFaultADDomainNotJoinedFault", "sso.fault.ServiceFault", "sso.version.version3_1", None)
CreateDataType("sso.admin.fault.ADDomainUnknownDomainFault", "SsoAdminFaultADDomainUnknownDomainFault", "sso.fault.ServiceFault", "sso.version.version3_1", [("domain", "string", "sso.version.version3_1", F_OPTIONAL)])
CreateDataType("sso.admin.fault.ADIDSAlreadyExistFault", "SsoAdminFaultADIDSAlreadyExistFault", "sso.fault.ServiceFault", "sso.version.version1_5", [("domainName", "string", "sso.version.version1_5", 0)])
CreateDataType("sso.admin.fault.CertChainInvalidTrustedPathFault", "SsoAdminFaultCertChainInvalidTrustedPathFault", "sso.fault.ServiceFault", "sso.version.version3", [("issuerName", "string", "sso.version.version3", 0)])
CreateDataType("sso.admin.fault.CertificateDeletionFault", "SsoAdminFaultCertificateDeletionFault", "sso.fault.ServiceFault", "sso.version.version1_5", [("certificate", "string", "sso.version.version1_5", 0)])
CreateDataType("sso.admin.fault.DirectoryServiceConnectionFault", "SsoAdminFaultDirectoryServiceConnectionFault", "sso.fault.ServiceFault", "sso.version.version1", [("uri", "vmodl.URI", "sso.version.version1", 0)])
CreateDataType("sso.admin.fault.DomainManagerFault", "SsoAdminFaultDomainManagerFault", "sso.fault.RuntimeServiceFault", "sso.version.version1_5", [("domainName", "string", "sso.version.version1_5", 0), ("errorCode", "int", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.fault.DomainNotFoundFault", "SsoAdminFaultDomainNotFoundFault", "sso.fault.ServiceFault", "sso.version.version1", [("domainName", "string", "sso.version.version1", 0)])
CreateDataType("sso.admin.fault.DuplicateDataFault", "SsoAdminFaultDuplicateDataFault", "sso.fault.ServiceFault", "sso.version.version1", None)
CreateDataType("sso.admin.fault.DuplicateDomainNameFault", "SsoAdminFaultDuplicateDomainNameFault", "sso.fault.ServiceFault", "sso.version.version1", [("domainName", "string", "sso.version.version1", 0), ("domainAlias", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.fault.DuplicateSolutionCertificateFault", "SsoAdminFaultDuplicateSolutionCertificateFault", "sso.fault.ServiceFault", "sso.version.version1", None)
CreateDataType("sso.admin.fault.ExternalSTSCertChainInvalidTrustedPathFault", "SsoAdminFaultExternalSTSCertChainInvalidTrustedPathFault", "sso.fault.ServiceFault", "sso.version.version2", [("issuerName", "string", "sso.version.version2", 0)])
CreateDataType("sso.admin.fault.ExternalSTSExtraneousCertsInCertChainFault", "SsoAdminFaultExternalSTSExtraneousCertsInCertChainFault", "sso.fault.ServiceFault", "sso.version.version2", [("issuerName", "string", "sso.version.version2", 0)])
CreateDataType("sso.admin.fault.ExtraneousCertsInCertChainFault", "SsoAdminFaultExtraneousCertsInCertChainFault", "sso.fault.ServiceFault", "sso.version.version3", [("issuerName", "string", "sso.version.version3", 0)])
CreateDataType("sso.admin.fault.GroupCyclicDependencyFault", "SsoAdminFaultGroupCyclicDependencyFault", "sso.fault.ServiceFault", "sso.version.version1", [("groupBeingAdded", "string", "sso.version.version1", 0), ("existingGroup", "string", "sso.version.version1", 0)])
CreateDataType("sso.admin.fault.HostNotJoinedRequiredDomainFault", "SsoAdminFaultHostNotJoinedRequiredDomainFault", "sso.fault.RuntimeServiceFault", "sso.version.version1_5", [("requiredDomainName", "string", "sso.version.version1_5", 0), ("joinedDomainName", "string", "sso.version.version1_5", F_OPTIONAL)])
CreateDataType("sso.admin.fault.InvalidPasswordPolicyFault", "SsoAdminFaultInvalidPasswordPolicyFault", "sso.fault.ServiceFault", "sso.version.version1", None)
CreateDataType("sso.admin.fault.InvalidProviderFault", "SsoAdminFaultInvalidProviderFault", "sso.fault.RuntimeServiceFault", "sso.version.version1", [("fieldName", "string", "sso.version.version1", 0), ("fieldValue", "string", "sso.version.version1", F_OPTIONAL)])
CreateDataType("sso.admin.fault.LocalOSDomainRegistrationFault", "SsoAdminFaultLocalOSDomainRegistrationFault", "sso.fault.RuntimeServiceFault", "sso.version.version1", None)
CreateDataType("sso.admin.fault.NativeADRegistrationFault", "SsoAdminFaultNativeADRegistrationFault", "sso.fault.RuntimeServiceFault", "sso.version.version1_5", None)
CreateDataType("sso.admin.fault.NoSuchConfigFault", "SsoAdminFaultNoSuchConfigFault", "sso.fault.ServiceFault", "sso.version.version3", [("issuerName", "string", "sso.version.version3", 0)])
CreateDataType("sso.admin.fault.NoSuchExternalSTSConfigFault", "SsoAdminFaultNoSuchExternalSTSConfigFault", "sso.fault.ServiceFault", "sso.version.version2", [("issuerName", "string", "sso.version.version2", 0)])
CreateDataType("sso.admin.fault.NoSuchRelyingPartyFault", "SsoAdminFaultNoSuchRelyingPartyFault", "sso.fault.ServiceFault", "sso.version.version2", [("relyingPartyName", "string", "sso.version.version2", 0)])
CreateDataType("sso.admin.fault.PasswordPolicyViolationFault", "SsoAdminFaultPasswordPolicyViolationFault", "sso.fault.ServiceFault", "sso.version.version1", None)
CreateDataType("sso.admin.fault.SmtpConfigNotSetFault", "SsoAdminFaultSmtpConfigNotSetFault", "sso.fault.ServiceFault", "sso.version.version1", None)
CreateDataType("sso.fault.InternalFault", "SsoFaultInternalFault", "sso.fault.RuntimeServiceFault", "sso.version.version1", None)
CreateDataType("sso.fault.InvalidPrincipalFault", "SsoFaultInvalidPrincipalFault", "sso.fault.ServiceFault", "sso.version.version1", [("principal", "string", "sso.version.version1", 0)])
CreateDataType("sso.fault.NoDomainSearchPermission", "SsoFaultNoDomainSearchPermission", "sso.fault.RuntimeServiceFault", "sso.version.version1_5", [("domainName", "string", "sso.version.version2", F_OPTIONAL)])
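The calls above follow the pyVmomi `VmomiSupport` registration pattern: each `CreateDataType(vmodlName, wsdlName, parentType, version, propList)` call records a fault type and its typed properties, where each property tuple carries a flag field marking it optional (`F_OPTIONAL`) or required (`0`), and `propList` is `None` when the type adds no fields beyond its parent. A minimal self-contained sketch of such a registry (the `TypeRegistry` and `Property` names, and the `F_OPTIONAL` bit value, are illustrative for this sketch, not pyVmomi's actual internals):

```python
# Illustrative sketch of a VMODL-style type registry; TypeRegistry,
# Property, and the F_OPTIONAL value are hypothetical, not pyVmomi's.
from dataclasses import dataclass

F_OPTIONAL = 0x1  # flag bit marking a property as optional (sketch value)

@dataclass
class Property:
    name: str
    type: str
    version: str
    flags: int

    @property
    def optional(self) -> bool:
        return bool(self.flags & F_OPTIONAL)

class TypeRegistry:
    def __init__(self):
        self.types = {}

    def CreateDataType(self, vmodl_name, wsdl_name, parent, version, props):
        # props is None for types that add no fields beyond the parent's
        self.types[vmodl_name] = {
            "wsdl": wsdl_name,
            "parent": parent,
            "version": version,
            "props": [Property(*p) for p in (props or [])],
        }

reg = TypeRegistry()
reg.CreateDataType(
    "sso.admin.fault.DomainNotFoundFault",
    "SsoAdminFaultDomainNotFoundFault",
    "sso.fault.ServiceFault",
    "sso.version.version1",
    [("domainName", "string", "sso.version.version1", 0)],
)
t = reg.types["sso.admin.fault.DomainNotFoundFault"]
print(t["props"][0].name, t["props"][0].optional)  # domainName False
```

A fault definition with flag `0` (like `domainName` here) yields a required property, while `F_OPTIONAL` entries (like `domain` in `ADDomainAccessDeniedFault` above) may be absent on the wire.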
# File: papers/ReTraCk/retriever/schema_retriever/constants_schema_retriever.py
# Repo: microsoft/KC (commit 928c74073246ef932f6b80f6fe353117a6cacb55), license: MIT
defaultConfig = '{"model": {"GrailQA": {"Relation": {"test_entities": null, "test_mentions": null, "interactive": false, "top_k": 200, "biencoder_model": "./grailqa/relation/pytorch_model.bin", "biencoder_config": "./grailqa/relation/training_params.txt", "entity_catalogue": "../data/relation.jsonl", "entity_encoding": "../data/grailqa/relation/relation_emb.t7", "fast": true, "faiss_index": null, "index_path": null, "bert_model": "bert-base-uncased", "output_path": "logs/", "no_cuda": false}, "Class": {"test_entities": null, "test_mentions": null, "interactive": false, "top_k": 200, "biencoder_model": "../data/grailqa/class/pytorch_model.bin", "biencoder_config": "../data/grailqa/class/training_params.txt", "entity_catalogue": "../data/class.jsonl", "entity_encoding": "../data/grailqa/class/class_emb.t7", "fast": true, "faiss_index": null, "index_path": null, "bert_model": "bert-base-uncased", "output_path": "logs/", "no_cuda": false}}, "WebQSP": {"Relation": {"test_entities": null, "test_mentions": null, "interactive": false, "top_k": 200, "biencoder_model": "../data/webqsp/relation/pytorch_model.bin", "biencoder_config": "../data/webqsp/relation/training_params.txt", "entity_catalogue": "../data/relation.jsonl", "entity_encoding": "../data/webqsp/relation/relation_emb.t7", "fast": true, "faiss_index": null, "index_path": null, "bert_model": "bert-base-uncased", "output_path": "logs/", "no_cuda": false}, "Class": {"test_entities": null, "test_mentions": null, "interactive": false, "top_k": 50, "biencoder_model": "../data/webqsp/class/pytorch_model.bin", "biencoder_config": "../data/webqsp/class/training_params.txt", "entity_catalogue": "../data/class.jsonl", "entity_encoding": "../data/webqsp/class/class_emb.t7", "fast": true, "faiss_index": null, "index_path": null, "bert_model": "bert-base-uncased", "output_path": "logs/", "no_cuda": false}}}, "train": {"data_path": "../data/webqsp/totrain", "top_k": 10, "max_seq_length": 256, "max_context_length": 50, "max_cand_length": 50, "bert_model": "bert-base-uncased", "train_batch_size": 64, "eval_batch_size": 512, "num_train_epochs": 60, "print_interval": 1000, "eval_interval": 1000, "save_interval": 1, "warmup_proportion": 0.1, "gradient_accumulation_steps": 1, "type_optimization": "all_encoder_layers"}}'
defaultConfigFileName = 'config.json'
defaultConfigIndents = 4
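These three constants are a JSON string plus the file name and indent width for writing it out. A hedged sketch of one way they could be consumed; `write_default_config` is illustrative and not part of the original module, and `defaultConfig` below is an abridged stand-in for the full string:

```python
# Sketch: materialize the default config to disk. write_default_config
# is hypothetical; defaultConfig is an abridged stand-in for the real one.
import json
import os
import tempfile

defaultConfig = '{"train": {"top_k": 10, "max_seq_length": 256}}'  # abridged
defaultConfigFileName = 'config.json'
defaultConfigIndents = 4

def write_default_config(directory):
    """Parse the JSON default config and pretty-print it to config.json."""
    cfg = json.loads(defaultConfig)  # validates the string is well-formed JSON
    path = os.path.join(directory, defaultConfigFileName)
    with open(path, "w") as f:
        json.dump(cfg, f, indent=defaultConfigIndents)
    return cfg, path

with tempfile.TemporaryDirectory() as d:
    cfg, path = write_default_config(d)
    with open(path) as f:
        reread = json.load(f)  # round-trips to the same dict
print(cfg["train"]["top_k"])  # 10
```

Keeping the config as a single-quoted JSON string (rather than a Python dict literal) lets the module round-trip it byte-for-byte through `json.loads`/`json.dump` without worrying about key ordering or `None`/`null` conversion at definition time.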